EduPolicy.ai — Students Edition

Module 4

Privacy & Surveillance



Privacy & Surveillance

Students Edition

You are being watched right now. Your phone tracks your location 24/7. Your browser records every search. Your email is scanned for advertising data. AI made all of this not just possible, but profitable. This module examines what we've lost, what we're losing, and whether we can get any of it back.

Module 4 of 8 — AI Ethics for Higher Education
Powered by EduPolicy.ai
Part 1

Surveillance Before and After AI

Surveillance isn't new. Governments have monitored citizens for centuries. What's new is the scale, persistence, and intelligence that AI brings to it.

TRADITIONAL SURVEILLANCE

Human-Limited

Requires a person watching a camera feed, reading intercepted mail, or following a subject. Limited by human attention, fatigue, and budget. A city with 1,000 cameras needs hundreds of officers to watch them. Most footage is never reviewed.

AI-POWERED SURVEILLANCE

Infinitely Scalable

One AI system monitors 1,000 cameras simultaneously. It never gets tired. It identifies faces, tracks movements, detects "anomalous behavior," and cross-references against databases — in real time. It can retroactively search weeks of stored footage in seconds.

What Changed
Scale: millions of faces processed per second
Persistence: 24/7 — never sleeps, never takes a break, never forgets
Retroactivity: can search past data for patterns that weren't suspicious at the time
The Core Shift

Traditional surveillance asks: "Is this specific person doing something suspicious right now?" AI surveillance asks: "What has everyone been doing, everywhere, all the time?" That's not a difference in degree. It's a difference in kind.
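The retroactive-search capability described above can be sketched as a query over an indexed detection log. This is a hedged illustration only — the log entries, camera names, and face IDs below are all made up — but it shows why stored surveillance data is qualitatively different: once detections are indexed, weeks of footage become instantly searchable.

```python
from datetime import datetime, timedelta

# Hypothetical detection log: (timestamp, camera_id, face_id).
# A real deployment would index millions of entries per day.
detections = [
    (datetime(2024, 3, 1, 8, 15), "cam_lot_A", "face_0042"),
    (datetime(2024, 3, 1, 9, 2), "cam_entrance", "face_0042"),
    (datetime(2024, 3, 8, 17, 30), "cam_lot_A", "face_0911"),
]

def retroactive_search(log, face_id, since):
    """Return every past sighting of one face after a cutoff date."""
    return [(ts, cam) for ts, cam, fid in log if fid == face_id and ts >= since]

# A face that was not "suspicious" at the time can still be traced
# through the past month of stored detections in one pass:
sightings = retroactive_search(detections, "face_0042",
                               datetime(2024, 3, 1) - timedelta(weeks=4))
print(sightings)  # both camera hits for face_0042
```

No human could re-watch a month of footage this way; the shift in kind comes from the data being stored and indexed at all.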

Part 2

Your Data Trail: What You Generate Without Realizing It

Most people think of surveillance as cameras and wiretaps. The reality is that you generate surveillance data voluntarily, continuously, through devices you carry willingly. Tap each activity to see what data it creates.

📱
Checking the weather on your phone
Your weather app collects: precise GPS location, time of check, device ID, IP address, and often sells this to data brokers. One study found weather apps are among the most aggressive location data collectors. Your "innocent" weather check tells advertisers where you live, where you work, and what route you take between them.
🛒
Buying groceries with a loyalty card
Purchase history reveals: dietary restrictions (religious affiliation), alcohol consumption (health risk), pregnancy (Target famously predicted a teenager's pregnancy from her purchases before her father knew), household size, income level, and health conditions. Insurance companies have purchased grocery data to adjust premiums.
🏃
Using a fitness tracker
Heart rate patterns, sleep quality, exercise frequency, GPS routes of your runs, menstrual cycle tracking, stress levels. Fitbit's privacy policy allows sharing "de-identified" data with third parties. In 2018, Strava's heatmap accidentally revealed the locations of secret U.S. military bases because soldiers wore fitness trackers while jogging.
🗣️
Talking near a smart speaker
Amazon confirmed that Alexa recordings are reviewed by human employees. In 2019, a Bloomberg investigation found thousands of Amazon workers listening to recordings from Echo devices, including conversations users assumed were private. Smart speakers are always listening for their wake word — which means they're always processing audio.
👍
Scrolling social media for 30 minutes
Every post you pause on, every video you watch to the end, every link you almost click but don't — it's all tracked. Academic research on Facebook likes has shown that personality traits, political affiliation, sexual orientation, and drug use can be predicted from likes alone. A 2015 study found that with 300 likes, an algorithm knew you better than your spouse.
📧
Sending an email
Gmail scanned email content for ad targeting until Google ended the practice in 2017. Email metadata — who you emailed, when, how often — reveals your social network, work relationships, and communication patterns. The NSA collected email metadata on millions of Americans under bulk-collection programs exposed in the Snowden disclosures.
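The point about metadata is worth making concrete: even with zero message content, who-emailed-whom records reconstruct your social graph. A minimal sketch with fabricated addresses (the records and counts below are invented for illustration):

```python
from collections import Counter

# Hypothetical metadata: (sender, recipient) pairs only — no content at all.
metadata = [
    ("alice@example.com", "bob@example.com"),
    ("alice@example.com", "bob@example.com"),
    ("alice@example.com", "carol@example.com"),
    ("bob@example.com", "alice@example.com"),
]

# Treating each pair as an undirected edge, the weight of every
# relationship falls straight out of the message counts.
edges = Counter(frozenset(pair) for pair in metadata)

strongest = edges.most_common(1)[0]
print(strongest)  # the alice<->bob tie, weight 3
```

Three log lines are enough to rank Alice's relationships; scale that to years of traffic and the graph reveals workplaces, families, and associations — which is why metadata collection is surveillance even without reading a single message.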
The Privacy Paradox

You just tapped each item to see what data it collects — but you probably won't stop doing any of these things. That's the privacy paradox: people consistently say they value privacy while behaving in ways that sacrifice it. We trade our data for convenience because the cost feels invisible and the benefit feels immediate.

Part 3

Biometric Data: The Data You Can't Change

Not all personal data is equal. If your password is stolen, you change it. If your credit card is compromised, you get a new one. But what happens when the data that's compromised IS your body?

👆
FINGERPRINTS

Unique. Permanent. Stored by phone manufacturers, border agencies, and law enforcement globally. Once compromised, compromised forever.

👤
FACIAL GEOMETRY

128+ measurements of your face. Collected without consent by cameras in stores, airports, stadiums. You can't opt out of having a face.

👁️
IRIS PATTERNS

More unique than fingerprints. Used in border control and increasingly in commercial authentication. Worldcoin scanned irises of millions in developing nations in exchange for cryptocurrency.
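The "measurements" these systems store are typically a numeric embedding vector, and identification is a nearest-neighbor search: compare the probe vector against every enrolled vector and accept the closest match under a distance threshold. A toy sketch — the 4-dimensional vectors, names, and threshold below are all invented (real systems use 128+ dimensions and learned metrics):

```python
import math

def euclidean(a, b):
    """Straight-line distance between two embedding vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Toy enrolled database of 4-dimensional "face embeddings".
database = {
    "person_A": [0.11, 0.52, 0.33, 0.90],
    "person_B": [0.75, 0.10, 0.61, 0.22],
}

def identify(probe, db, threshold=0.3):
    """Return the closest enrolled identity, or None if nothing is near enough."""
    name, dist = min(((n, euclidean(probe, v)) for n, v in db.items()),
                     key=lambda t: t[1])
    return name if dist <= threshold else None

print(identify([0.12, 0.50, 0.35, 0.88], database))  # person_A
print(identify([0.99, 0.99, 0.99, 0.99], database))  # None — no close match
```

The threshold is the whole ballgame: set it loose and innocent people match; set it tight and the system misses. That tradeoff is where the bias-in-accuracy questions from the module's branching scenario live.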

Why Biometric Data Is Different
Regular data breach:

Password leaked → change password. Credit card stolen → new card issued. SSN compromised → credit monitoring. The damage is containable.

Biometric data breach:

Facial geometry leaked → you can't get a new face. Fingerprints compromised → your fingerprints are compromised for life. There is no reset button.

Illinois BIPA

Illinois was the first U.S. state to pass biometric privacy legislation (BIPA, 2008). It requires informed consent before collecting biometric data and allows individuals to sue for violations. Facebook paid $650 million in 2021 to settle a BIPA class-action lawsuit for using facial recognition on user photos without consent. This single law has done more to constrain biometric data collection than any federal regulation.

Part 4

Case Study: Clearview AI — The Company That Scraped Your Face

In January 2020, the New York Times revealed that a small startup called Clearview AI had built the most powerful facial recognition tool in the world — by scraping more than three billion photos from social media platforms without anyone's consent.

2017-2019
Clearview AI quietly scrapes billions of photos from Facebook, Instagram, LinkedIn, Twitter, Venmo, and millions of other websites. No user consented. No platform authorized it.
JAN 2020
New York Times investigation reveals Clearview's existence. The app lets law enforcement upload any photo and instantly match it against a database of more than three billion images, returning the person's name, social media profiles, and associated information.
FEB 2020
Clearview's client list is breached. It reveals that more than 2,200 law enforcement agencies, including the FBI and DHS, had been using the tool — many without public knowledge or departmental authorization.
2020-2022
Facebook, Google, LinkedIn, Twitter, and YouTube send cease-and-desist letters demanding Clearview stop scraping. Clearview argues the First Amendment protects its right to collect publicly available images.
2022
ACLU settles a lawsuit: Clearview banned from selling its database to private companies in the U.S. But it can still sell to law enforcement. The UK and Australia fine Clearview for violating privacy laws. The company continues operating.
2024
Clearview claims 40+ billion images in its database. It pitches the Pentagon for military applications. The fundamental question remains unresolved: can a company build an identification system from your public photos without your knowledge or consent?
What Clearview Reveals

Every photo you've ever posted publicly is potentially in a law enforcement facial recognition database right now. You weren't asked. You weren't notified. And in most U.S. jurisdictions, it's legal — because no federal law prevents it. Clearview didn't break the law. The law doesn't exist yet.

Part 5

When Surveillance Becomes Control

There's a difference between watching and controlling. AI-powered surveillance crosses that line when surveillance data is used to reward or punish behavior at societal scale.

China's Social Credit System

Multiple municipal systems track citizen behavior — jaywalking, traffic violations, social media posts, purchasing habits, who you associate with, whether you pay bills on time. Points are added for "positive" behavior and deducted for "negative" behavior.

HIGH SCORE = REWARDS

Priority access to loans, better apartment rentals, fast-tracked visa applications, public recognition, discounts on energy bills.

LOW SCORE = PUNISHMENT

Banned from flights and trains, children denied access to top schools, public shaming on billboards, restricted internet speeds, inability to book hotels.

Function Creep: It Happens Everywhere

Don't assume this is only a China problem. Technology rarely stays limited to its original purpose:

COVID contact-tracing apps in multiple countries were opened to law enforcement access, despite promises that the data would be used only for public health.
License plate readers deployed for toll collection now feed into police surveillance networks tracking all vehicle movements.
Student proctoring software installed for exam integrity now monitors eye movement, background noise, and room contents — in students' bedrooms.
Workplace productivity tools marketed for "collaboration" now track keystrokes, screenshots, and mouse movements every few minutes.

The Pattern

Every surveillance technology follows the same arc: deployed for a narrow, defensible purpose → expanded to adjacent uses → normalized until resistance fades → extended to purposes that would have been unacceptable if proposed originally. This is function creep, and AI accelerates every stage.

Part 6

Privacy Law: Two Fundamentally Different Approaches

How a society protects privacy depends on whether it treats privacy as a fundamental right or a consumer preference.

EUROPEAN UNION — GDPR

Privacy as a Human Right

Consent required before collecting data. Data minimization — collect only what's necessary. Purpose limitation — use data only for the stated purpose. Right to erasure — request deletion of your data. Data portability — take your data to a competitor. Breach notification — 72 hours to disclose. Fines up to 4% of global annual revenue. Applies to any company serving EU residents, regardless of where the company is based.

UNITED STATES

Privacy as Sector-Specific Rules

No comprehensive federal privacy law. Instead: HIPAA (health), FERPA (education), COPPA (children under 13), GLBA (financial). Gaps everywhere — no protection for general consumer data, browsing history, location tracking, or social media data. The third-party doctrine means data shared with companies often loses Fourth Amendment protection. California's CCPA/CPRA is the strongest state law but only covers California residents.

Key Privacy Principles (from GDPR)
Data Minimization: Collect only what you need. Don't hoard data "just in case."
Purpose Limitation: Data collected for one reason can't be repurposed for another without new consent.
Storage Limitation: Delete data when the original purpose is fulfilled. Don't keep it forever.
Accountability: The data controller must prove compliance, not just claim it.
For Students

FERPA protects your educational records — grades, enrollment status, financial aid. But it doesn't protect the data generated by the LMS, proctoring software, or campus WiFi that tracks which buildings you enter. The gap between what's legally protected and what's actually collected grows wider every semester.

Interactive Exercise

The Consent Problem

Below is an actual excerpt from a social media platform's terms of service (paraphrased for clarity). Read it, then make your choice.

What Would You Do?

Branching Scenario: AI on Campus

Stage 1 of 3

Your college announces a new "campus safety initiative." AI-powered cameras will be installed in parking lots, building entrances, and common areas. The system uses facial recognition to identify unauthorized individuals and alert campus security in real time. The administration says it will reduce theft and improve response times to emergencies.

How do you respond as a student?

Support it — campus safety is a real concern and this will help
Demand transparency — what data is collected, who has access, how long is it stored?
Oppose it — constant facial recognition surveillance has no place on a college campus
Consider the Tradeoff

Safety matters — but what are you trading for it? The system now tracks every student's location on campus throughout the day. It knows what time you arrive, which buildings you enter, who you walk with, and when you leave. That's not just a security camera. It's a comprehensive behavioral monitoring system. Is the safety improvement worth creating a permanent record of every student's daily movements?

The Right Questions

Transparency is the minimum standard. Key questions: Is the facial recognition database shared with law enforcement? Are students notified they're being tracked? Can students opt out? What happens to the data after graduation? Is the system audited for racial bias in identification accuracy (remember Gender Shades)? Without answers to these questions, "consent" is meaningless.

Principled Opposition

Multiple universities have banned facial recognition on campus — including the University of Michigan and Georgetown. The argument: a campus that surveils its students creates a chilling effect on free expression, political activism, and the exploration that college is supposed to encourage. Students who know they're being watched behave differently. That changes the nature of the institution.

AI Interaction Lab

Explore Privacy & Surveillance With a Live AI

Ask about surveillance technology, privacy rights, data collection, or anything from this module.

Module 4 Checkpoint

Your Key Takeaways

Seven concepts you need to carry forward.

🔭

AI Surveillance Is Different in Kind

Not just more cameras — infinite scale, 24/7 persistence, retroactive search capability. It changes "what is this specific person doing right now?" to "what has everyone been doing, everywhere, all the time?"

👣

Your Data Trail

Weather apps track location. Grocery cards reveal health conditions. Fitness trackers expose military bases. Smart speakers record conversations. Social media knows you better than your spouse.

👆

Biometric Data Is Permanent

You can change a stolen password. You can't change your face, fingerprints, or iris patterns. Once biometric data is compromised, it's compromised for life.

📸

Clearview AI

30+ billion photos scraped without consent. Used by 2,200+ law enforcement agencies. Every public photo you've posted is potentially in a facial recognition database right now.

🔄

Function Creep

Surveillance tech deployed for one purpose always expands. COVID apps → law enforcement. Toll cameras → vehicle tracking. Exam proctoring → bedroom monitoring. The pattern is universal.

🌍

GDPR vs. U.S. Approach

EU: privacy as a human right (GDPR). U.S.: no comprehensive federal law, sector-specific rules with massive gaps. Your browsing history, location data, and social media data have no federal protection.

🤝

Manufactured Consent

When opting out means losing access to essential services, consent isn't voluntary — it's coerced. 76 work days per year to read every privacy policy. The consent model is broken.

Module 4 Assessment

Check Your Understanding

5 questions drawn from the module. You need 80% to pass.


STUDY GUIDE

Download the study guide for this module as a reference.

📄 Download Module 04 Study Guide