Module 4: Privacy & Surveillance
Surveillance isn't new. Governments have monitored citizens for centuries. What's new is the scale, persistence, and intelligence that AI brings to it.
Traditional surveillance requires a person watching a camera feed, reading intercepted mail, or following a subject. It is limited by human attention, fatigue, and budget: a city with 1,000 cameras needs hundreds of officers to watch them, and most footage is never reviewed.
AI surveillance removes those limits. One system monitors 1,000 cameras simultaneously. It never gets tired. It identifies faces, tracks movements, detects "anomalous behavior," and cross-references against databases, all in real time. It can retroactively search weeks of stored footage in seconds.
Traditional surveillance asks: "Is this specific person doing something suspicious right now?" AI surveillance asks: "What has everyone been doing, everywhere, all the time?" That's not a difference in degree. It's a difference in kind.
Most people think of surveillance as cameras and wiretaps. The reality is that you generate surveillance data voluntarily, continuously, through devices you carry willingly. Tap each activity to see what data it creates.
You just tapped each item to see what data it collects — but you probably won't stop doing any of these things. That's the privacy paradox: people consistently say they value privacy while behaving in ways that sacrifice it. We trade our data for convenience because the cost feels invisible and the benefit feels immediate.
Not all personal data is equal. If your password is stolen, you change it. If your credit card is compromised, you get a new one. But what happens when the data that's compromised IS your body?
Fingerprints: unique, permanent, and stored by phone manufacturers, border agencies, and law enforcement globally. Once compromised, compromised forever.
Facial geometry: 128+ measurements of your face, collected without consent by cameras in stores, airports, and stadiums. You can't opt out of having a face.
Iris patterns: more unique than fingerprints, used in border control and increasingly in commercial authentication. Worldcoin scanned the irises of millions of people in developing nations in exchange for cryptocurrency.
Password leaked → change password. Credit card stolen → new card issued. SSN compromised → credit monitoring. The damage is containable.
Facial geometry leaked → you can't get a new face. Fingerprints compromised → your fingerprints are compromised for life. There is no reset button.
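The "128+ measurements" above describes a face embedding: a list of numbers summarizing facial geometry, where similar faces produce nearby vectors. Matching then reduces to a distance comparison. A minimal sketch, using invented 4-dimensional vectors standing in for real 128-dimensional embeddings (the 0.6 cutoff is a commonly cited default, but the right threshold depends on the model):

```python
import math

def euclidean(a, b):
    """Straight-line distance between two embedding vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def same_person(emb_a, emb_b, threshold=0.6):
    """Declare a match when the embeddings are closer than the threshold."""
    return euclidean(emb_a, emb_b) < threshold

# Toy 4-d "embeddings" standing in for real 128-d ones (invented values).
probe    = [0.10, 0.32, 0.55, 0.77]
match    = [0.12, 0.30, 0.54, 0.79]   # different photo, same face: nearby vector
stranger = [0.90, 0.10, 0.20, 0.05]   # different face: distant vector

print(same_person(probe, match))      # True
print(same_person(probe, stranger))   # False
```

This is also why a leaked embedding is permanent: anyone holding your vector can keep matching it against new photos, and you can't rotate your face the way you rotate a password.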
Illinois was the first U.S. state to pass biometric privacy legislation (BIPA, 2008). It requires informed consent before collecting biometric data and allows individuals to sue for violations. Facebook paid $650 million in 2021 to settle a BIPA class-action lawsuit for using facial recognition on user photos without consent. This single law has done more to constrain biometric data collection than any federal regulation.
In January 2020, the New York Times revealed that a small startup called Clearview AI had built one of the most powerful facial recognition tools in the world, by scraping photos from social media platforms without anyone's consent. The database, around 3 billion photos at the time of the report, has since grown past 30 billion.
Every photo you've ever posted publicly is potentially in a law enforcement facial recognition database right now. You weren't asked. You weren't notified. And in most U.S. jurisdictions it's legal, because no federal law prevents it. Outside states with biometric statutes like Illinois, Clearview didn't break the law. The law doesn't exist yet.
There's a difference between watching and controlling. AI-powered surveillance crosses that line when surveillance data is used to reward or punish behavior at societal scale.
In China's social credit experiments, multiple municipal systems track citizen behavior: jaywalking, traffic violations, social media posts, purchasing habits, who you associate with, whether you pay bills on time. Points are added for "positive" behavior and deducted for "negative" behavior.
High scores earn priority access to loans, better apartment rentals, fast-tracked visa applications, public recognition, and discounts on energy bills.
Low scores bring bans from flights and trains, children denied access to top schools, public shaming on billboards, restricted internet speeds, and inability to book hotels.
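Mechanically, such a system is just a running ledger with thresholds gating access to services. A deliberately simplified sketch, with invented point values and cutoffs (not the rules of any real deployment):

```python
SCORE_EVENTS = {              # invented point values, for illustration only
    "pay_bill_on_time":   +5,
    "volunteer_work":    +10,
    "jaywalking":         -5,
    "late_loan_payment": -20,
}

PRIVILEGE_THRESHOLDS = {      # invented cutoffs gating services
    "fast_track_visa": 120,
    "book_flight":      80,
    "book_hotel":       60,
}

def apply_events(start_score, events):
    """Return the score after adding/deducting points for each observed event."""
    return start_score + sum(SCORE_EVENTS[e] for e in events)

def allowed(score, privilege):
    """A privilege is granted only at or above its threshold."""
    return score >= PRIVILEGE_THRESHOLDS[privilege]

score = apply_events(100, ["pay_bill_on_time", "jaywalking", "late_loan_payment"])
print(score)                              # 100 + 5 - 5 - 20 = 80
print(allowed(score, "book_flight"))      # True: exactly at the cutoff
print(allowed(score, "fast_track_visa"))  # False
```

Even the toy makes the watching/controlling line visible: once every observed behavior carries a point value, surveillance data stops being a record and becomes a lever.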
Don't assume this is only a China problem. Technology rarely stays limited to its original purpose:
• COVID contact-tracing apps in multiple countries were repurposed for law enforcement access after the pandemic.
• License plate readers deployed for toll collection now feed into police surveillance networks tracking all vehicle movements.
• Student proctoring software installed for exam integrity now monitors eye movement, background noise, and room contents — in students' bedrooms.
• Workplace productivity tools marketed for "collaboration" now track keystrokes, screenshots, and mouse movements every few minutes.
Every surveillance technology follows the same arc: deployed for a narrow, defensible purpose → expanded to adjacent uses → normalized until resistance fades → extended to purposes that would have been unacceptable if proposed originally. This is function creep, and AI accelerates every stage.
How a society protects privacy depends on whether it treats privacy as a fundamental right or a consumer preference.
• Consent required before collecting data.
• Data minimization — collect only what's necessary.
• Purpose limitation — use data only for the stated purpose.
• Right to erasure — request deletion of your data.
• Data portability — take your data to a competitor.
• Breach notification — 72 hours to disclose.
• Fines up to 4% of global revenue.
Applies to any company serving EU residents, regardless of where the company is based.
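Purpose limitation and the right to erasure are the most mechanically checkable of the GDPR rules above. A toy sketch of how a system might enforce them (a hypothetical record type, not any real compliance API):

```python
class ConsentError(Exception):
    """Raised when data is used for a purpose the subject never consented to."""

class PersonalRecord:
    """Stores a value together with the purposes the subject consented to."""

    def __init__(self, value, consented_purposes):
        self.value = value
        self.consented_purposes = set(consented_purposes)

    def use(self, purpose):
        # Purpose limitation: reject any use outside the consented set.
        if purpose not in self.consented_purposes:
            raise ConsentError(f"no consent for purpose: {purpose}")
        return self.value

    def erase(self):
        # Right to erasure: drop the value and all recorded consents.
        self.value = None
        self.consented_purposes.clear()

email = PersonalRecord("student@example.edu", {"account_recovery"})
print(email.use("account_recovery"))   # allowed: the stated purpose
try:
    email.use("ad_targeting")          # blocked: consent was never given
except ConsentError as err:
    print(err)
```

The design choice worth noticing: consent travels with the data itself, so every access point has to answer "for what purpose?" rather than trusting callers to remember.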
No comprehensive federal privacy law. Instead: HIPAA (health), FERPA (education), COPPA (children under 13), GLBA (financial). Gaps everywhere — no protection for general consumer data, browsing history, location tracking, or social media data. The third-party doctrine means data shared with companies often loses Fourth Amendment protection. California's CCPA/CPRA is the strongest state law but only covers California residents.
FERPA protects your educational records — grades, enrollment status, financial aid. But it doesn't protect the data generated by the LMS, proctoring software, or campus WiFi that tracks which buildings you enter. The gap between what's legally protected and what's actually collected grows wider every semester.
Your college announces a new "campus safety initiative." AI-powered cameras will be installed in parking lots, building entrances, and common areas. The system uses facial recognition to identify unauthorized individuals and alert campus security in real time. The administration says it will reduce theft and improve response times to emergencies.
How do you respond as a student?
Safety matters — but what are you trading for it? The system now tracks every student's location on campus throughout the day. It knows what time you arrive, which buildings you enter, who you walk with, and when you leave. That's not just a security camera. It's a comprehensive behavioral monitoring system. Is the safety improvement worth creating a permanent record of every student's daily movements?
Transparency is the minimum standard. Key questions: Is the facial recognition database shared with law enforcement? Are students notified they're being tracked? Can students opt out? What happens to the data after graduation? Is the system audited for racial bias in identification accuracy (remember Gender Shades)? Without answers to these questions, "consent" is meaningless.
Multiple universities have banned facial recognition on campus — including the University of Michigan and Georgetown. The argument: a campus that surveils its students creates a chilling effect on free expression, political activism, and the exploration that college is supposed to encourage. Students who know they're being watched behave differently. That changes the nature of the institution.
Ask about surveillance technology, privacy rights, data collection, or anything from this module.
Seven concepts you need to carry forward.
Not just more cameras — infinite scale, 24/7 persistence, retroactive search capability. Changes "what is this specific person doing right now?" to "what has everyone been doing, everywhere, all the time?"
Weather apps track location. Grocery cards reveal health conditions. Fitness trackers expose military bases. Smart speakers record conversations. Social media knows you better than your spouse.
You can change a stolen password. You can't change your face, fingerprints, or iris patterns. Once biometric data is compromised, it's compromised for life.
30+ billion photos scraped without consent. Used by 2,200+ law enforcement agencies. Every public photo you've posted is potentially in a facial recognition database right now.
Surveillance tech deployed for one purpose always expands. COVID apps → law enforcement. Toll cameras → vehicle tracking. Exam proctoring → bedroom monitoring. The pattern is universal.
EU: privacy as a human right (GDPR). U.S.: no comprehensive federal law, sector-specific rules with massive gaps. Your browsing history, location data, and social media data have no federal protection.
When opting out means losing access to essential services, consent isn't voluntary — it's coerced. Reading every privacy policy you encounter would take an estimated 76 work days per year. The consent model is broken.
5 questions drawn from the module. You need 80% to pass.