By Evan Greer and Evan Selinger
People say kids these days don’t care about privacy. But they are wrong. Just because TikTok videos and Instagram selfies are popular doesn’t mean privacy is dead.
Young people around the world are setting boundaries for their data and protecting their right to privacy. Last week, backlash from students forced UCLA to drop its controversial plan to become one of the first major universities in the United States to use facial recognition surveillance to scan and analyze the faces of everyone on campus. And at Oakland Community College, administrators initially told students they weren’t allowed to hold a campus forum about the school’s potential use of face surveillance — but the school caved under pressure after students and activists raised First Amendment concerns.
These victories are part of a growing movement to prevent an Orwellian technology from turning the college experience into a Black Mirror episode. More than 50 prominent institutions of higher learning, including Harvard, MIT, University of Michigan, and Columbia, have taken a stand for their students’ basic rights by confirming that they have no plans to use facial recognition technology on their campuses. But dozens of other schools have refused to answer even basic questions about whether or not they’re considering experimenting with face surveillance or using it on their campus population.
Companies hoping to make a ton of money selling facial recognition software are aggressively marketing it to schools. They make all kinds of claims about expected benefits in their pitch: everything from deterring crime to busting students for smoking and preventing potential school shootings.
Don’t believe the hype or fall for the misguided conviction that meager gains are worth the terrible costs of obtaining them. In reality, facial recognition systems make campuses less safe, a lesson high schools are learning the hard way. Appeals to preventing “the next Parkland” are emotionally hard-hitting, and installing facial recognition systems might seem like a sound proactive measure, but the facts suggest otherwise. Simple techno-fixes aren’t going to be effective, and facial recognition technology can make schools feel more like prisons, especially for marginalized students who already feel watched at every turn.
Imagine walking down the quad on your way to class, stressing about a test, when you’re suddenly swarmed by armed police or campus safety because a facial recognition algorithm falsely matched you with someone on a non-transparent watch list. Or getting locked out of your dorm in the rain because the camera that scans your face for entry didn’t recognize you. Sure, it can be inconvenient to bring a meal card or wallet to the cafeteria, but wouldn’t it be worse to get charged for another student’s purchases because an automated payment system — one prone to racially biased misidentification — decided you look the same? Or for hackers to sell a stolen database of everyone’s face scans, including yours?
Even if you trust your campus administration to use the technology responsibly, abuse is inevitable. AI companies in the facial recognition business tell us that “data breaches are a part of life,” and history has shown that policies against misusing surveillance technology aren’t infallible. What happens when a criminal, a law enforcement agency, or just a straight-up stalker gains access to the system?
And the U.S. government’s own studies show that commercial-grade facial recognition software exhibits systemic racial and gender bias. You’re way more likely to get screwed over by the software if you’re a student of color, a woman, or transgender. Implementing this technology on college campuses will automate and exacerbate existing discrimination within academic institutions and the criminal justice system. For students of color, and LGBTQ+ students, the impacts of facial recognition could be deadly, or land an innocent person in jail.
We have no idea what the long-term psychological impacts of experimenting on students with this technology are, but we can guess: Students under constant surveillance will be further stressed out, and that anxiety can compromise academic performance and overall wellness.
Well-intentioned administrators and teachers might believe that limiting the use of facial recognition technology to non-threatening situations can mitigate those risks — think about the benefits of speeding up lines at big campus events or taking attendance in large lecture halls. Unfortunately, as the pushback against facial recognition at concerts demonstrates, there’s no such thing as a truly safe way to deploy this technology. “A future where we are constantly subjected to corporate and government surveillance is not inevitable,” Tom Morello of Rage Against the Machine wrote for BuzzFeed, “but it’s coming fast unless we act now.”
Given the many ways the technology can be used and the ease of adding its functions to existing cameras, any deployment will normalize the practice of handing our sensitive biometric information over to private institutions just to get an education. Indeed, some educators are using facial recognition technology to infer students’ emotions — to determine, for instance, whether they find material engaging or boring. But facial characterization tends to be underwritten by junk science, and integrating it into education risks dehumanizing students and favoring overly reductive approaches to teaching. Frankly, students can always be taught and assessed in less privacy-invasive ways.
Some facial recognition proponents claim the big problems will go away once the technology improves. This isn’t true. If facial surveillance ever works perfectly, it will be even more dangerous. When students can be tracked everywhere they go, they’ll be anxious about exercising free speech and free association — whether it’s meeting up to discuss controversial ideas or organizing a protest. Indeed, the mere prospect of widespread facial surveillance will have a chilling effect on campus expression. Students who are afraid to be themselves and express themselves will pull back from crucial opportunities to experience intellectual growth and self-development — and students from marginalized communities will be the most affected.
Facial recognition technology is uniquely dangerous. That’s why thousands of students, faculty, and alumni have joined a campaign launched by Fight for the Future and Students for a Sensible Drug Policy calling on administrators to ban the use of this technology for campus surveillance. Groups like the ACLU, Mijente, Color of Change, and the National Center for Transgender Equality endorsed it, too. Students across the country are planning a national day of action on March 2, and it’s easy to get involved.
Facial recognition technology isn’t widely used on college campuses yet. Let’s keep it that way.
Evan Selinger is a Professor of Philosophy at Rochester Institute of Technology.