[F]law School Episode 1: “Suppression by Surveillance”
How Corporate Technologies Fuel Crackdowns on College Protests
Jessenia Class
September 1, 2024
Summary:
As campus protests swept the nation last spring, invasive surveillance technology put protestors and student organizers in precarious positions. Through targeted fear-mongering, tech surveillance companies changed cities' and universities' perceptions of activism. From racially biased facial recognition to predatory cell towers, corporations stripped protestors of constitutionally protected speech and manifestations of democracy for profit. In this episode, Jessenia Class joins Sam Perri and Reya Singh to break down protest suppression tactics, corporations' strategies for dodging accountability, and the importance of storytelling as a pathway to justice.
Guest Bio:
Jessenia Class is a law student at Harvard Law School and a graduate of Harvard College. Before law school, Jessenia was a program associate at an organization that engaged in public interest law and philanthropy.
Editors:
Special thanks to Mirei Saneyoshi and Pragnya Vella for production and editing assistance.
Chapter Markers:
- Introduction [0:00]
- Campus Protests in 2024 [3:10]
- Dangers of Surveillance Technology [5:28]
- Misleading Corporate Narratives [8:00]
- Protest Suppression [11:28]
- Holding Corporations Accountable [15:21]
- Ruse of Consent for Protestors [19:33]
- Corporate Power Imbalances [21:18]
- Impact of Storytelling [23:22]
- Perseverance Under Surveillance [25:27]
- Action Steps & Resources [27:38]
[F]law Resources:
- Suppression by Surveillance, by Jessenia Class ([F]law Article)
- Corporations are Keeping Cop City Alive, by Jessenia Class
- Shots Fired, and Profited On: Inside the Campaign against “ShotSpotter” in Chicago, by Anna Bower
- Nowhere to Hide: A World Without Privacy, by Jessica Grubesic
- Surveillance Advertising, by Amer Mneimneh
Additional Resources:
- Chronicle: College Protest Encampments
- Supreme Court: Grants Pass Opinion (pg. 20 references recent student protests)
- NYT: Student Protestors Doxxed
- ACLU: Protest Surveillance Technology
- Washington Post: SpotterEdu Location Tracking
- Professor Kyle Jones: Student Privacy Researcher
- NLC: Facial Recognition Technology
- Medium: Facial Recognition Minority Biases
- Vox: Facial Recognition Surveillance
- WIRED: Rekognition’s Facial Recognition Harms
- ACLU: 2018 Shareholders’ Letter on Pausing Facial Recognition
- ACLU: Know Your Rights Packet
- Surveillance Technology Oversight Project
- Electronic Frontier Foundation
- The Age of Surveillance Capitalism Book
Listen, rate, and subscribe!
- Podcast Home: Our podcast episodes can be found at flawschool.org
- [F]law Website: Find more [F]law articles and content at theflaw.org
- Systemic Justice Project: All [F]law content is a product of the Systemic Justice Project at systemicjustice.org
- Newsletter Sign-Up: Subscribe to receive curated content from The [F]law here.
- Contact Us: Have questions, comments or feedback? Reach out to us at justice@law.harvard.edu.
- Listen to [F]law School on Your Favorite Platform:
If you enjoyed this episode of [F]law School, please leave us a review wherever you listen to podcasts! Class dismissed!
Episode Transcript
(This transcript was created by an automated process, may contain errors, and does not include the introductions.)
Sam Perri: Hey, welcome to Flaw School, a podcast that explores the flaws in our legal system. We’re today’s hosts. I’m Sam Perri.
Reya Singh: And I’m Reya Singh.
Sam Perri: And we are so excited to be hosting Flaw School’s first ever episode.
Reya Singh: Every two weeks, we interview law students to uncover the role of corporate actors in producing many of our most urgent social problems, and the troubling tale of corporate actors shaping, bending, capturing, and breaking the law in their favor. In this episode, we’ll be discussing suppression by surveillance.
Sam Perri: Today we’re joined by Jessenia Class, one of the most thoughtful, kind, and passionate people that I know, who just so happened to write an incredible article that we have the pleasure of discussing with her today. Welcome to Flaw School, Jessenia.
Jessenia Class: Thanks for having me.
Sam Perri: All right. Class is in session. Can you start by giving us just a little bit of information about you? Just so the listeners can get to know you.
Jessenia Class: My name is Jessenia. I’m a current law student, and I will be doing civil rights work after law school.
Sam Perri: Awesome. And can you tell us just a little bit about your paper, for someone who hasn’t read the article before? What should they know going into this episode?
Jessenia Class: So I’ll be talking about a paper I wrote recently about surveillance technology in cities and universities. And then we’ll take a step back and look at the corporations that push for these technologies and what they were saying and what kind of threat this poses to people who are doing activism.
Reya Singh: Thank you so much, Jessenia. I guess I want to start with a really important question: what inspired you to write this article?
Jessenia Class: Yeah. There’s just been so much activity in the past year. You look across all schools, law schools, undergraduate schools, and big cities. You’re seeing so much organizing and movement work and people rallying together to put their feet to the ground and really embody their principles and push the actors around them to change. And there’s been so much activity that even the Supreme Court had to say something about it in a recent opinion. But that has had consequences. A lot of corporate actors and, from my perspective, big law firms have taken action against people, particularly student organizers, for participating in these protests. So I saw all of this around me, and I grew curious about the steps that universities were taking to respond to organizing on their campuses. And then this snowballed into research around the surveillance structures that universities, and cities too, already have in place. And I thought about the application of these surveillance structures to protests and the corporate actors behind them.
Reya Singh: Thank you, Jessenia. Talking more about your perspective, can you tell us a little more about what you’ve experienced and learned as a law student in terms of student activism and protesting, and what you’ve witnessed take place on campus?
Jessenia Class: It’s been really tough seeing the actions that have been taken against students for participating in protests. One really common example has been that a lot of students have been doxxed. And for people who aren’t familiar, doxxing is when people’s personally identifying information is posted online. And this happened a lot, unfortunately, with people who were participating in protests. In my view, a lot of students had their information posted on so many different websites. There were buses circling around the campuses, shaming these students. And it’s been really troubling to see, and it really, in my view, centered the conversation around the power that these corporations and technologies have to impact people’s lives.
Reya Singh: What sort of technology is being used to surveil student activism and protests?
Jessenia Class: So in the article I talk about two different buckets of surveillance technology. The first one is Stingrays. These are cell site simulators, often used by law enforcement, and they are generally used to track people’s location. We’ve been seeing a lot of different actors use this kind of technology, often bouncing off of cell phone towers or using local Wi-Fi networks in order to locate individuals. Certain universities have also been taking advantage of this technology. For example, Syracuse University uses a program called SpotterEdu that allows it to track its students’ locations to see whether they’re in class, for tracking attendance. Another technology that I talk about in the paper is, as many people know, facial recognition technology, which universities and cities have been using in order to identify students. And this has been used in a lot of different ways. For example, the University of Miami used facial recognition systems to locate students who had participated in a protest for cafeteria workers on campus.
Sam Perri: As a public defender, I hear all of this and I’m like, oh my God, there is so much harm that can come with these technologies. But for our listeners who might not have all the pieces pulled together, or who are just straight-up curious, what concerns come to mind for you when you think about these technologies, especially when it comes to activism?
Jessenia Class: There are a lot of concerns with this technology. One of the first things that jumps to mind is that often the technology is built upon really faulty data structures. For example, one of the most prominent examples of this is facial recognition technology that misidentifies people who are black and brown and come from minority communities. Often the data is trained on your stereotypical middle-aged white man, and it doesn’t identify people who don’t fit that mold. So that has been used really negatively against these populations.
Jessenia Class: So the data set is based on biased information. But even if the data got better, there would still be so many other issues with this technology. You could think of the privacy concerns or the due process concerns of taking this information from people. There are just so many different ways to think about how tricky and troubling the integration of this technology into our cities and universities is.
Sam Perri: Okay, so lots of issues with the data. With facial recognition, just thinking about how it can have such a huge impact on a person’s life if they’re mis-recognized and misidentified, it’s a little bit mind-blowing. And I imagine that as these technologies continue to be used over time, more of these problems have become visible, right? So with this increased visibility of the problems we’re seeing with these technologies, has that had any negative impact on their use? Do we see them being used less, or with more hesitation or caution, or are we continuing to see a rise in their use?
Jessenia Class: Yeah. To me, this reads like a question of which stories are coming across most prominently. And what I talk a lot about in this paper is that corporations are wielding their narratives to kind of put our concerns with the data and the structures themselves to the side, and to focus instead on the needs: why do we need this kind of technology? So there are so many examples that I talk about in this paper of different corporations and their actors drumming up fear in cities and in universities to convince consumers to buy their products. A great example of this is Clearview AI. They engaged in a lot of marketing and conversations with police departments to convince them that their technology would help them protect children and enhance the safety of communities across the country. And other corporations did the same thing. Take Panasonic, for example. They released a marketing video that showed surveillance of students’ social media to assess quote unquote threats and really drummed up concern around school-based violence to say, hey, buy our product, we will bring the solution to you. And so, despite there being a lot of very valid concerns with the data and the biases therein, corporations are spinning the conversation. They’re saying, hey, we can solve these safety problems. Don’t look at what our technologies do, but look at how we can solve these concerns, concerns that, I might argue, they’re drumming up themselves, particularly as they relate to protests and political demonstrations. But that conversational switch is really important, I think, in convincing a lot of people that this technology is necessary and making it more pervasive.
Reya Singh: I think that’s so interesting and, honestly, very frightening, because it’s exactly what you suggest. They’re spinning the conversation in a way that’s like gaslighting, being manipulative with something that is taking freedoms away and infringing on rights, and they are able to spin it in a way to make it seem so important and so needed. As someone in undergrad, and as someone who was very much watching and involved in the protests on my campus, I can’t help but have fear and, honestly, disgust for the way that these narratives are being spun. But I do want to ask: as these universities and cities become more comfortable with using these technologies, how are they using them, and how might they use them in the future?
Jessenia Class: Yeah, I think those fears are really valid. My argument in the paper is in part speculation, but in part I track that this is already happening in cities and universities. I largely speculate that it’s going to become more and more common for universities and cities to, quote unquote, deal with protests, organizing, and movement building by relying on these formal technologies, in order to, in a way, suppress students from engaging in this kind of expression of their values and discourage people from joining in these efforts. So that is what I imagine cities and universities are doing with this technology, and it’s what people who are organizing or building movements should look to and consider when they’re thinking about how they should strategize.
Reya Singh: Jessenia, how are the people you’ve talked to and the people you’ve seen reacting to this?
Jessenia Class: It’s a mixed bag, and I think it’s really a personal decision that varies widely. Some people are fearful, and with good reason, of having their information posted online, and of the impact that would have on their careers and their families. Other people are willing to take risks and are so passionate about their cause that they’re committed to doing whatever it takes. Certainly there will be people who will continue to act in the face of more surveillance. But there are people who will be dissuaded from participating. And I think it’s really frightening to think that people will be less interested in expressing their political views as a result of the adoption of this technology in cities and universities.
Sam Perri: Yeah. That’s huge. That really hits home, I think, thinking about how we feel so personally affected by the actions of these corporations, right, who are producing these technologies, and how they’re being employed. You mentioned in your article that the Stingray can mimic cell phone towers and that that creates barriers to cell service. That really struck me; it’s something I hadn’t really thought about. As someone who’s not technologically inclined, I don’t often think about bandwidth. I don’t know why I have bandwidth; I don’t know why I don’t have bandwidth. But this is wild, right? Those people, those corporations with the money to surveil, with the power to surveil, really can control how and when people communicate, whether it’s during some form of activism or just in their daily lives. And so it’s wild to think about the role that surveillance plays during demonstrations and protests, but even outside of those contexts as well. Were you thinking about that, too, when you were writing the article?
Jessenia Class: Yeah, for sure. I totally agree. I focused on some of the examples of Stingrays, or facial recognition technologies, or some of the programs like SpotterEdu that cities and universities are adopting. But your concern is a broader one that I think is right: the more these technologies grow, and there are likely many others that I don’t talk about in this article that will be important, these are all tools that powerful actors are deciding to use without the consent or the input of the people they are largely impacting.
Jessenia Class: And that is going to have a huge role in what I believe is people’s main way of engaging in democracy: expressing their opinion, going to the streets, engaging with their local authorities and the people who hold power. If people are going to be dissuaded from that by these technologies, that’s going to be a really bad thing.
Sam Perri: And that brings me to the question of accountability, right? With such a huge impact, how can we hold accountable both the corporations creating and the corporations employing these technologies? For instance, you mentioned that Amazon shareholders were concerned about Rekognition, the Amazon surveillance software, and I started reading through an article you linked, a WIRED article. My jaw dropped when I read in there that the Amazon Web Services CEO, Andy Jassy, said to employees, quote, if we find people are violating folks’ constitutional rights, they won’t be able to use the services any longer, end quote. So loss of access to the technology is the consequence here. I mean, can we really imagine a world where people who are using, sanctioning the use of, or creating these technologies are actually held accountable for the harms they’re producing, beyond maybe just losing access to the technology? Is it possible, when we think about how a technology can start in some way, shape, or form, and then, when it’s purchased, be adapted by whatever corporation to fit their particular needs? Who can we hold accountable? Can we hold anyone accountable?
Jessenia Class: I like to think so. And I tell myself this as a current law student hoping to do this work in the future: I have to hold onto some hope that there are ways in which we can encourage or push those in power to embrace the changes that we’d like to see in our society and in our systems. And I think there are a couple of tools that we might rely on to make those changes, or at least to pursue them. One that I’m thinking about, especially with Rekognition, as you mentioned, is the group of shareholders that wrote a letter to Jeff Bezos, CEO of Amazon, in 2018, to put pressure on him to reconsider their facial recognition technology. The group used a lot of technical corporate law terms; they talked about how they didn’t see any evidence or documentation of fiduciary oversight. That language is really removed from the actual concern that people have with this technology, but they were leveraging the tools at their disposal in order to try and push actors toward a future that doesn’t rely on these technologies. And that, to me, is positive. Amazon did change their actions: they put a one-year moratorium on the facial recognition software. Of course, that was only one year, but it sparked hope for me and convinces me that this is an option for change to happen. Other tools that I think are incredibly important are local organizing and local power building. You can see examples in the article; I talk about different cities like Somerville, Massachusetts, where people have pushed their local governments to adopt ordinances saying they will not use facial recognition technology. You’ve also seen petitions circulated through which 50 universities across the country have committed to not using facial recognition technology. And there are examples where protest itself, which is very ironic but, I think, key to this discussion, where protest at Northeastern University stopped the university from implementing usage monitors that could be used to track students’ locations at their school. So there are the typical pathways to pursue change, policymaking, pushing your local legislatures, but all of it stems from building power, raising awareness of this issue, and having conversations with people.
Reya Singh: Now, Jessenia, I would love to switch gears a little bit and think about how this topic is intertwined with elements of systemic justice and injustice. In reading your article, I found myself toying with the notion of consent. A university like Columbia, for example, can say that students consented to their text messages being read because they were sent on university Wi-Fi networks. And as someone who really relies on my school’s Wi-Fi network, that was honestly really disturbing to me. I think it’s wrong to say that people use Wi-Fi with the idea that we are comfortable with the owner of the network inspecting and looking at any piece of data or messaging that runs across it. But what do you think about that?
Jessenia Class: Yeah. I think you’re hitting broadly on a theme that I wanted to get across in my article, which is that people, and in this case I focus a lot on organizers and people who are protesting, are being subjected to these systems of surveillance without any consent. Their cities and schools might say that because they’re using the networks, they’re therefore consenting. And then they switch the conversation to put the blame on the protester or the organizer, to say, actually, it’s your fault that you are using these systems, so you’re subjecting yourself to this, when really these are the systems that you are placed in and you don’t have any choice. There’s no bargaining. There’s no opportunity to say, hey, I don’t want to participate in this kind of structure. And so that framing, between this is an individual’s fault as opposed to this is a system’s fault and there are problems with this system that we need to change, that’s something I really wanted to talk a lot about and focus on in this article.
Sam Perri: Yeah. That’s huge. I mean, the idea of consent, and manufactured consent in this case, really points back to an idea we talked about earlier about corporate power and how corporations can wield that power. For instance, what’s coming to mind for me right now is that you mentioned in your article that once law enforcement has identified the cell phone of a protester, maybe through the Wi-Fi network, they can subpoena the cell phone company to provide both the name and address associated with that account, and much more identifying information. And again, as a public defender, I know firsthand how difficult it can be to access that information, and other information related to cell phone use, from the other side. For instance, I’ve had an experience where we subpoena Snapchat, and Snapchat will not answer us. And if they’re forced to give a response, they’ll say, “We can’t provide that information because it’s the individual user’s information, and you should get that from them.” It’s really a power imbalance, right? The corporations themselves can choose when they want to respond and to whom they want to respond. And I think it’s a bit of a narrative that we don’t hear too often. I did not expect that going into my work, thinking that the company won’t respond to me but will respond to law enforcement. And I guess I’m wondering, do you think corporations are intentionally hiding that reality, that there is that power imbalance? They’ll choose who they want to respond to, and if people aren’t going to make a fuss about it, it doesn’t matter.
Jessenia Class: Yeah, I think that’s right. And I think it’s probably a mix of both: corporations using the powerful tools they have at their disposal to intentionally hide things or place documents under seal, and using the army of lawyers they have at their disposal to find any tool in the toolbox to shield them from liability. It’s both that and also the fact that people aren’t aware that these things exist. People don’t know enough about these structures to raise the alarm and gain access to their own information. And that’s why I think some of the things I talk about in the article matter; for example, the ACLU of DC puts together a Know Your Rights packet to inform organizers about technology that might impact them at protests. I think it’s so important to share that information with people and to bridge the gap, so that even if it’s only a little bit, some of that power imbalance is being bridged.
Sam Perri: I 100% agree with you. I mean, I was really locked into your article, honestly, from the beginning. But throughout reading, and also in preparing for this episode, I was trying to think: if there were someone reading this article who is a bit more skeptical about the narrative and about the stories being shared through it, what would they be thinking? I was really trying to get myself into that headspace. And so I got myself to the starting point of, okay, if I were that person, what are my values? What things are top of mind? Easily, my mind went to efficiency, right? That’s like the quintessential goal of a corporation. Well, they’ll say that, right? “We want to do things quickly and effectively, and that’s what we should aim for.” And I think about technologies like SpotterEdu, which we talked a bit about earlier, and how it tracks students’ location and attendance in class. If I’m someone who is efficiency oriented, then I’m thinking, this is great, right? We’re tracking students’ attendance through technology so professors can spend more time in the classroom focusing on material. That’s awesome. Everyone’s getting their money’s worth. Great. But it’s also terrifying, at least to me. So I guess I’m just wondering, how do we combat these efficiency-oriented stories? How do we combat the views of people who start the article a bit skeptical and maybe, at the end, are still a little bit skeptical? How are we talking with folks who are in that efficiency-oriented mindset about these issues?
Jessenia Class: I think, for me, it all comes down to storytelling. We need to change the narrative, and we need to challenge the legitimacy of these practices. If people are making an efficiency-based argument, we need to challenge that by reorienting the conversation around the impact of these kinds of technologies, or talking about the issues underlying them. Some organizations take this tactic and some of them sidestep it. I’ve seen a lot of advocacy methods that have focused less on the narrative and more on different procedural ways to challenge these practices. For example, some organizations have focused on due process issues or privacy issues. And I think that can be effective, but I think it can only truly be effective if it’s coupled with a media strategy that talks about how this is hurting people in real life, and how that really matters.
Reya Singh: I feel like with this surveillance, it seems like the students are having their power taken away, and it gives them a lot to be frightened about. And I share those feelings of being scared. But it also does seem like there’s hope. There were plenty of people we saw from the protests on campus, throughout the encampments, who, despite administrators saying you’re going to get your degrees taken away, you’re going to lose this right, you’re going to lose that right, so strongly believe in what they fight for. And I think that’s beautiful, that they are willing to deal with those consequences for what they know is right. But what sense did you get when you were talking to people and learning about their perspectives?
Jessenia Class: Yeah. I felt really similarly. I was surprised and heartened to see students who were undeterred in their commitment to organizing and pushing for change through activism. These technological tools are really scary, and fighting against a system and asking for it to change is also really scary. And I want to do that in my career. Sometimes it’s really overwhelming, especially in this day and age, when you look at the courts and you see that they’re not doing anything remotely hopeful. It’s very scary. But to see people facing real consequences, being subjected to surveillance technology, and even still doubling down in their commitment to their cause and continuing to push forward for the change they want to see in these systems and in the world, that’s inspiring. That motivates me. And I take that with me as a reminder to stay committed to the changes I want to see in this world, and it motivates me in my career.
Reya Singh: That’s beautiful. I think it gives a lot of hope. But for our listeners who are interested in this topic, what resources or further reading would you recommend?
Jessenia Class: I would recommend checking out some of the great work that people are doing, like the Surveillance Technology Oversight Project, or STOP, which does a lot of work on this front. Also the Electronic Frontier Foundation; they’ve done some great research into these kinds of questions. There’s also a book, The Age of Surveillance Capitalism by Shoshana Zuboff. It’s a massive book, like 700 pages, but it’s amazing. It’s so detailed, and she dives into a lot of the themes that I touch on in this article, but does it much better than I can.
Reya Singh: I think everyone at Flaw School, and hopefully everyone listening, can agree that you are someone who gives us hope. But I would like to know what gives you hope.
Jessenia Class: What I’d love for people to take away from this article is a reminder that they have power. You are an actor with a lot of agency to push your systems in the ways that you want them to change. It’s hard, and they have a lot of power over you, and it’s very, very difficult. But when you band together with people, and you have conversations and you connect over shared interests and shared causes, that is the catalyst to change, I believe. And I really hope that people feel empowered to do that with their own causes, in their own communities, and make the change that they’re interested in seeing in the world.
Sam Perri: I hate to say this, but that is all the time we have for today. Jessenia, thank you so much for joining us on Flaw School. We are so happy to talk with you any day about how corporations are wreaking havoc on society.
Jessenia Class: Thanks so much for having me. It was a pleasure.
Reya Singh: I also want to say it was a wonderful conversation, super, super insightful, and it left me with a feeling of hope, as someone in undergrad who is very much dealing with these things right now. I think a lot of people will be touched by this conversation. So thank you, Jessenia.
Sam Perri: And thank you Reya for being such an awesome co-host. It was great to do this session with you. Looking forward to more conversations about corporations wreaking havoc on society with both of you moving forward.
Reya Singh: Absolutely. I adore you, Sam, and thank you for being an amazing co-host. For everyone else, if you’re interested in reading Jessenia’s full article or learning more about the flaws in our legal system, check out the full magazine at theflaw.org.
Sam Perri: If you enjoyed this episode, first make sure to check out the show notes. There’ll be some awesome links and references there. And make sure to subscribe to our podcast wherever you listen to your podcasts. You can also check out flawschool.org for more content. Thank you all for listening. Looking forward to talking with you in the future. Class is dismissed.