Freedom Unfinished
E2: Rise of the Surveillance State
Our most personal and intimate information, living online, is now a commodity to be bought and sold without our knowledge or consent. Given this new reality, how do we ensure that we are not victimized by those who profit from holding something of such value, namely, our most intimate and personal information?
Get the answer to this question and more by listening to ACLUM executive director Carol Rose and Technology for Liberty program director Kade Crockford explore big data and artificial intelligence through the lens of power, democracy and the broken systems that will determine the future of our rights.
Join us this season wherever you get your podcasts and follow the ACLU of Massachusetts on social media @ACLU_Mass for the latest updates on Freedom Unfinished, Season 1: Decoding Oppression.
Thank you to Trent Toner and the Rian/Hunter Production team for sound mixing this episode.
Newsreel #1 (00:01):
Today, our fellow citizens, our way of life, our very freedom came under attack.
Paromita Shah (00:08):
The impact on immigrant communities in Massachusetts was immediate, especially on South Asian communities, especially on Muslim communities.
Newsreel #1 (00:19):
These acts of mass murder were intended to frighten our nation into chaos and retreat.
Paromita Shah (00:24):
And I was in Boston at the time, and the planes had left from there. I'm South Asian. And I realized at that point that people who look like me were very vulnerable. That's where, like, our day-to-day reality changed.
Newsreel #2 (00:41):
We're under attack. And from your vantage point, yeah, clearly difficult to tell. We're under attack.
Newsreel #3 (00:47):
God, it's right in the middle of the building.
Paromita Shah (00:52):
The day that 9/11 happened, I was concerned for my brother. I was concerned for my partner. I was concerned about whether we were going to be profiled on the streets, whether people would call the police. And that was very chilling, to see the panic. It was really a very clear indication of how profiling and racism can actually be institutionalized very quickly after a major event. How long could I stand and look at a building, being very aware that when we walk through airports we could be stopped and that we would end up on no-fly lists.
Carol Rose (01:32):
The world and the American public reeled with shock and devastation. US government officials quickly acted to fund and then expand a range of secret and often unconstitutional government interrogation and surveillance programs. Congress created the Department of Homeland Security, which immediately became one of the largest federal bureaucracies in the country. Among other changes, immigration enforcement was moved out of the Justice Department into an organization that had fighting terrorism as its central mission.
Carol Rose (02:03):
Congress also created three new agencies tasked with immigration and border enforcement: Customs and Border Protection; Immigration and Customs Enforcement, or ICE; and US Citizenship and Immigration Services. In theory, these actions were supposed to protect the United States from another attack. Instead, they created a vast surveillance and enforcement apparatus that remains with us to this day. One program launched in 2002 targeted thousands of people from South Asian and Arab countries, as well as people of the Muslim faith, for surveillance, tracking, and detention. And while US government officials said these programs were intended to protect the public, for people like Paromita Shah, these actions felt like the opposite of protection. Instead, it felt like she was being targeted, not because of anything she'd done, but rather because of who she was. I'm Carol Rose, executive director of the ACLU of Massachusetts.
Kade Crockford (03:02):
And I'm Kade Crockford, director of the ACLU of Massachusetts' Technology for Liberty program.
Carol Rose (03:08):
This is Freedom Unfinished. You're listening to episode two: Rise of the Surveillance State.
Paromita Shah (03:30):
Policing and immigration enforcement have always used the age of information to build lists. And we have always known that list building is what we are most fearful of when we think about how people get targeted and tracked. Before 9/11, I was a brand-new immigration attorney. After I started doing detention work in Virginia, I met people in detention centers where neighbors had called immigration based on angry statements made at television screens, people who were angry about the war in Iraq. But they felt like that was possibly a sign of treasonous intent, and so they called ICE on neighbors, on people that they had known.
Carol Rose (04:17):
This sort of unchecked surveillance and tracking, and the systemic problems posed by dragnet policing and deportation, shaped Paromita's journey toward becoming a leader in the immigrant rights movement. She now leads Just Futures Law, an organization that specializes in combating immigration detention, enforcement, and criminalization. Paromita works to free people unfairly put in detention, which often happens during times of public distress and panic. A 2002 domestic call-in registration program, known as "special registration," became notorious for rounding up and wrongfully detaining thousands of people.
Paromita Shah (04:55):
The racism and xenophobia that erupted after 9/11 really, I think, showcased who's on a list, who stays on a list, and specifically through a program called special registration, which was a program that came up right after 9/11. It was a program that said that boys after the age of 16 and men from certain Middle Eastern countries had to register with the government and had to continuously register with the government while they were in the United States. So, let's say you are driving and you're arrested, let's say for speeding, really speeding, and you're booked. They usually take your fingerprints. Those fingerprints start with the local police department. Then they go to the FBI through Secure Communities. They go to the Department of Homeland Security. And once those fingerprints went to the Department of Homeland Security... We had a question at the time: why do we need this program?
Paromita Shah (05:54):
But then the reality set in, and the reality was that so many more people were being arrested by ICE, Immigration and Customs Enforcement, which is a sub-agency of DHS. And this massive uptick in deportations could only be explained by this sharing of fingerprint data. And we realized that the program itself had features that were just frankly blatantly unconstitutional, but that the scale of that fingerprint collection was not really doing anything but allowing police to profile you and DHS to pick you up. It is easy to jail a black person. It is easy to jail migrants. The similarities between the criminal legal system and the immigration system are very clear. They are about treating people in those systems as disposable by ensuring that it's easy to deport, just like it's easy to arrest a black person. And it has its roots, I think, in mass incarceration and systemic racism that have been there since the time that this country was founded, and frankly, with slavery.
Paromita Shah (07:10):
And I think it's really troubling to me that we can't see these connections as often as we need to, so that we can end the use of prisons and the use of immigration detention, to me, which look very much the same from my vantage point. I think more people have begun to see what's happening in these detention centers, the abuses, the violations of civil rights, the treatment, the lack of accountability, the lengthy stays in detention. These were all things that we were trying to expose 15 plus years ago, but I think are more in the public's eye now.
Paromita Shah (07:54):
We face a two-part problem, which is that individuals themselves, especially black and brown folk, they do not have control over their data. I want to be specific about what I mean by big data. It's your DNA, right? It could be your face. It could be your fingerprints, your location data, your professional role, your religious affiliation, your banking information, your family connections or friendships, your romantic partnerships, your political views, where you travel, who you know. And then secondly, there is a data system out of control. And data is probably the most powerful commodity that is being used against immigrants. And all of that data is now going into the hands of corporations and government. And just incidentally, the World Economic Forum estimates that data is valued at $25 trillion worldwide.
Carol Rose (08:59):
Many economists have deemed data the "new oil." In fact, in 2017, data surpassed oil in terms of monetary value. And based on Paromita's description of data, this includes just about anything: our fingerprints, our faces, our location, where we work, where we pray, where we live, where we seek healthcare and why, as well as whom we date, where we bank, and whom we associate with. In the big data economy, our DNA itself becomes worth more than oil, previously the world's most commodified resource. Our personal and intimate information is now a commodity to be bought and sold without our knowledge or consent. How do we ensure that we're not victimized by those who profit from holding something of such value, namely our most intimate and personal information? To make things worse, this type of information is now being used by governments to surveil and target people in historically marginalized communities.
Paromita Shah (10:01):
The systemic problems posed by policing and deportation have just eviscerated our families and eviscerated communities, and frankly destroyed our lives. And so we don't feel like we need to invest in those systems anymore. We want to build other things.
Carol Rose (10:28):
In episode one, we talked about the idea that oppression often masks itself as government action in the name of public safety. And we've been watching this expand since 9/11 on an unprecedented scale, with biometric surveillance and other forms of dragnet surveillance and tracking. It's hard not to wonder if the so-called benefits outweigh the drawbacks. Is there any way to justify the invasion of privacy, even in public spaces, when we see such disproportionately negative impacts both on individual liberty and on the black and brown communities that are most often targeted and exploited?
Woody Hartzog (11:03):
Privacy is about power and the ways in which, when companies have that information and convert it into data, which is easily stored, processed, analyzed, and shared, they gain leverage over individuals and collective bodies of people in ways that are hard to notice at the individual level.
Carol Rose (11:25):
That's Woodrow Hartzog, professor of law and computer science at Northeastern University here in Boston. In addition to teaching, Woody is a renowned author and an influential expert on privacy law and robotics.
Woody Hartzog (11:39):
So, one of the main areas of my research is thinking about how the old rules of privacy, which were largely based around sort of direct government surveillance, like wiretapping, and the media sharing embarrassing facts about people, how that old way of thinking about privacy doesn't really fit the modern surveillance data capitalist economy. And when you think about privacy as power, you can see that a sort of cycle gets created where companies offer this benefit. People analyze it on the individual level, like, what's in it for me? Oh, a free app or a slightly more convenient experience.
Carol Rose (12:24):
We've already seen this play out in our conversations here on the podcast, with the saying, "If you're not paying for the product, then you are the product." But in the context of privacy, tech companies often lure users with convenience and then promise to resolve any privacy complications by giving that individual user control of how and what information the company will collect. Or so they say.
Woody Hartzog (12:47):
I have seen this time and time again in congressional testimony and marketing materials. And it's not just the tech CEOs. If you were to ask many people in government what the goal of information privacy law is, they will tell you, "We want to put people in charge of their informational destiny. We want to give them control over personal information." And if you look at the laws, this is bearing itself out. The California Consumer Privacy Act, which many people have heard of outside of California, certainly most people within California have heard of, is essentially an entire set of rules based around the idea that you should be in control. You should be fully notified about what companies are doing with the information and you should have the ability to control it. And this, I think, is a misguided approach to privacy in the modern era. We think we have it, but what is the way in which control is manifested on the internet? It's through user interfaces.
Carol Rose (13:50):
And by giving us what seems like control over our personal privacy, we, the users, feel empowered, both in our sense of control and in our relationship with the platform. Again, that's how it seems, but perhaps that's not the way it actually is.
Woody Hartzog (14:04):
Think about the last time you logged onto a website or an app, and you went to your privacy settings. Were you able to fill out a text box that says, "Dear Facebook, I would not like for you to use any of my personal information in any way. Don't show me ads and don't target me and don't collect my geolocation"? And then you were able to press enter, and Facebook would be like, "Sure, we'll respect that." That would be meaningful control, but of course, that's not what happens. Instead, what happens is we are given an overwhelming array of little knobs and buttons and drop-down menus to choose from, and we only get to select the choices that were pre-scripted for us, right? In other words, any decision that we make is totally fine with the technology companies, because they're the ones that created the decisions, right?
Woody Hartzog (14:56):
They're going to get whatever they want, no matter what. And so control is illusory. The second problem with control is that it's overwhelming. The idea behind control is that you will have some sort of agency in your decision, that your autonomy would be respected. So, we see this happen, for example, when you open up Google Maps or Pokémon Go, and the little box pops up that says, "Can Google Maps use your geolocation?" And you say, "Yes, I agree to this happening, and you should use it." And you've given your permission. You have control over your information. Privacy has been respected. Everything is good, right? Wrong, because that only looks at that decision in isolation. But that's just one data point for one app, and we use tens to hundreds of apps every single day that collect hundreds of different kinds of information. And so if we were to really meaningfully have control over our personal information, the only thing we would do all day is give permission to apps to use our personal data.
Carol Rose (16:13):
So we, the users, have the illusion of control over our privacy and the power to control how our data is used, but in reality, there was never any meaningful control. And that's not all. Here's how our misconceptions about privacy feed into a larger system of social control that we don't even see.
Woody Hartzog (16:32):
And the third reason that control is the wrong way to think about privacy is that it is myopic. When people exercise control over their personal information, they're not thinking about how this information is going to be used to train an algorithmic model that will be harmful for other groups of people, perhaps people that are more vulnerable than you are, right? We know that AI is biased against people of color, generally, particularly facial recognition algorithms. We know that people of color feel the brunt of surveillance-based algorithmic products first and hardest. But when someone comes to you and says, "Can we collect your personal information to train our models or whatever?" a lot of people probably don't think about that. They think instead, what's going to go wrong for me individually, right? And what we get then is the overall result of millions of self-motivated decisions. Not what's best for society, not what's best for our democracy, not what's best for equality, but what's in it for me.
Woody Hartzog (17:44):
And that normalizes all kinds of practices that ultimately are corrosive to our collective wellbeing. And so I would just say that for all those reasons that control is illusory, it's overwhelming and it's myopic, it is the wrong way to think about privacy, and it's the wrong version of privacy to build our rules around. And unfortunately, that's what we've done.
Carol Rose (18:13):
Mistakes have been made, to say the least, in policy and lawmakers' approach to digital privacy. In part, that's because policies and laws too often are enacted only as a result of a panic or a tragedy. And with surveillance technology and its applications, we've already seen how one day in 2001 changed the fundamental direction of our approach to surveillance and privacy.
Woody Hartzog (18:36):
I was in law school on September the 11th, 2001, and literally physically in the building and saw the towers come down with my classmates. And like most people, it fundamentally affected me. And what really started to affect me was what came after as well, which was the USA PATRIOT Act and the incredibly broadened surveillance powers of the government that, in my opinion, were being used in both unjustified and oppressive ways, particularly against vulnerable people. And that's when I decided that I wanted to do privacy as a career, because one of the things that I've come to believe, and one of the narratives that I'm hoping to fight, is that technology is not inevitable. Technologies are created by people, laws are created by people, and we can absolutely mitigate those harms and reverse the trends. And there are technologies that, with meaningful public support, can change almost overnight. We know that there are ways to improve technologies and improve the rules that we have for those technologies.
Carol Rose (19:50):
That's one of the unspoken truths about technology and law. At the end of the day, technology isn't really to blame. And it's really up to us, we, the people, to insist that businesses and governments regulate how we use technology to enhance rather than diminish liberty. And to do that, we need to bust the myths that arise around new technology.
Woody Hartzog (20:12):
There is another misconception about facial recognition technologies that I hear a lot, not just in industry, but also in legislative circles and law enforcement circles, and even among people that aren't thinking about this 24/7 that I talk to just socially, which is that the reason facial recognition technology is bad is because it is inaccurate. Now, for sure it is not great when you deploy a tool and it only works 30% of the time, right? If it messes up more than it works effectively, depending on the error rates, then it's worthless, right? And we also know that it's biased against people of color, it's biased along gender lines, that there are lots of different reasons to be concerned about the accuracy of facial recognition technologies. But here's the misconception. Just because it gets more accurate doesn't mean the problems of facial recognition technology will be diminished.
Woody Hartzog (21:17):
In fact, the more accurate it gets, the more dangerous it becomes. And that's because law enforcement and industry and everyday people who want to harass or spy on others will begin to use it more. Because at base, facial recognition technology is a tool of control. It is a tool that people use to control other people, to keep them from doing something that they don't want them to do, or to make them do something that they want them to do. That's ultimately what facial surveillance is deployed for. And the more accurate it gets, the more oppressive it will be, particularly for vulnerable communities.
Carol Rose (21:59):
That's scary stuff. And it's important to recognize why it is so unsettling. But as with any civil rights or civil liberties issue, the impact goes far beyond one person, which is one reason it's so frustrating when lawmakers and tech companies insist that it's enough to put privacy controls into the hands of every individual user without taking into account the systemic impact of those individual decisions. It makes you wonder: how do we collectively gain any real control here?
Woody Hartzog (22:29):
The short answer is the law has to get serious about holding companies accountable and keep them from foisting all of the responsibility for protection onto individuals, because this is the framework that we have now, not just in data security, but also in privacy. And it's fundamentally misguided, and it's not working. We need lawmakers to make sure that people are protected no matter what they choose, whether they click the "I agree" button, whether they accidentally click on that link, whether they don't have a full understanding of exactly how the particular sort of data sharing settings work. We need a much more rigorous approach to holding companies accountable. And there are best practices. We actually have a general set of rules in privacy and data protection that's referred to as the Fair Information Practices. And there are tons of data security standards that, if adhered to robustly, would also work really well, but the law is not holding all of the actors accountable.
Woody Hartzog (23:38):
It focuses on the breached entity. It focuses on things like consent. And finally, we need an embrace of human values in a way that would require companies to consider what's best for people when they make these decisions. Because right now, the rules are such that if you consent to it, or if you follow a sort of bare-bones set of checklists, as a company, you can do it. There's no requirement that you act in the best interests of the people who entrusted you with their personal information. I've advocated strongly to ban facial recognition outright. I've worked with legislatures at the state and federal level to try to implement a duty of data loyalty. I worry about the ways in which these technologies are going to be used against marginalized populations, people of color, members of the LGBTQ+ community.
Woody Hartzog (24:42):
We know that they will feel the brunt of these technologies first and hardest. And unless we act soon, what's going to happen is what always has been happening, which is there will be outrage for a little while, and then government and industry will essentially outlast it. And then they'll just impose the thing they always wanted to impose in the first place.
Carol Rose (25:05):
By normalizing facial recognition with convenience features like facial scanning to unlock a smartphone, these companies are hoping that we won't notice that they're infringing on our rights. But we have noticed, and people like Woody Hartzog are here doing work to make sure we don't lose sight of the core issues, which is to say, it's not a hopeless situation.
Woody Hartzog (25:27):
If we become aware of the role of normalization and we become committed to ensuring that abuses of power from data don't happen, we can still change this. I remain optimistic, even though I come across as pessimistic, probably here and to my students and everyone, but we can do this. There are signs of hope that we see, I think, all over. And I'll continue to hold onto that as we move forward.
Carol Rose (25:56):
So, we can do this. Right, Kade?
Kade Crockford (26:00):
Yeah, I believe we can. And in the ACLU's work, one of the central issues that we deal with on matters related to technology and civil rights is research. We have to uncover those oppressive uses of technology, and then we have to inform people about them, to make folks aware of the current threats to civil rights and civil liberties.
Carol Rose (26:23):
Yeah, but some people read the headlines about corruption and privacy threats and other actions by bad actors as just maybe another drop in the deluge of negative news. It can feel overwhelming.
Kade Crockford (26:36):
Yeah, no question. We don't want people to just read the headlines and think, "Oh, this is absolutely despairing. There's nothing we can do about this," the kind of, like, privacy is dead, oh well, shrug and move on approach. Absolutely not. It is a tough line to walk, but the way that we address that problem is not only by uncovering the information, but by having an organized coalition behind it so that we can ensure we address the problem in the law. Some years ago, in 2018, we were preparing to launch our Press Pause on Face Surveillance campaign. And we realized that one of the key problems was that we didn't know what was going on with government use of facial recognition technology in Massachusetts, and that's because the police don't exactly advertise publicly that they're using novel and invasive surveillance technologies.
Kade Crockford (27:30):
So, one of the first things that we did was file a public records request. Our colleague Emiliano [inaudible 00:27:37] was in charge of that and filed hundreds, literally hundreds, of public records requests with police departments across the state, in cities and towns of all sizes, asking for information, essentially, about those government agencies' use of facial recognition technology, but not just that, also asking for information about how companies have marketed those technologies to those police departments. And we found some pretty astonishing information, particularly in a community that some people may be familiar with, Plymouth, Massachusetts, the "Plymouth Rock landed on us" Plymouth, Massachusetts. And it's like a town of 50 or 60,000 people.
Kade Crockford (28:21):
The police chief there, we discovered, thanks to this public records request, had been corresponding for a while with a surveillance technology company owner, this guy named Jacob Sniff, who established a company called Suspect Technologies. Jacob Sniff and this police chief, Michael Botieri, had been talking for years and were very close to launching a face surveillance initiative in Plymouth that would have resulted in the use of Suspect Technologies' highly inaccurate facial recognition technology in real time to analyze live data from the surveillance cameras in the lobbies of the public school, the library, the police department, and other town government buildings.
Kade Crockford (29:10):
And the plan was to put so-called suspicious people or wanted persons' images into this system, and to have that system alert the police department when those people walked inside a public space. This was despite the fact that Jacob Sniff was acknowledging, in these communications, that the technology would likely produce numerous false positives and a number of errors. Not only that, Jacob Sniff knew that in order to really beef up his company's capabilities, he would need access to a government database that contains the names of people and their personal information connected to their faces. He was trying to get the police chief to help him convince the Registry of Motor Vehicles in Massachusetts, the state agency that does licensing and state identification, to hand over the state's license database to this private company, and the chief of police was trying to help him. Now, thankfully, the records showed that the RMV told this guy to go kick rocks: "No, we're not going to hand over the state's driver's license database to this random guy who has this small surveillance technology company."
Kade Crockford (30:25):
But that's the kind of thing that was happening behind closed doors before we discovered it. So when we discovered that, we gave those materials to a reporter, Joseph Cox at Vice News. He published a story about it. And when he called the police chief and said, "I have these records that the ACLU gave me showing that you're eager to basically let this company train their algorithm using data from people of Plymouth without their knowledge or consent. This technology, the guy who is trying to sell it to you is admitting that it doesn't work very well. What are you thinking here? Have you gotten permission from the mayor? Have you talked to members of the public about this?" The response was, almost immediately, "Actually, we're not going to do that." That was one of the key stories that animated the launch of our campaign in 2019. And in the intervening period between then and now, we've been really successful in putting a stop to police use of face surveillance technology in many Massachusetts communities, and in regulating it in others.
Kade Crockford (31:28):
It is also the case, though, that by obtaining information about what's going on, by disclosing that information, and by doing so within an organized and orchestrated campaign backed by dozens of civil rights and civil liberties organizations, we can make a real change.
Carol Rose (31:49):
Fortunately, there are a handful of policymakers paying attention to the rise of the surveillance state. Massachusetts' own Senator Ed Markey recently sat down with me to talk about his efforts to keep privacy-violating technology in check. Senator, thank you for joining us today.
Senator Ed Markey (32:07):
Well, it's great to be with you, Carol.
Carol Rose (32:10):
Let's start with a quick overview, as you see it, of the legal landscape of surveillance and privacy policy in the United States.
Senator Ed Markey (32:17):
As I got to Congress and I could see the evolution of technology breaking down all of these boundaries in people's lives, I naturally gravitated towards those issues. The truth is that there is a... Well, there's a duality to these technologies. It's a tale of two technologies. They can enable and ennoble, or they can degrade and debase. The companies tout all the good things, but ultimately, it's going to come to the government to make sure that we police, for our families, all the bad things that can happen. Privacy protections in America are... They're behind at times. I have worked hard, over the years, to ensure that there would be a way of policing the use of information. Back in 1999, I actually created the Congressional Privacy Caucus so that we could focus on these privacy issues, these invasions of information that have occurred since the dawn of the internet. And each year Americans experience more and more threats to their privacy. Because of new technologies, there is a way to penetrate into the inner sanctum of people's homes, into people's lives.
Carol Rose (33:36):
Some of our guests so far have talked about the normalization of invasive technologies and how that process happens over time. Since you've been legislating on privacy since before the internet reshaped the world, how would you compare the situation today with that of the past?
Senator Ed Markey (33:52):
In the old days, when people just had a black rotary dial phone, no one knew who called you. No one knew what you said on the phone. And there was kind of a zone of privacy that everyone felt that they had in our country. And the police couldn't get into your home phone unless they got a legally obtained warrant from a judge. Well, in this new modern era, there is a capacity, both by the private sector and by the government, to be able to break in, unbeknownst to an individual, and to just be consuming all of your data for their purposes, and to do so in an indiscriminate way. And every year, Americans experience more and more threats to their privacy. From airports to schools to shopping centers, data collection technologies like facial recognition tools are creeping into so many parts of American life at an alarming pace.
Carol Rose (34:51):
So right now, what do you see as the biggest obstacles to modernizing electronic privacy laws in Congress?
Senator Ed Markey (34:57):
Well, again, I think it's twofold. One, the business model of the internet is compromising the privacy of every single one of us. And on the other hand, the government wants to have an unfettered or very loosely regulated right to be able to break in, to gather information about individuals. So right now, the American people don't have a federal law on the books that protects the public from the privacy threats of facial recognition technologies, for example, used by the government or by law enforcement. So, we have a gaping hole in the law, and an egregious lack of enforceable privacy protections. And in the meantime, these technologies are being used more and more every single day.
Carol Rose (35:48):
I'm so glad you brought that up. And thinking about where the legislation originated, is it connected to the ACLU's public test of Amazon's facial recognition technology?
Senator Ed Markey (35:58):
Well, I was concerned about this technology, but then on July 28th, 2018, the New York Times had a big story. And it leads by saying that the facial recognition technology made by Amazon, which is being used by some police departments and other organizations, incorrectly matched lawmakers, including Representative John Lewis of Georgia and Representative Bobby Rush of Illinois, both Democrats, members of the Congressional Black Caucus, and civil rights leaders, with people who had been charged with a crime. Well, of the 28 people out of the 535 members of Congress who were misidentified, I was one of them. Clearly, members of Congress who were black and brown were disproportionately identified, and that's a big problem, but they had even misidentified a Caucasian Irish American from Boston.
Carol Rose (36:53):
Right. So, there's a disproportionate impact on people from historically marginalized communities, and yet the threats potentially affect all of us. So in your opinion, Senator, what are some of these threats, especially regarding civil rights and civil liberties?
Senator Ed Markey (37:09):
Very simply, these technologies could eliminate public anonymity in the United States forever. They are capable of fundamentally dismantling Americans' expectation that they can move, assemble, or simply appear in public without being identified or misidentified. In 2018, again, I was misidentified, along with John Lewis, along with Congressman Bobby Rush. It can happen to anyone, but especially black and brown citizens of the United States and immigrants to our country. We have to be laser focused on how big tech's behavior disproportionately harms communities of color. Self-regulation is a failure. We have to pass laws. That's what I'm working to do. I'm going to work very hard, in the coming months, to pass my facial recognition legislation, along with other privacy protections. I am very proud to be leading the fight in Congress to put a stop to this invasive and racist technology.
Carol Rose (38:24):
It's great to hear that at least some lawmakers understand that we need to pass new laws to protect us from the threats posed by surveillance technology. And as Frederick Douglass said, "Power concedes nothing without a demand." So, I think we need to create new ways for voters to really understand the threats here and to feel empowered to demand that our elected officials actually take steps to pass laws to protect us from the threat of this growing surveillance state. Kade, can you say something about what we're doing here in Massachusetts?
Kade Crockford (38:56):
As we discussed earlier in the show, the revelations that we uncovered about what was going on behind the scenes in Plymouth were really helpful in mobilizing activists and organizations to join our campaign. State lawmakers pay attention to what goes on in their own backyards. A lot of city councilors and mayors eventually become state level public officials. So, there's a lot of attention that state lawmakers pay to what their own city councils and town governments are doing. And something like banning facial recognition, I think, caused enough of a stir and sort of snapped state lawmakers to attention. And I think people across the country should take note of that lesson too, that local action, which is much easier to accomplish, in many cases, than getting something moved at the state level, can have that ripple effect, can give state lawmakers the courage, and show them that it's possible to do something really meaningful on issues at the intersection of technology and civil rights. As Senator Markey said, the burden of who is harmed by police use of face surveillance technology is not borne equally either.
Kade Crockford (40:08):
That's because face surveillance in the hands of police is just another explosively intrusive and invasive tool that's being used and will be used disproportionately to police black and brown people and communities. If you live in a leafy, white, wealthy suburb, you probably don't have a surveillance camera on every corner in your neighborhood. But if, like me, you live in a predominantly black or Latin neighborhood in a city like Boston, there are police surveillance cameras everywhere. And so that's just one example of how, if we don't stop it, face surveillance technology will be applied differently to different communities, just like every other surveillance tool in the hands of the government. And honestly, that's to say nothing of the bias and the errors that can be baked into these technologies themselves.
Carol Rose (41:02):
So, join us next time on Freedom Unfinished, where we'll explore the disproportionate impact that AI has on people from black and brown communities, the human biases that are built into algorithms that drive modern technology, and the tangible human effects of technology on all of us. Until then, I'm Carol Rose.
Kade Crockford (41:21):
And I'm Kade Crockford.
Carol Rose (41:23):
And this is season one of the ACLU of Massachusetts' Freedom Unfinished: Decoding Oppression.
Carol Rose (41:33):
Freedom Unfinished is a joint production of the ACLU of Massachusetts and Gusto, a Matter company, hosted by me, Carol Rose, and my colleague at the ACLU of Massachusetts' Technology for Liberty program, Kade Crockford. Our producer is Jeanette Harris-Courts, with support from David Riemer and Beth York. Shaw Flick helped us develop and write the podcast, while Mandy Lawson and Jeanette Harris-Courts put it all together. Art and audiograms by Kyle Faneuff. And our theme music was composed by Ivanna Cuesta Gonzalez, who came to us from the Institute for Jazz and Gender Justice at Berklee College of Music. We couldn't have done this without the support of John Ward, Rose Aleman, Tim Bradley, Larry Carpman, Sam Spencer, and the board of directors here at the ACLU of Massachusetts, as well as our national and state ACLU affiliates.
Carol Rose (42:24):
Find and follow all of season one of Freedom Unfinished: Decoding Oppression, wherever you get your podcasts, and keep the conversation going with us on social. Thanks to all of our guests and contributors. And thanks to you for taking the time to listen. It's not too late to mobilize our collective willingness to act and to ensure that technology is used to enhance rather than diminish freedom. See the show notes to discover ways to get involved. And always remember to vote, and not just nationally, but locally too. Together, we can do this.