Freedom Unfinished
E1: Private Sector Data Economy
In the first episode of Freedom Unfinished, we ask what is "big data" and how is information fed into the data capitalist economy? How have private industries acquired all this personal data and what are they currently doing with it? How has a rush to "turn the information into insight" set the stage for sobering lessons about the limits of these tech-powered conclusions?
Get the answers to these questions and more by listening to ACLUM executive director Carol Rose and Technology for Liberty program director Kade Crockford explore big data and artificial intelligence through the lens of power, democracy and the broken systems that will determine the future of our rights.
Join us this season wherever you get your podcasts and follow the ACLU of Massachusetts on social media @ACLU_Mass for the latest updates on Freedom Unfinished, Season 1: Decoding Oppression.
Thank you to Trent Toner and the Rian/Hunter Production team for sound mixing this episode.
Carol Rose (00:03):
We launched the Technology for Liberty program at the ACLU of Massachusetts back in 2003. It was a response to the sudden and really unchecked expansion of government surveillance and police powers in the immediate aftermath of the 9/11 attacks. At the time, the internet was actually still in its infancy, but we really quickly saw this development of new surveillance technologies and data mining by the government as computing power and speed increased exponentially, and the price of data storage dropped dramatically.
Carol Rose (00:32):
At the same time that we saw this historic expansion of the national surveillance infrastructure at the federal level, we also saw it at the state and local level. The newly created Department of Homeland Security and similar new federal agencies were suddenly transferring billions of dollars in federal money to state and local police in exchange for their cooperation in tracking more and more people. And it soon became clear to us that not only was big brother watching us, but so was little brother. And these private entities, these tech companies, they weren't subject to the same kinds of checks and balances as government entities. And yet, they soon began to collaborate with, and actually profit wildly from, government agencies that were paying them to use technology and data to track and control people.
Carol Rose (01:18):
We know throughout history that government abuses of power often start by targeting people who historically lack power. And this time is really no different. Most recently, the Supreme Court's decision to overturn Roe v. Wade after 50 years creates yet another threat, since data surveillance left unchecked can be used to track and criminalize people who are seeking reproductive healthcare as well as other forms of healthcare. The expansion of data tracking also deepens discrimination against people and communities who historically have less power to fight back. They're the canaries in the coal mine. But the threat to civil rights and civil liberties actually endangers all of us, and it threatens our democracy. With the rise of artificial intelligence and machine learning without any adequate checks and balances, the technology now threatens humanity itself.
Carol Rose (02:06):
As the nation's oldest civil rights and civil liberties organization, the ACLU is sort of uniquely poised to respond to many of these threats. We've spent more than a hundred years fighting in the courts, in Congress, and state legislatures, online and in the streets to protect human rights and democracy. Our fundamental mission hasn't changed even as the threats posed by big tech and government surveillance have grown. Our Technology for Liberty program brings together technology experts with legal advocates and the power of ordinary people who care about democracy and liberty in an effort to ensure that technology is used to promote rather than to undermine fundamental rights and liberties.
Kade Crockford (02:50):
I'm struck listening to you tell that story, Carol, by how shockingly relevant all of this work is now in light of the criminalization of abortion in half of the states in the US. For as long as I've been doing this work, people have asked me, in so many different contexts, why privacy matters. Another variation of that is, what do I have to hide if I'm not doing anything wrong? And I think that in light of the Dobbs decision particularly, people just get it immediately. People are now realizing that what we've been saying for a long time is true. Privacy is central to human dignity. It is central to democracy. And unfortunately now, it is also central to reproductive autonomy in half of the country.
Kade Crockford (03:36):
So just as an example, police are going to be the ones investigating crimes related to abortion care. And we know, as you said, that police have unprecedented access to information about all of us today. Thankfully though, we at the ACLU and particularly in Massachusetts and some other states have been doing the work to understand exactly how police are accessing data and conducting surveillance, and strategizing to effectively fight back in the courts, in the legislatures, and in city governments. And this is the work that's at the heart of the Technology for Liberty program and is a lot of what we'll be talking about during this podcast.
Kade Crockford (04:16):
For example, when I first came to the ACLU of Massachusetts, I was looking at documents our legal department had obtained from the Commonwealth Fusion Center, which is run by the Massachusetts State Police. My mind was blown wide open, in part because I thought that I was a pretty politically aware person who followed the news and paid attention to current events, even at the local level in Boston and in Massachusetts, but I'd never heard the words "Fusion Center" before. And I had no idea up until that moment that, for five or six years at that point, the federal government had been funding, to the tune of billions of dollars, a massive expansion of the national security state down to the municipal and state level. The Commonwealth Fusion Center was part of this, one of 70-something spy centers that were established with funding from the newly established Department of Homeland Security in the early 2000s, ostensibly to address one of the recommendations in the 9/11 Commission report that state and local law enforcement be more integrated with the national intelligence architecture.
Kade Crockford (05:25):
And so that was how I started thinking about issues related to technology and civil rights, because attendant to this expansion was a huge amount of money coming from the federal government for state and local police departments to acquire new technologies, new database systems, and subscription access to all kinds of databases that would allow them to look up information about anybody, not only in Massachusetts but anywhere in the country, as well as pretty sophisticated surveillance technology enabling them to collect their own information about people in public and to analyze and share that information across local, state, and even federal government agencies.
Kade Crockford (06:12):
So my entry into this work was thinking about the expansion of the national security state down to the local level. And I just kind of became obsessed, I guess. I was so shocked that this was going on, and I decided we've got to do something about this. Nobody's paying attention to this in Massachusetts. We need to tell the press and find out more about what's going on, and we did.
Carol Rose (06:36):
It's so great to hear that story. I mean, that shows what a difference 10 years makes. During that decade, the post-9/11 era, everyone was acting out of fear instead of rationality. And so there was a dramatic expansion: the creation of the Department of Homeland Security, which didn't even exist before; the creation of ICE, Immigration and Customs Enforcement, which didn't exist before; the efforts to round up people simply for being Muslims; and the beginning of things like tracking people when they went onto the subway. This expansion of the surveillance state from 2003 to when we officially launched the program in 2013, and now, yet another decade later, the rise of data capitalism in this country, really shows how quickly technology can change our civil rights and civil liberties.
Kade Crockford (07:26):
I'm hopeful that that's what we're going to get into on this podcast, why it's so vital to civil rights and civil liberties for us to get some democratic control over how these data systems are used and how our information is manipulated both by corporations and by the government.
Carol Rose (07:43):
Knowledge is power and knowledge in the hands of the people gives the power back to the people. And that's why we want to bring this podcast to people who are listening. In the face of unprecedented threats to our democracy, Freedom Unfinished will explore with you ways in which each of us, both individually and together, can protect civil rights and civil liberties in the digital era. We'll unpack the rapid evolution of technology and how the age of information can both enhance civil liberties and also put our rights at risk. We'll ask ourselves and our guests, what can we do to ensure that technology is used in the service of liberty rather than to take rights away? Simply put, how do we protect individual rights and democracy in the digital age?
Carol Rose (08:27):
Now, a quick caveat. At any given time, it's impossible for a single piece of content to capture everything that's going on in the world of civil liberties or technology, or to encompass the entire scope and scale of our changing political landscape. But you'd be surprised by how many topics, reproductive justice being just one example, are tied to the changes in technology we all face. We've done our best to connect the dots here in this podcast, and expect updates on this feed and on social media as technology and the political landscape evolve. And so now you know why we're here and the journey that we're about to take together. I'm Carol Rose, executive director of the ACLU of Massachusetts.
Kade Crockford (09:10):
And I'm Kade Crockford, director of the ACLU of Massachusetts' Technology for Liberty program.
Carol Rose (09:16):
And welcome to Freedom Unfinished, Season 1: Decoding Oppression.
Carol Rose (09:32):
Information, technology, privacy: these are all core tenets of our current data economy, but the movement of information isn't unique to the digital age. There's a long and fluid history of information and the law, and, in the United States, of free speech and a free press. So I reached out to Martha Minow, professor and former dean at Harvard Law School, who's just written a book called Saving the News. It's about how social media impacts freedom of speech and the press, and the impact of technology in mediating our relationship with truth itself.
Martha Minow (10:05):
Hi, it's a pleasure to be here.
Carol Rose (10:08):
So Martha, connect those dots for us. Freedom of speech and the press, the internet technology, the mediation of truth.
Martha Minow (10:16):
We're living at a very challenging time on so many dimensions. And a lot of it has to do with the internet and the internet's impact on the market for advertising. The very big social media platforms are big in part because the network effect is part of their business model. The longer a person is engaged, the more ads they can sell, and so their goal is to keep a user engaged.
Carol Rose (10:48):
But between the ads, users of social media also see opinions, pictures, ideas. In some ways it feels aligned with what we'd expect from traditional news sources, ads included.
Martha Minow (11:00):
Traditional news media was never perfect, but if social media were a news service, I'd feel much more sanguine about it. It's not news. It's whatever is being put out there. And right now in this country, three quarters of Americans say that they get their information from social media, which is either their friends or somebody who's a friend of a friend.
Carol Rose (11:23):
Right. But historically, there have been regulations on news as an industry. That doesn't seem to be the case with social media platforms. Is this a free speech, First Amendment issue?
Martha Minow (11:34):
I do think that there are rights of access to information that the First Amendment recognizes. And one kind of information that none of us are really getting is transparency about how the tech companies are moderating. Why are we seeing some things and not other things? So that's a long way of saying that an awful lot of the discussion right now about what speech is protected and what isn't is not really about constitutional law. It's really about the practices of private companies.
Carol Rose (12:04):
Which goes back to the idea of the private sector data economy, and what amounts to data capitalism. Ultimately, who owns people's personal information and how are they allowed to use it?
Martha Minow (12:15):
I am not opposed to the use of data, but I am concerned about business models that are predicated on taking people's data without actually letting anyone control or see what's taken, and without giving anyone recompense for it. I think that there's a danger of a surveillance society to everyone's freedom. That includes the free press, but it also includes individual privacy. The jeopardy to freedom of the press and freedom of speech, I think, really comes from who has access to this information. And the government often gets access, because it can buy data sets, even though the data was never collected with that in mind and was never consented to or released with that in mind. That's very troubling to me.
Carol Rose (13:10):
My conversation with Martha left me with a lot to think about. I mean, we all kind of know what's going on with these tech companies. We have some sense of what's happening with our data, which means, well, we're all at least a little worried about existing in online spaces, public or private. I know I am. Where do we go from here? What comes next? From a legal perspective, legislating private companies is one thing, but when governments are active participants in the economy of data, what's the incentive to install meaningful guardrails? Who really holds the cards here?
Julia Angwin (13:44):
The companies that collect the most data about human behavior have the most power. You see that with Google, Amazon, Facebook. These companies have built the power they now hold by collecting information that was previously uncollectible.
Carol Rose (14:01):
That was Julia Angwin. Now, some of you may already know her or follow her work with ProPublica and The Markup, or even just her Twitter, which I highly recommend. Julia has an uncanny gift for explaining complex things in ways that make them easy to understand.
Julia Angwin (14:17):
And so this conglomeration of data about human behavior that these tech giants have is an asset that really didn't exist before. And they've figured out how to monetize it. Amazon is a very good example of this, right? They started as a place for third parties to sell their wares, basically a platform. And then they decided, we could sell this stuff ourselves. They figured out from this data what the most popular things were, made them themselves, and then boosted them in the rankings so that they're the first thing you see when you search.
Julia Angwin (14:48):
And so data has become this way that you can basically build barriers to entry. Your competitors don't have that much data, so they can't break in. And it also means you have pricing power, because you know what people are willing to pay. You can always just inch it up a little bit. And as you drive out your competitors, you have much more ability to charge higher prices. It also means that you are really untouchable. You said this is an unregulated area, which is always a great thing for capitalism. They prefer to be unregulated. And capitalism also prefers monopolies. And so we have seen these monopolistic behaviors from these companies.
Carol Rose (15:23):
Which feels a bit like big business as usual, except it isn't because instead of selling products, they're selling data about people, about all of us. So the standard understanding of free markets doesn't really apply here because the monopoly isn't about limiting variety or undermining competition so much as it is about the ability to own massive amounts of personal information on us that can be used in any way a buyer wants or the market needs.
Julia Angwin (15:50):
It's a really tragic moment, because at first the impact of this algorithmic surveillance seemed like, oh, it was just going to be creepy ads. Maybe these socks are going to follow you around the internet. What's the harm in that? But the harm is actually much bigger than that. The harm is that we have defunded the watchdogs of democracy, because if there's no press to hold our powerful institutions to account, are we a democracy? If the press has no oversight, then the public has no oversight. And the whole point of democracy is that the public has oversight over the government.
Carol Rose (16:24):
And the lack of transparency makes these systems difficult to understand, let alone regulate.
Julia Angwin (16:29):
I call it data exploitation, but it's commonly known as privacy. And at that time, the Snowden revelations had come out, and I started to feel that people were saying, "I just don't care. It's not going to matter to me." And I thought, I need to shift from the creepiness of the data surveillance to how the data is being used against you, because that's what people need to hear to understand this issue. The only way to really understand what these complex technical systems are doing is to audit them, to basically take a snapshot of what their impact is on the world, who is getting what types of ads or who is getting certain search results, and do a systematic statistical analysis. And show that it's not just an anecdotal thing, that one person happens to be seeing a lot of content of this type, but that in fact the algorithm is systematically creating that for everybody.
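As a rough illustration of the kind of systematic analysis Julia describes, here is a minimal sketch in Python. The exposure counts, group labels, and the choice of a chi-square test are all assumptions for illustration; the point is only that an audit compares exposure rates across groups instead of relying on one person's anecdote.

```python
# Minimal sketch of an algorithmic-audit style comparison (hypothetical numbers).
# Question: do two groups of users see a particular ad category at different rates?
from scipy.stats import chi2_contingency

# Rows: group A, group B; columns: saw the ad category, did not see it.
observed = [
    [320, 680],  # group A: 32% exposure (made-up figure)
    [190, 810],  # group B: 19% exposure (made-up figure)
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2={chi2:.1f}, p={p_value:.2g}")
# A small p-value suggests the difference in exposure is systematic,
# not an anecdote about one user's feed.
```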
Julia Angwin (17:29):
And that's the challenge, because journalists and people who do oversight have tended to focus on individual harm. And this is something where one anecdote isn't enough. Basically, there are no people independently testing these systems. If you were to build an airplane and try to sell it in the commercial space, you would have to get it inspected. The FAA would do an inspection, and you'd have to meet some safety criteria. Same thing if you build a car. These software systems are as complex as cars and planes, or more so, and they have incredible human impact. Yet there is no system of independent oversight. And so, basically, we are all the guinea pigs.
Julia Angwin (18:18):
A classic example of this that everyone has experienced is Facebook. They just change their algorithm, and then all of a sudden you don't see anything from your grandma, and all you see is right-wing content. They just adjust the dial, and then we all have to go, "Whoa, what happened?" Sometimes they realize they've gone too far and dial it back, and sometimes they don't. And so the conversation on oversight is really the public trying to say, "Hey, I want some control over this. Can you adjust it?" But the public doesn't have that much knowledge about what to ask for or how to ask for it. And the companies don't have any real obligation to listen.
Carol Rose (18:59):
So Kade, really haunting stuff from Julia Angwin. I'd love to get your reaction. What do you think?
Kade Crockford (19:06):
Yeah, I was fascinated, because the examples that Julia gave of an airplane or a car are interesting to me in that they differ in one central respect from the services companies like Google and Facebook provide to their users. And that difference is simple, but it's extremely significant. When we buy a car or when we buy a ticket to ride on an airplane, we pay money in order to receive those goods or services. But the entire 21st-century internet is built on a totally different business model. As the saying goes, if you're not paying, you are the product.
Kade Crockford (19:45):
So what does that mean? It means that we can use Google services without paying any money. You can do a Google search or use the Google Maps app, but instead of paying for those services in money, we pay for them with our personal information. And Google makes billions of dollars every year off of that personal information, because it's extremely sensitive. And because it's so sensitive and so revealing, these companies have convinced marketers that it's the best way to peer into our souls and to sell us more stuff more often. So for me, the big questions are, is this a good deal? Is that a fair exchange? Is that how we want the internet to function? I don't know.
Carol Rose (20:29):
And so the question is, is there a way that we can have any personal control over the data that these companies are taking from us and exploiting?
Kade Crockford (20:37):
As Julia was saying, one of the key problems here is that these companies are enormous. They're extremely wealthy, and they do a lot of this stuff in ways that are completely opaque, oftentimes even to regulators. So a big question for me is, even if we have the best possible law in place to limit how these companies can collect and process information about us, how do we enforce it?
Carol Rose (21:05):
Yeah, it's really chilling just to think about the fact that they could track my movements or my face or my keystrokes and all those kinds of things. And I don't even know about it.
Kade Crockford (21:15):
I'm so excited to hear from Tim Hwang because his book is about whether or not this whole data economy that undergirds the 21st century internet is actually producing the value to advertisers that these big companies like Google and Facebook claim that it is. And he has a theory that the whole thing is a house of cards, which I think raises really interesting questions for regulators.
Tim Hwang (21:46):
And this, I think, in some ways is the great debate, right? Because I think that a company like Google would say, "Well, we take the data that we gather to provide more value to you, the user."
Carol Rose (21:58):
So that's Tim Hwang. He's the author of Subprime Attention Crisis. It's a book about online advertising market bubbles. And we're going to talk about that bubble a little later, but let's turn to Tim on the data being gathered in these markets on each of us.
Tim Hwang (22:12):
When you chart a path on Google Maps and it says, "Oh, you got to take these left turns and these right turns to get to a destination," it will now estimate how long it takes to get to a place. Well, how does it figure out whether or not that's the case? Well, it has a lot of people who are using Google Maps that are moving from place to place, and it can use that data to give you recommendations. Right? And then obviously, that's the talking point they use because it's a nice feature. You actually benefit from that.
Carol Rose (22:38):
For anyone who's used a navigation app on their phone, this is a common benefit. You're never lost, at least not in terms of location.
Tim Hwang (22:45):
The negative, of course, is that until fairly recently there were no controls on what you can or can't do with the data. Right? And in effect, some people would argue, and I'm sympathetic to this argument, that that data is actually used to monetize through advertisements. So Google turns around and says, "Well, look, I've got data on this guy, Tim Hwang. He's 25 to 35, living in the Northeast of the United States. Who out there would pay me to get a message in front of him?" And I think some skepticism is right there, about where the loyalties are of a company that's built on that kind of business model, because obviously, it collects data from one place but gets its money from another place.
Carol Rose (23:26):
Which is the idea of data capitalism that we've already begun to tease out, but it doesn't stop there.
Tim Hwang (23:31):
There was another really fascinating case, I think last year where Uber basically said, "Look, we were spending about a hundred million on ads, digital ads, and we stopped it and discovered nothing happened. And our only conclusion is that most of that money was just lost to fraud in the system." And these are sophisticated companies, right? And I think one of the funniest critiques of the book, at least in my opinion, was people who were like, "Well, Tim, people wouldn't put money into the system if it didn't work," but we have all these cases in which people are clearly putting money into the system and it doesn't work. And I think that there are a lot of these structural problems that really bring into question what actually is the value for companies when they put their money in.
Carol Rose (24:11):
Unlike selling tangible products, data capitalism deals in personal information, and how that information gets used, well, that's not really within the market's self-defined purview. But it raises the question: is this the only fraudulent use of data in the marketplace?
Tim Hwang (24:27):
The famous case about this is Cambridge Analytica, where people said, "Oh, well, did you know that people got this data from Facebook? And they were able to do this psychographic analysis of the data and deliver ads that were able to flip the Brexit vote." Well, the UK privacy regulator came out with a report just a few years back saying, "Look, we did a really thorough autopsy of the situation. And while this was an enormous privacy violation, we can find no evidence that these ads had a material effect on the Brexit vote." I think you should be genuinely skeptical when someone says, "You should be worried about the mind-controlling influence of online ads." Because what we really find is a system that is obviously hugely privacy-violative, but in the end maybe kind of incompetent from the point of view of persuasion.
Kade Crockford (25:14):
So let's just back up for a second here, because Tim is referencing Cambridge Analytica, which is a situation that some people might not remember. Prior to the 2016 presidential election, a private company called Cambridge Analytica got its hands on the Facebook data of 90 million people or even more, mostly Americans. And it basically sold a package to the Trump campaign saying, "We can not only develop very sophisticated targeting algorithms for Facebook advertising for your campaign, but we can produce ads that we can target to very narrow demographics of voters in swing states where it counts, to get those people to flip their votes to Donald Trump and to get out and vote for you." And so the Trump campaign actually embedded people from this company, Cambridge Analytica, and we later learned also Facebook employees, who were essentially helping the Trump campaign and Cambridge Analytica send those advertisements.
Kade Crockford (26:30):
Those ads were oftentimes very, very narrowly tailored based on what Cambridge Analytica divined about people from their Facebook profiles and how they used Facebook. The same approach was also used in the years prior, around the Brexit vote in the UK. A variety of journalists have gotten some of the company's people on tape saying, "We won Brexit for the UK. We were able to swing the election in the United States for Donald Trump." Well, it's an open question whether that's actually true or not. And I think one of the things that fascinates me is this question of whether it's possible to use big data to move large groups of people to change their behavior. I think Tim's work indicates that's a conclusion built on shaky, unstable ground, and that we shouldn't necessarily take the word of a private company like Cambridge Analytica as gospel when it obviously has a financial interest in its customers believing that it can perform this kind of magic using big data.
Kade Crockford (27:45):
The question for me, though, is what can result from the collection of all of this information? What harms are actually occurring? Maybe it's not that voters can be swayed to elect a dangerous person, a demagogue like Donald Trump. Maybe it's that that information can be used to target individual immigrants for deportation, or that it can be used by bad actors in states where abortion is criminalized to arrest and prosecute people based on their search histories. So I don't think we should take from this conversation that there's no danger in the online advertising industry, or that the business model of the modern internet does not pose a threat to us. I just think that it's maybe not the threat that we've heard it is from scandals like Cambridge Analytica.
Carol Rose (28:45):
Looking ahead, it's really hard to know how data capitalism, digital advertising, and all the fraud that's involved are actually going to play out. But Tim Hwang has a theory.
Tim Hwang (28:57):
And so, a little bit like subprime mortgages, there was this idea in the finance industry that you can't lose on mortgages. You put money into mortgages, you're going to make money. And it turned out that that was a lot less valuable than people thought, and there was a sudden pop in the market. People panicked. They said, "Oh, what have we been spending money on all this time?" And it caused this rush of money out of the market and created a bunch of ripple effects. And the claim is that the same thing could happen here for online advertising. So we talked a little bit earlier about data capitalism and how a company like Google makes money. And at its root, it's kind of a simple idea: they have attention. They have people who are going to their search engine. And the way they make money is that they go to advertisers and say, "Who would like to get a message in front of these people?"
Tim Hwang (29:41):
The way it works is that people bid. It's a very specific kind of advertising known as programmatic advertising. With programmatic advertising, what happens is basically this: you're about to load a webpage, a little signal goes out to the world, and a bunch of algorithms bid on who gets to deliver a message to, say, me, Tim. The narrative has been that this system of delivering advertising works, and works really well. And in fact, the belief that it has worked so well is one of the reasons why Google and Facebook have grown into some of the most valuable companies in the world. There's a lot of reason to believe that that may not actually be the case, that the amount of money advertisers are putting into these companies is way above what should be justified. One of the ways I think it could happen is that we have a number of privacy laws that are coming into place right now.
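To make the mechanics Tim describes a little more concrete, here is a minimal, hypothetical sketch of a programmatic ad auction in Python. The bidders, prices, and the second-price rule are illustrative assumptions, one common auction design, not a description of any particular ad exchange.

```python
# Minimal sketch of a programmatic ad auction (hypothetical bidders and bids).
# When a page is about to load, an ad request goes out and bidders respond
# with the price they would pay to show this particular user their ad.
from dataclasses import dataclass

@dataclass
class Bid:
    advertiser: str
    price_cpm: float  # price per thousand impressions, in dollars

def run_auction(bids):
    """Pick the highest bidder; charge the runner-up's price (a 'second-price' design)."""
    ranked = sorted(bids, key=lambda b: b.price_cpm, reverse=True)
    winner, runner_up = ranked[0], ranked[1]
    return winner, runner_up.price_cpm

bids = [Bid("shoe_brand", 4.20), Bid("streaming_service", 3.80), Bid("airline", 2.10)]
winner, price = run_auction(bids)
print(f"{winner.advertiser} wins the impression and pays ${price:.2f} CPM")
```

All of this happens automatically, in the milliseconds before the page finishes loading.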
Tim Hwang (30:28):
And the advertisers have said, "Well, don't pass these privacy laws because if I lose access to being able to collect data about this guy, Tim, I won't be able to sell ads anymore, or people won't pay for my ads anymore." And I kind of have this really strange idea that none of that data actually makes the ads work at all. There may be no change at all when these privacy laws come into place from the point of view of the advertiser. And that sort of thing could create a panic, right? Because everybody starts saying, "Okay, what have we been spending money on all these years?" And I think that's kind of the sort of bubble that I'm talking about. But I do believe that after a crash, the kind of internet that I would like to see would just feature a broader diversity of business models.
Tim Hwang (31:08):
And so in some ways my ask is quite modest, and I do believe that there's a lot of room for subscription, actually, returning this relationship to being a much more direct one between a user and the services they use. We haven't really had to have a strong debate over whether or not, for example, having access to a search engine is as important as having access to electricity and water in your home. And that's in part because of advertising. It's been given away for free, so we really haven't had to answer that question. In a world where things were a lot more subscription-based, I think we'd have to confront the good, healthy discussion of basically saying, "What are the services that we think are so key that companies are required to give access to them for equity reasons? Or is this something that we want to tax the companies in order to subsidize access?" And so I think there are a lot of benefits to subscription, in part because it forces this very good and important discussion about who deserves access and how we arrange access from a democratic standpoint.
Carol Rose (32:09):
So Tim's book about this bubble, Subprime Attention Crisis, goes into a lot more detail, of course. And while it's hard to say whether his predictions are accurate, I think it's safe to assume that any system moving massive amounts of cash that can't prove it actually does anything shouldn't last forever, if only because at some point the law, we hope, will catch up. Now, there's a lot more to unpack here about the legal and legislative actions we could pursue. But first, we're going to take a short break to hear from Jacob Snow, my ACLU colleague in Northern California, the home of big tech, on what he's thinking and doing about big data and how Silicon Valley has changed our world.
Jacob Snow (32:52):
We don't read terms and conditions. We don't read privacy policies. So much of what is called consent online is really unwitting consent or coerced consent or incapacitated consent. And so I think there are some circumstances where transparency can be effective, but in many cases, that transparency relies on a vision of people having the ability to consent that is really not real.
Jacob Snow (33:22):
My name is Jake Snow. I'm a senior staff attorney at the ACLU of Northern California. And I work in the Technology and Civil Liberties program at the ACLU of Northern California. And that program works, generally speaking, on kind of three buckets of substantive work. We look at government surveillance, consumer privacy, and free speech online. I'm using Chrome to speak with you right now. And the reason that I'm using Chrome is that Chrome is the only browser that's supported by this platform. And that means that even if I didn't want to use Chrome because of the privacy consequences of it, I wouldn't have an alternative. So that operates in small ways and big ways as we interact with technology.
Jacob Snow (34:03):
To take one example, we hear a lot from governments about the necessity of public safety, but the fact is that when information is collected about people, sometimes that information is used to separate families by the immigration authorities. And that is a consequence for the lived experience of people that is truly disastrous and has nothing to do with public safety. And so I think a lot of skepticism about the justifications that are offered by immensely powerful entities like corporations and governments is appropriate. And that's why strong legal protections are necessary and strong enforcement of those legal protections for every person who's impacted by them is necessary as well.
Jacob Snow (34:45):
Back in 2018, the ACLU of Northern California released the results of an investigation showing that Amazon was selling facial recognition to the police, to the Washington County Sheriff's Office in Oregon and to Orlando, Florida. And we put Rekognition to the test: we took a public database of 25,000 mugshots and searched for every current member of Congress in that set of mugshots of people who'd been arrested. And we found that 28 members of Congress incorrectly matched, and it was disproportionately people of color in that set of matches. And so we spoke out about that, and it was something that I think was able to communicate to lawmakers, people who are in power who maybe aren't very worried about the possibility that they would be brought into the criminal legal system, but it highlighted for them the risks of facial recognition technology.
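For a sense of what that kind of test looks like mechanically, here is a small, hypothetical sketch using Amazon's Rekognition API via boto3. The collection name, file name, and threshold are placeholders, and this is not the ACLU's actual code; it only illustrates searching a face collection built from mugshot photos for matches to a known person's photo. Running it would require AWS credentials and a pre-built collection.

```python
# Illustrative sketch (not the ACLU's actual code) of a false-match test:
# search a face collection built from mugshots for matches to a known person.
import boto3

rekognition = boto3.client("rekognition")

def search_for_match(collection_id, photo_path, threshold=80.0):
    """Return any faces in the collection that Rekognition matches to this photo."""
    with open(photo_path, "rb") as f:
        response = rekognition.search_faces_by_image(
            CollectionId=collection_id,
            Image={"Bytes": f.read()},
            FaceMatchThreshold=threshold,
            MaxFaces=5,
        )
    return response["FaceMatches"]

# Hypothetical usage: "mugshots" is a pre-built collection of arrest photos, and
# the query image is a legislator's official portrait, so any hit is a false match.
matches = search_for_match("mugshots", "member_of_congress.jpg")
print(f"{len(matches)} false match(es) returned at this confidence threshold")
```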
Jacob Snow (35:35):
The criteria that people should be applying when they look at surveillance technology, at technology in general, at the artificial intelligence and machine learning technology that we're seeing so much of, are whether those technologies empower people and protect people with strong rights, and whether those rights are enforceable in a way that scales against the largest and most powerful companies and governments in the world. And I think that we all need to take a hard look at some of the claims that might be made by those powerful companies and governments with respect to their justifications for why all this information is needed, and with respect to the purported benefits of collecting all of that information.
Carol Rose (36:26):
Jacob is exactly right. The key question we should be asking in a democracy is whether, and how, various technologies and the systems that surround those technologies either empower people or disempower them. It also raises what I think of as the paradox of the lawmaker's excuse to do nothing. Too often lawmakers tell us it's too early to regulate a given technology, or they tell us it's too late to regulate a given technology. But in truth, it's neither too early nor too late. Now is the perfect time to have this public conversation about how technology can be used in the service of liberty and democracy, and how we can pass laws to prevent the misuse of technology in ways that threaten our democracy, our liberties, and ultimately our species.
Carol Rose (37:14):
So far in this podcast, we've defined some fairly complex ideas: data capitalism, various aspects of digital advertising, and how some of our most private information can be collected and used with or without our consent. And as we continue our journey together, we'll unlock even more ideas, hearing from people who are taking action in our communities to defend civil rights and civil liberties in the digital age.
Carol Rose (37:40):
So join us next time on Freedom Unfinished as we explore the modern surveillance landscape, who owns and uses our data, how the police and immigration authorities use the excuse of public safety while violating civil rights, and what artificial intelligence, machine learning, and face surveillance technology mean to our democracy and our liberty. Until then, I'm Carol Rose.
Kade Crockford (38:04):
And I'm Kade Crockford.
Carol Rose (38:06):
And this is season one of the ACLU of Massachusetts Freedom Unfinished: Decoding Oppression.
Carol Rose (38:16):
Freedom Unfinished is a joint production of the ACLU of Massachusetts and Gusto, a Matter company, hosted by me, Carol Rose, and my colleague at the ACLU of Massachusetts Technology for Liberty program, Kade Crockford. Our producer is Jeanette Harris-Courts, with support from David Riemer and Beth York. Shaw Flick helped us develop and write the podcast, while Mandy Lawson and Jeanette Harris-Courts put it all together. Art and audiograms by Kyle Faneuff. And our theme music was composed by Ivanna Cuesta Gonzalez, who came to us from the Institute for Jazz and Gender Justice at Berklee College of Music. We couldn't have done this without the support of John Ward, Rose Aleman, Tim Bradley, Larry Carpman, Sam Spencer, and the board of directors here at the ACLU of Massachusetts, as well as our national and state ACLU affiliates.
Carol Rose (39:07):
Find and follow all of season one of Freedom Unfinished: Decoding Oppression wherever you get your podcasts. And keep the conversation going with us on social. Thanks to all of our guests and contributors, and thanks to you for taking the time to listen. It's not too late to mobilize our collective willingness to act and to ensure that technology is used to enhance rather than diminish freedom. See the show notes to discover ways to get involved. And always remember to vote and not just nationally, but locally too. Together, we can do this.