Project 8 Podcast

When Technology Fails: The Troubling Case of False Arrests

August 06, 2023 | Steve | Season 1, Episode 4

Could you imagine being arrested, accused of a crime you didn't commit, all because of an incorrect match from facial recognition technology? That's exactly what happened to Porcha Woodruff. Pregnant and innocent, Porcha was dragged from her home and subjected to 11 harrowing hours of detention due to this potentially rights-violating technology. We delve into her gut-wrenching tale and explore the wider implications of such incidents occurring, not just to Porcha, but to five other Black individuals as well.

We'll also challenge the reliance on automated facial recognition by the police force in suspect identification and interrogate the possible dangers of solely depending on this method. Our conversation shines a light on the toll this ordeal took on Porcha, emotionally and physically, and how it affected her family. But we're not stopping there. As we navigate the murky waters of technology and security, we also look ahead at upcoming features of our podcast. Spoiler alert: If you're a rising music artist, there's an exciting opportunity for you to be featured on our show. Listen in, and let's engage in this essential conversation about our rights and the potential infringements in the pursuit of security.

Support the Show.

Speaker 1:

Hello, and welcome to the Project 8 Podcast, Steve Man, your host. Just a couple of things before we get started. If you would like to donate, just scan the QR code at the top of the screen, and we'll kindly accept your donation. And if you do like the content of the podcast, give us a like and share; we appreciate that too. If you're a new artist, band, or musician of any type and would like to get on the podcast, just send me an email at steve.man@project8podcast.com and we'll get back to you shortly. Always interested in meeting new people and having up-and-coming artists on the podcast.

Speaker 1:

But anyway, I was looking through the internet today, and one of the reasons why I wanted to hop on is that I saw an article in The New York Times by Kashmir Hill: a woman, eight months pregnant, arrested after a false facial recognition match. We can't rely on tech that much. I mean, we're being spied on by the traffic cameras, the cameras outside. Does facial recognition really work? Obviously there are some bugs that need to be worked out in it. It's just sad, it is. But this is her story. It's Porcha Woodruff. I hope I didn't chop up that name; sorry if I did. She was getting her two daughters ready for school when six police officers showed up at her door in Detroit. They asked her to step outside because she was under arrest for robbery and carjacking. Porcha responded, "Are you kidding me?" she recalled saying to the officers. Ms. Woodruff, 32 years old, said she gestured at her stomach to indicate how ill-equipped she was to commit such a crime, being eight months pregnant. Me, eight months pregnant? I mean, come on.

Speaker 1:

Handcuffed in front of her home that Thursday morning, leaving her crying children with her fiancé, Ms. Woodruff was taken to the Detroit Detention Center. She said that she was held for 11 hours, questioned about a crime she had no knowledge of, and had her iPhone seized to be searched for evidence. Why would they need her iPhone? Maybe to see if she was en route, but clearly she's eight months pregnant; she didn't do it, right? Porcha stated she was having contractions in the holding cell; her back was sending her sharp pains and she was having spasms. "I think I was probably having a panic attack," she said. Ms. Woodruff, a licensed aesthetician and nursing school student, said she was hurting, sitting on those concrete benches.

Speaker 1:

After being charged in court with robbery and carjacking, Ms. Woodruff was released that evening on a $100,000 personal bond. In an interview, she said she went straight to the hospital, where she was diagnosed with dehydration and given two bags of intravenous fluids. A month later, two weeks before giving birth to her son, the Wayne County prosecutor dismissed the case against her.

Speaker 1:

The ordeal started with an automated facial recognition search, according to an investigative report from the Detroit Police Department. Ms. Woodruff is the sixth person to report being falsely accused of a crime as a result of facial recognition technology used by police to match an unknown offender's face to a database. All six people have been Black. Ms. Woodruff is the first woman to report it happening to her. Everybody's different. Everybody has different eyes, different facial features. Yeah, we've got doppelgängers out there that we may resemble, or they may resemble us, but for the most part, you know, you are who you are. This facial recognition technology needs to be revamped if it's going to be used in court at all.

Speaker 1:

And is it a violation of your rights? You're walking down the street, you get tagged by one of these things, and the next thing you know your door's being knocked on, saying you committed a crime, just like this young woman here. That ain't right. I mean, six people were falsely accused, and they all happened to be Black. I feel sorry for them, because obviously it wasn't them. This woman was eight months pregnant; obviously she's not going to be carjacking anybody. Plus, she's going to school; I think it said in there that she was a licensed aesthetician and a nursing student. Okay, so she's a professional. So I don't know why the police would even think this was the person who committed the crime.

Speaker 1:

Just the technology of these days, it's a far reach. I think a lot of it is unconstitutional. It should be banned. I know in certain states traffic cameras were put up and then ruled unconstitutional and taken down. We don't need to be China, you know; let China do all that shit over there. We don't need it here. We're the land of the free and the home of the brave. We don't need that stuff. That's what warrants are for. If there's evidence of somebody committing a crime, issue a warrant, and then we'll have a discussion about whether they did it or not. It just seems we're turning to AI, turning to all this surveillance stuff. We're constantly being looked at by Big Brother. Our phones are monitored. I mean, we can't even walk into our own homes without being recorded. This poor woman is a victim of misidentification here.

Speaker 1:

It said that this was her, but evidently it wasn't. So they go on, continuing from where the charges were dropped: hers is the third case involving the Detroit Police Department, which runs an average of 125 facial recognition searches a year, almost entirely on Black men, according to weekly reports about the technology's use provided by the police to the Board of Police Commissioners, a civilian oversight group. Critics of the technology say the cases expose its weaknesses and the dangers it poses to innocent people. That's what I just said, right? "The Detroit Police Department is an agency that has every reason to know the risks of using face recognition," said Clare Garvie, an expert on the technology at the National Association of Criminal Defense Lawyers. "And it's happening anyway."

Speaker 1:

On Thursday, Ms. Woodruff filed a lawsuit for wrongful arrest against the city of Detroit in the U.S. District Court for the Eastern District of Michigan. Yeah, good for her. She was wrongly arrested over facial recognition saying that this was her. The Wayne County prosecutor, Kym Worthy, considers the arrest warrant in Ms. Woodruff's case to be appropriate based upon the facts, according to a statement issued by her office. Now here's the investigation.

Speaker 1:

On a Sunday night, two and a half weeks before police showed up at Ms. Woodruff's door, a 25-year-old man called the Detroit police from a liquor store to report that he had been robbed at gunpoint, according to a police report included in Ms. Woodruff's lawsuit. The robbery victim told police that he had picked up a woman on the street earlier that day. He told police that they had been drinking together in his car, first in a liquor store parking lot, where they engaged in some sexual manner, and then at a BP gas station. When he dropped her off at a location about 10 minutes away, a man there with a gun took the victim's wallet and cell phone and fled in the victim's Chevy Malibu, according to the police report. Now here's the thing, okay: the victim would have seen that Ms. Woodruff was pregnant, if that was her, right? I mean, that would be stated in the report.

Speaker 1:

As far as the description goes, I know when stuff happens to victims it's just go, go, go, everything's flashing in your mind, and you don't get every detail. But when you're a victim of a crime, just stop and think of every detail. Play it back like a movie before answering any questions. Write it down on a piece of paper: this is what I saw. Okay, and in this case, if she was pregnant, he would have said she was pregnant. I mean, she was eight months pregnant. I have six children, okay, and I know a woman when she's pregnant. Just saying. I mean, I've seen a picture of this woman. She looks average, you know, average size. So it's not like she's a big woman. But if she's pregnant, you're going to be able to tell; it's not that she's just out of shape, she is 100% pregnant. So again, if you're a victim of a crime, take a deep breath, go back, and write down every detail of what happened. That way the police can get the right person. Okay, I don't trust this facial recognition technology at all, and I don't think it should be used at all. Again, I think it's unconstitutional.

Speaker 1:

Days later, the police arrested a man driving the stolen vehicle. A woman who matched the description given by the victim had dropped off his phone at the same BP gas station, the police report said. Wait a minute, let's go back there. A woman who matched the description given by the victim dropped off his phone at the same BP gas station. If somebody stole your car and your phone was in there, I don't think they're going to go out of their way to drop your phone off at a gas station. Am I reading this right? I don't know, man, this whole story stinks. A detective with the police department's commercial auto theft unit got surveillance video from the BP gas station.

Speaker 1:

The police report said the detective asked a crime analyst at the department to run a facial recognition search on the woman. According to city documents, the department uses a facial recognition vendor called DataWorks Plus to run unknown faces against a database of criminal mug shots. The system returns matches ranked by likelihood of being the same person. A human analyst is ultimately responsible for deciding if any of the matches are a potential suspect. So we have human error in this story as well, right? The facial recognition comes up and says, hey, these are the likely matches, but it's up to the operator to ultimately determine if the matches are the suspect or not. Minority Report, anybody? I mean, not the same story, but you know, in the movie the precogs witness the event and then the analysts have to interpret everything. This is human error with some technology. But I'll continue reading this story. We all make mistakes. The police report said the crime analyst gave investigators Ms. Woodruff's name based on a match to a 2015 mug shot.
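
(Editor's note: for readers curious what the ranked search described here looks like in practice, below is a minimal, hypothetical Python sketch. It is not DataWorks Plus's actual system; the embeddings, names, and scores are illustrative assumptions only. The structural point matches the report: the software only ranks likely matches, and a human analyst makes the final call.)

```python
import numpy as np

def rank_candidates(probe: np.ndarray, gallery: dict, top_k: int = 5) -> list:
    """Rank mug-shot IDs by cosine similarity to the probe face embedding.

    The system only ranks likely matches; it never declares identity.
    """
    probe = probe / np.linalg.norm(probe)
    scores = {}
    for mugshot_id, emb in gallery.items():
        scores[mugshot_id] = float(np.dot(probe, emb / np.linalg.norm(emb)))
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_k]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Fake 128-dimensional embeddings standing in for a mug-shot database.
    gallery = {f"mugshot_{i:04d}": rng.normal(size=128) for i in range(1_000)}
    probe = rng.normal(size=128)  # the unknown face from surveillance video

    for mugshot_id, score in rank_candidates(probe, gallery):
        print(f"{mugshot_id}: similarity {score:+.3f}")
    # The top score is only a lead for a human analyst to review -- that
    # human-in-the-loop step is exactly where the error entered in this case.
```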

Speaker 1:

Ms. Woodruff said in an interview that she had been arrested in 2015 after being pulled over while driving with an expired license. She was arrested for having an expired license? Where I'm from, you'd get a ticket for that: get your license straightened out, take it to court with the ticket, and that's it. You don't get arrested for it. I don't know what goes on up there in Michigan, but to get arrested for an expired license, again, I don't know the laws up there, but that seems a little extreme.

Speaker 1:

So Gary Wells, a psychology professor who has studied the reliability of eyewitness identifications, said pairing facial recognition technology with an eyewitness identification should not be the basis for charging someone with a crime. Hello, I like Gary Wells already. I mean, it makes sense, right? Even if that similar-looking person is innocent, an eyewitness who is asked to make the same comparison is likely to repeat the mistake made by the computer. Exactly. It's like, yeah, yeah, that's her, that's her, when they could just look similar. Again, we all have a doppelgänger out there, right? When I was younger, people used to say I looked like Ben Affleck. Never saw it myself. That guy's handsome; me, not so much. But stuff like that. "It is circular and dangerous," Dr. Wells said. "You've got a very powerful tool that, if it searches enough faces, will always yield people who look like the person in the surveillance image."
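
(Editor's note: Dr. Wells's point is easy to demonstrate numerically. The toy simulation below, a sketch under the assumption that face embeddings behave like random high-dimensional vectors, shows the best match score climbing steadily as the gallery grows: a bigger database guarantees more convincing lookalikes, not better evidence.)

```python
import numpy as np

def best_match_similarity(gallery_size: int, dim: int = 128,
                          rng=None, chunk: int = 100_000) -> float:
    """Highest cosine similarity between one probe face and a random gallery."""
    rng = rng or np.random.default_rng()
    probe = rng.normal(size=dim)
    probe /= np.linalg.norm(probe)
    best = -1.0
    remaining = gallery_size
    while remaining > 0:  # generate the gallery in chunks to bound memory use
        n = min(chunk, remaining)
        faces = rng.normal(size=(n, dim))
        faces /= np.linalg.norm(faces, axis=1, keepdims=True)
        best = max(best, float(np.max(faces @ probe)))
        remaining -= n
    return best

rng = np.random.default_rng(42)
for n in (100, 10_000, 1_000_000):
    print(f"gallery of {n:>9,} faces: "
          f"best similarity {best_match_similarity(n, rng=rng):.3f}")
# Nobody in the gallery is the probe, yet the top score keeps rising with
# gallery size -- exactly the "always yields lookalikes" effect Wells describes.
```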

Speaker 1:

Dr. Wells said the technology compounds an existing problem with eyewitnesses. "They assume when you show them a six-pack, the real person is there," he said. So I'm guessing a six-pack is six photos, and the witness assumes the real person is in there. I'm not sure exactly what they're getting at there. Anyway: serious consequences. Excuse me, sorry. Serious consequences.

Speaker 1:

The city of Detroit faces three lawsuits for wrongful arrest based on the use of this technology. "Shoddy technology makes shoddy investigations, and police assurances that they will conduct serious investigations do not ring true," said Phil Mayor, a senior staff attorney at the American Civil Liberties Union of Michigan. Yeah, I mean, what happened to old-fashioned police work? You have a photo of the person, and you go around and ask witnesses: hey, did you see this person at this time of day? Is this them? Instead, this facial recognition technology is going to get a lot of people in trouble. There are going to be a lot of lawsuits. I mean, this one city has three lawsuits already as of this story.

Speaker 1:

Ms. Woodruff said she was stressed for the rest of her pregnancy. She had to go to the police station the next day to retrieve her phone, and she appeared in court hearings twice by Zoom before the case was dismissed for insufficient evidence. This woman is pregnant, she's under stress thinking she's going to go to jail for a crime she didn't commit, and there were two court hearings. Yeah, two court hearings by Zoom. So instead of just releasing her with her phone, they inconvenienced her: go retrieve your phone the next day, then two Zoom hearings to plead her innocence. I mean, it's scary. This is what somebody said: "It's scary. I'm worried. Someone always looks like someone else." I just said it, right? That was her attorney, Ivan Land.

Speaker 1:

"Facial recognition is just an investigative tool. If you get a hit, do your job and go further. Knock on her door." Exactly. Good. Don't just rely on the facial recognition and say, okay, we got our person, this is our person right here. That's not the way to go. It can give you a similarity, and then you can go investigate. But good old-fashioned police work, right? A good detective would find his or her perpetrator.

Speaker 1:

Ms. Woodruff said that she was embarrassed to be arrested in front of her neighbors and daughters. She was embarrassed being arrested in front of her neighbors, and her daughters were traumatized. They now tease her infant son that he was in jail before he was born. The experience was all the more difficult because she was so far along in her pregnancy. But Ms. Woodruff said she feels lucky that she was: she thinks it convinced authorities that she did not commit the crime.

Speaker 1:

The woman involved in the carjacking had not been visibly pregnant. Again, back to the eyewitness, okay: he would have said she was a pregnant woman if Ms. Woodruff were the actual perpetrator here. But she wasn't; she was wrongfully accused. And I picked this story out because of the facial recognition technology, and because this woman was wrongfully accused. Okay? A pregnant woman, wrongfully accused. So it goes back to the technology.

Speaker 1:

You know, we have technology all over the place; it's in our lives 24 hours a day, seven days a week. We just need to put the phones down for a minute and think, and get back to human-to-human interaction instead of texting each other. I was sitting in a restaurant, waiting to be seated, and I was looking over at a couple of teens, and they were talking to each other over text, then looking up at each other. I mean, if you want to talk, talk; you don't have to talk through your devices. You know, I send my kids to school because I want that public interaction. I want them to have social skills. I don't want them sitting on the phone, sitting in a game 24 hours a day, seven days a week, playing with their friends. So we can't rely on technology to do our everyday activities.

Speaker 1:

But with the startup of AI, I mean, if this was AI surveillance, would you get the same outcome? I think if this had been AI-driven facial recognition technology, she would have been falsely accused again. AI is getting big. We're getting to a point where the machines are going to be replacing us, I do believe. Call me crazy, call me a conspiracy theorist, but we're putting a lot of stock into this stuff, man. A lot of stock. I wish we could go back to just having a regular phone, whether it be a Nokia: you know, hit the number three key to type whatever; I can't even remember anymore. But technology is great to a point. I mean, in the auto industry it helps diagnose cars and stuff like that; the mechanic replaces whatever part the computer's error codes point to. I'm not 100% against technology, I just think we need to use it in a smart, responsible way.

Speaker 1:

I really don't have a problem with the smartphone, except that we rely too much on it to communicate with a person five feet away. You're not interacting with people; you're just sitting on your phone scrolling, scrolling, scrolling for entertainment when the person's right next to you. I'm guilty of it. My wife and I could be lying in bed talking, and then all of a sudden she picks up her phone and starts playing whatever game she's playing, and I pick up my phone and play whatever game I'm playing, or read a book or something. It happens. So what we do is spend our time watching TV together: we find a show to stream, and we stream it together. Nobody's on their phones, and we talk about it between episodes. So, you know, technology's good, but we need to slow down a little bit.

Speaker 1:

This poor woman, portia Woodruff I mean she got a bad rap from the facial recognition and I hope she gets a good settlement in this lawsuit for damages and stress and thank God that she was able to have that baby. I mean, can you imagine if she was stressed out enough and lost that baby, I mean, just by the mistaken identity from a facial recognition software? But then again, hold on now. There was a lot of people who were like, hold on now. There was also another human aspect in that, where the technician of had to go back and verify and say, hey, yes, this is the suspect. So I did pick that story out today because it crossed my mind when I was looking at the New York Times and they had everything else the Putin forever war and stuff in there about Biden and all that. But I wanted to give this young woman, portia Woodruff, some press on this because it's well deserved. I mean that facial recognition caused a lot of pain in her life and a lot of unnecessary aggravation and an embarrassment.

Speaker 1:

So if you liked the podcast today, give us a like and share. If you would like to donate, you can go to our website at www.project8podcast.com, scroll down to the bottom, and hit that donate button. I don't have any sponsors; I don't do the sponsor thing right now. I just want to get out there, get the stories out, and get some press on what I think is important.

Speaker 1:

This is Sunday. Normally I try to do podcasts every Saturday night, but we had an event yesterday that we had to go to, and I wasn't able to do it. But I hope you guys are enjoying the content, enjoying this ugly mug talking to you, hopefully entertaining you and getting some points across about what's going on today. Excuse me, sorry, I've got a little frog in my throat, but I hope you like it. Please leave a comment; be kind.

Speaker 1:

This is my fourth episode, so it's not like it's episode 1,001 and I'm still talking like this. Like I said, I'm new at this. I enjoy it, I really do. I'm a little nervous at times, but I really love it, and I like getting the stories out there and sharing different people's experiences. Hopefully, very soon we will have some up-and-coming musical artists on to share with you. So if you're interested, if you're a new band or just a solo guy or gal and want to get some exposure, hit me up by email at steve.man@project8podcast.com. Okay, I look forward to hearing from you guys on that. And so, with that said, God bless America, God bless every one of you, and we'll see you real soon.

Chapter Markers:
Facial Recognition Technology's False Arrests
Facial Recognition Technology and Wrongful Arrests
Podcast Update and Future Guests