

Pregnant Woman’s False Arrest in Detroit Shows “Racism Gets Embedded” in Facial Recognition Technology


Image Credit: Law Offices of Ivan L. Land, P.C. (photo right)

A shocking story of wrongful arrest in Detroit has renewed scrutiny of how facial recognition software is being deployed by police departments, despite major flaws in the technology. Porcha Woodruff was arrested in February when police showed up at her house accusing her of robbery and carjacking. Woodruff, who was eight months pregnant at the time, insisted she had nothing to do with the crime, but police detained her for 11 hours, during which time she had contractions. She was eventually released on a $100,000 bond before prosecutors dropped the case a month later, admitting that her arrest was based in part on a false facial recognition match. Woodruff is the sixth known person to be falsely accused of a crime because of facial recognition, and all six victims have been Black. “That’s not an accident,” says Dorothy Roberts, director of the University of Pennsylvania Program on Race, Science and Society, who says new technology often reflects societal biases when built atop flawed systems. “Racism gets embedded into the technologies.”

This is a rush transcript. Copy may not be in its final form.

AMY GOODMAN: Professor Roberts, I wanted to end by asking you about this shocking story out of Detroit, Michigan, involving a woman named Porcha Woodruff. She was eight months pregnant when police arrested her at her door for robbery and carjacking. Six officers showed up at her home as she was getting her daughters ready for school. She was held for 11 hours, released on a $100,000 bond. She says she started having contractions in jail, had to be taken to the hospital after release due to dehydration. A month later, prosecutors dropped the case because the Detroit police had made the arrest based on a faulty facial recognition match. According to the ACLU, Woodruff is at least the sixth person to report being falsely accused of a crime as a result of facial recognition technology — all six people Black. Porcha Woodruff is now suing the city of Detroit.

The New York Times had a major story on this, saying, “Porcha Woodruff thought the police who showed up at her door to arrest her for carjacking were joking. She is the first woman known to be wrongfully accused as a result of facial recognition technology.” She was 32 years old. “They asked her to step outside because she was under arrest for robbery and carjacking.” She looked at them. She pointed to her stomach. She was eight months pregnant. And she said, “Are you kidding?”

Professor Roberts, can you talk about the significance of this and what she went through in that last month of pregnancy?

DOROTHY ROBERTS: This story captures so much of what we’ve been talking about, so much about the devaluation of Black people’s lives, Black women’s lives, and the way in which these deep myths about Black biological difference and inferiority, and the need for regulation and surveillance, get embedded into technologies. They’re embedded in medical technologies. They’re embedded in policing technologies. They’re embedded in artificial intelligence algorithms and predictive analytics.

And so, just one piece of this is the fact that the six cases we know of false arrest based on faulty AI facial recognition all involve Black people. Now, that’s not an accident. That’s because racism gets embedded into the technologies. It’s in the databases, because the databases are based on prior police arrests or police action, which we know is racially biased or targeted at Black people. And so the data itself gets embedded with racism. The way in which algorithms are created carries assumptions that are racist. With facial recognition, the way in which the recognition technology is created makes it more likely to misidentify Black faces. All of this has been shown in research. So, there’s this idea that AI is going to be more objective than the biased decision-making of judges and police and prosecutors, but if it embeds prior biased decisions, it’s going to produce these oppressive outcomes. And also, if it’s being used by police departments that are racist, it’s going to be used in racist ways.

And that gets me to the next point, which is the way in which she was treated. She, as an obviously eight-month-pregnant woman, was treated cruelly and inhumanely by these police officers, which reflects the way in which police interact with Black communities in general, but also the devaluation of Black women’s childbearing — again, back to this point we started out with — the devaluation of the autonomy, the worth, the humanity of Black women. And a key aspect of that, in fact, a key aspect of the subjugation of Black people in general, has been the devaluation of Black childbearing: the idea that Black women pass down negative, depraved, antisocial traits to their children — sometimes it’s stated almost in biological terms. And that devaluation of Black women, especially in terms of their childbearing, is part of the basis for reproductive servitude, which we were talking about earlier, but also part of the reason why Black women are three times more likely than white women in America to die from pregnancy-related causes — maternal mortality.

So, this one incident reveals this deeply entangled way in which carceral systems in America rely — rely — on this myth of biological race and innate inferiority of Black people, which is so deeply embedded that many people just take it for granted. It’s —

AMY GOODMAN: Professor Roberts, we’re going to continue this discussion with our next guest.


AMY GOODMAN: And we thank you so much for being with us. Professor Dorothy Roberts is director of the University of Pennsylvania Program on Race, Science and Society, author of a number of books, including Torn Apart: How the Child Welfare System Destroys Black Families — and How Abolition Can Build a Safer World.

Next up, we’ll speak with the pioneering legal scholar Kimberlé Crenshaw, author of the new book, #SayHerName: Black Women’s Stories of Police Violence and Public Silence. Stay with us.

The original content of this program is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License. Please attribute legal copies of this work to democracynow.org. Some of the work(s) that this program incorporates, however, may be separately licensed. For further information or additional permissions, contact us.

