
Social Media Addiction: Facebook Whistleblower Says Big Tech Has Known & Ignored Problem for Years


We continue our conversation with attorney Laura Marquez-Garrett and victim advocates Lori Schott and Lennon Torres about their fight to hold tech giants accountable for the damaging and even deadly effects of social media addiction on children and young adults. We’re also joined by Frances Haugen, a former Facebook employee who blew the whistle on several of the company’s harmful and manipulative practices in 2021. Haugen says mega-rich tech “oligarchs” like Mark Zuckerberg cared about teenagers only as people who could bring others onto the platform. “They worried about public perception, not the actual health of the kids,” says Haugen, adding that companies like Zuckerberg’s Facebook “underinvested in the safety of children,” ignoring years of warnings about the psychological impacts of their products on child development in favor of “optimiz[ing] for spending more and more time on these platforms.”

Transcript
This is a rush transcript. Copy may not be in its final form.

AMY GOODMAN: This is Democracy Now!, democracynow.org, The War and Peace Report. I’m Amy Goodman, with Nermeen Shaikh.

NERMEEN SHAIKH: We’re continuing to look at the landmark trial about the harms of youth social media addiction taking place in Los Angeles. Meta CEO Mark Zuckerberg testified on Wednesday. Meta is the parent company of Facebook and Instagram.

AMY GOODMAN: We’re also joined by Facebook whistleblower Frances Haugen. In 2021, she turned over tens of thousands of pages of internal Facebook documents to U.S. regulators and to The Wall Street Journal, which became the basis of a damning series of reports called “The Facebook Papers.” Her memoir is titled The Power of One: How I Found the Strength to Tell the Truth and Why I Blew the Whistle on Facebook.

Frances, yesterday, yes, Meta CEO, billionaire Facebook founder Mark Zuckerberg was grilled about Instagram’s effect on the mental health of young users. What struck you most about what he had to say? What was true? And do you believe he told the truth?

FRANCES HAUGEN: I don’t know if you’ve noticed over the last couple years that he’s shown up on a number of podcasts where he often gets tossed pretty softballs. What was remarkable about reading the coverage of Mark’s testimony yesterday was how different it is when the person sitting across from him asking the questions gets to ask hard follow-up questions. You know, he’s bragged before on other podcasts about not fearing ever being fired, because of his stock ownership in the company — he owns the majority of voting shares or the majority of the votes; he could just put in a new board if they tried to fire him. You know, having to actually have a meaningful conversation about what is the implication of that level of extreme control — he’s never had to sit there and squirm while being asked questions like that.

NERMEEN SHAIKH: And, Frances, just tell us — you’re a former Facebook whistleblower and advocate for social media transparency. What did you learn being within Facebook? What did the companies know about the addictive potential of their algorithms and content?

FRANCES HAUGEN: So, within the child safety space, you could say there’s kind of two major areas. One is the well-being of kids. And I would say that’s where work on addiction falls. And then I would say child safety, like dealing with predators, dealing with people who distribute child abuse material. I worked within threat intelligence, which was the — you know, they’re the nitty-gritty, hands-on investigators going in there and hunting individual people who were running scams, running terrorist organizations. My team worked on counterespionage.

So, the closest I came to seeing firsthand how Facebook underinvested in the safety of children was that the team that was responsible for finding people who were distributing child abuse material, or looking for adults that preyed on children, or finding these marketplaces where adults solicit nude photos of children from the children directly — that team was so strapped for resources that if you had given them a single engineer more, they probably would have accomplished 10 times as much. That’s the attitude I saw repeated at Facebook firsthand.

The documents I brought out, though, painted an even broader picture. So, while I saw things that made me uncomfortable when it came to kids, what the documents of Facebook showed was that Facebook viewed them very instrumentally. You know, they cared about the role of teenagers in bringing in their younger siblings, bringing their parents onto the platform. They worried about public perception — not the actual health of the kids, but the perception that these products might be addictive. And what we know now that these documents have been brought forward because of the court cases is that there were lots and lots of experiments being run to make these products safer. They knew that the kids said these changes, things like don’t send me an alert in the middle of the night, made kids less stressed, let them sleep better. And yet they didn’t launch them, because it also made them use Instagram 1% less.

AMY GOODMAN: So, what did these companies know, and when did they know it? I mean, you have Zuckerberg testifying. This is under oath. Last week, after Instagram CEO Adam Mosseri spoke on the witness stand, he pushed back on the science behind social media addiction by denying users could be clinically addicted. Can you talk about that and the case that’s at the heart of the trial, the 20-year-old woman known as “K.G.M.” who says her addiction to using YouTube and Instagram worsened her depression and suicidal thoughts?

FRANCES HAUGEN: So, this is one of these little turns of phrase that I think is a wonderful illustration of how Meta, Facebook, Instagram have gotten very, very good at speaking in a very precise style, where factually you can defend what they say. So, addiction is a medical term. So, if I get addicted to painkillers, if I get addicted to cigarettes — like, going off of cigarettes, because your nervous system has become dependent on nicotine, you go through very intense physiological symptoms. Or you go off painkillers, you’re going to literally be vomiting, right? When kids stop using social media, if you isolate that child, you do see things that are indicative of their brain chemistry changing. You know, they’re used to tons and tons of dopamine. Now they don’t have that stimulation. But I think any parent who gives their kids a lot of screen time and then tries to have them sit still for dinner or, like, makes them go on family vacation and leave their phone behind sees that behavior change.

But from a medical standpoint, that behavioral dependence is not considered medically to be addiction. But when you come in there and downplay what happens with compulsive use, which is the scientific term of art around these things, it really downplays how having a generation of children who get hooked at 7, 8, 9 — because, remember, 30% of 7-to-9-year-olds were on social media as recently as 2022. Imagine what it is today.

When you have kids whose brains are being just bathed in dopamine from scrolling all day at such a young age, it changes how — their ability to sit still in class, to interact meaningfully face to face with their family or friends. And you see this from a growing number of reports from teachers who say, “I’ve taught for 20 years, and I don’t understand what’s going on. The kids who come into my seventh grade class have never behaved like this before.” So, that’s kind of the consequences that we’re living with of having — as the court documents show, they’ve been getting warned about this for 10-plus years, and yet they continue to optimize for spending more and more time on these platforms.

AMY GOODMAN: As we said, we’re also still joined by Lori Schott, who lost her daughter Anna and is attending the trial, Lennon Torres, as well, and Laura Marquez-Garrett. Nermeen?

NERMEEN SHAIKH: So, Lori, let’s go back to you. You said about your daughter — because it’s not just concerns about what these children, young people are putting online, but also what they’re receiving in response. You said about your daughter, “I was so worried about what my child was putting out online, I didn’t realize what she was receiving.” So, could you tell us what precisely she was receiving and how you came to see it?

LORI SCHOTT: Well, with Annalee, after she passed away, we were able to gain access into her platforms. And from what I saw at first, opening that as a parent who lost a child and trying to understand this world of social media, it brought me to my knees. This child was pushed content about anxiety and depression. And it wasn’t just one or two screenshot images; it was pushed to her constantly. And it was just time and time again the same theme that pulled her down that rabbit hole. It hooked her into this world that just destroyed her mental health, that told her she wasn’t good enough, that told her she was broken. My daughter was not broken. These platforms are broken. Mark Zuckerberg is broken. And we’re here to fight for change on that, because no child should be exposed to what my daughter saw. And we tried tirelessly as parents to guide her, and she was a beautiful person, and they took that person away from us.

AMY GOODMAN: Laura Marquez-Garrett, you were just with both Lori and with Lennon. You were attending the trial. You’re here in New York. As you were getting ready for the show, I saw your tattoo — 

LAURA MARQUEZ-GARRETT: Yes.

AMY GOODMAN: — on your forearm. Can you — can you pull back your jacket — 

LAURA MARQUEZ-GARRETT: Sure.

AMY GOODMAN: — and show us the significance of this tattoo?

LAURA MARQUEZ-GARRETT: Sure. So, I have two. And these are my children’s names, and each of the rays of sun is a child that’s been lost to a social media and/or AI product. Right? And these are patterns.

AMY GOODMAN: How many?

LAURA MARQUEZ-GARRETT: Two hundred and ninety-six. That’s not all of them. That’s just the ones that we’ve either drafted and filed their complaints, or we’ve met their parents and come to love these kids.

AMY GOODMAN: And what does it mean when you file 1,200 complaints? Complaints to who? What happens?

LAURA MARQUEZ-GARRETT: There’s unlimited. So, we file them in state and federal courts. Look, we’re filing them — this is one of many cases. We have the federal, the MDL, where you have the attorneys general, you have school districts. We have one-off state court cases in Delaware, in New York. And we had Vermont, Connecticut. Essentially, with these companies — I mean, these are the best-resourced companies in the world, and so it is a fight, it is a battle. And we are trying to find those points where they need to be held accountable, where we can break through and where we can hold them accountable.

NERMEEN SHAIKH: So, Laura, as you mentioned, of course, you’re an attorney for the Social Media Victims Law Center. If you could give us an example of what it would mean to place restrictions on social media use that would protect children? We just heard earlier in our introduction about a child who died while participating in a blackout or choking challenge. So, just explain: What kinds of restrictions are we hoping for?

LAURA MARQUEZ-GARRETT: Sure. So, these are design defects, right? So, think of the Pinto. It’s not a restriction as much as fixing it and fixing a defective part, a part that is exploding. It’s not — you know, someone asked me the other day, “Well, Meta is saying they’re fixing it. They’re doing this.” And my answer was that’s kind of like sticking a fire extinguisher in a Ford Pinto and saying, “There you go. We fixed it.” It’s still a defective part. And so, ultimately, fixing it — and they know this; it’s in their documents, that are becoming public — they could remove the addictive mechanisms. They could — it’s as simple as — think of your television set at home. We have this remote control. We get to turn the volume down, up, change the channels. They’ve kept those controls on the back end. Right? They could give you the option to slow down the algorithm. They could —

AMY GOODMAN: And explain the algorithm. For people who aren’t familiar with social media, what does it mean when Lori Schott said that her daughter Annalee kept being fed with the same thing, going down a rabbit hole?

LAURA MARQUEZ-GARRETT: Sure. So, these are social media algorithms specifically, because there are many types of algorithms, and these companies have programmed them for engagement first. So, you know, we have instances where children — and we have the data — children will look for uplifting speeches, inspirational quotes, and they will get breakup and suicide. Right? You have — and we have this over and over. We literally have images we can provide where a child looked for this and got the opposite, not because they wanted it, but because TikTok determined this is what will keep this kid hooked.

Now, if a child is going through a breakup — and this is Mason Edens specifically — he looks for — I believe it was inspirational quotes. If TikTok had shown him what he sought out, what would he have done? Well, potentially he would have put it down, and he would have talked to someone. They don’t want that. Right? So they are programming for engagement above all else. That is a programming choice. It is frankly a form of profiling, right? They’re taking thousands of data points that our children — we, as consumers, we’re not consenting to that. We’re not saying yes. I mean, they can tell you the kind of car you drive, your education, all of these pieces that they can then use to profile and target. And in the case of vulnerable children, it’s deadly.

NERMEEN SHAIKH: So, you mean that — I mean, literally, in this case, a child is looking for inspiration — 

LAURA MARQUEZ-GARRETT: Yes.

NERMEEN SHAIKH: — and instead gets, in fact, quite the opposite. I mean, the way that we are as adults familiar with the way that algorithms work is that, for example, on a news site you look at the same news site, you keep getting fed the same news site. This is extraordinary that the child is receiving something that is exactly the opposite. And you’re saying that’s because it’s more addictive?

LAURA MARQUEZ-GARRETT: No, that’s the defect. So, you’re thinking of, like, search engine algorithms, right? If I go into a typical search engine, it’s programmed differently. It’s designed differently. If I search for a Chinese food restaurant, I will get: “Here are some Chinese food restaurants within a five-mile radius.” If I go onto Instagram, I may end up with a beheading video in China — right? — something that — and actually, I’ll use an LGBTQ instance, because I had a young person say to me once, you know, “When I would look up gay pride on Instagram, I would get half gay pride and half Westboro Baptist Church, you’re going to hell.” That is what these algorithms are doing. They’re not showing our children what our children are asking to see. They’re programmed to show them what they think they cannot look away from, which are car accidents, extremes, outrage, all of these things. And that is what is causing harm.

AMY GOODMAN: Facebook whistleblower Frances Haugen, as we wrap up, you see at the Trump inauguration the billionaire brotherhood that includes Mark Zuckerberg. How does that influence the lack of regulation that we’re seeing today?

FRANCES HAUGEN: Before I do that, I want to put one tiny little thing and to add a cherry on top of what Laura said. There’s a lot of really basic things that you can do even to make the current system safer. For example, they know. They’ve asked users before and said, “Does this content make you feel bad?” And if they saw people beginning to look at more and more and more content that people say, “When I see this, it makes me feel bad,” if they just gave people a choice and said, “Hey, we notice you’re looking at more and more depressing content. Do you want to keep doing this?” You can do very simple things like this.

But when we look at the oligarchs that run these tech companies, we’ve set a norm that we are supposed to “just trust them,” that they’ve given us these wonderful, quote, “free” gifts, even though the price of these gifts is ourselves, our kids, our data. We’ve been told, “You’ve been given such great gifts. We know so much. Just trust us.” And that age of “just trust us” needs to end. We need real accountability, real transparency, so that people who build better things get rewarded for bringing us social media that actually is good for us.

AMY GOODMAN: Lennon, we just have 30 seconds. You’re with the Heat Initiative, which focuses on applying strategic pressure to Big Tech companies. But I want to ask you, as a young person, your advice to young people on social media now.

LENNON TORRES: My advice is to demand better and to force Mark Zuckerberg and the other lazy, lack-of-innovative CEOs to step to the side and let true innovators show you what digital community and connection can actually look like.

AMY GOODMAN: Lennon Torres, senior manager of programs and campaigns at the Heat Initiative. Facebook whistleblower Frances Haugen. Lori Schott, whose 18-year-old daughter Annalee died by suicide in 2020. And Laura Marquez-Garrett, attorney at the Social Media Victims Law Center, based in Seattle. Laura was just named to the Time 100 list of most influential people in health and is featured in the documentary Can’t Look Away. Laura will be honored tonight by Time.

Coming up, we look at the breaking news: The brother of the king, Andrew Mountbatten-Windsor, has been arrested because of the Epstein files. And we’ll look at the number 4,400. That’s the number of times judges around the country have ruled the Trump administration is detaining immigrants unlawfully. Stay with us.

[break]

AMY GOODMAN: Billy Bragg singing “Tomorrow Is Going to Be a Better Day” in our Democracy Now! studio.

The original content of this program is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License. Please attribute legal copies of this work to democracynow.org. Some of the work(s) that this program incorporates, however, may be separately licensed. For further information or additional permissions, contact us.
