
“Can’t Look Away”: New Documentary on Online Safety Examines the Dark Side of Social Media

Web Exclusive | April 4, 2025

Can’t Look Away: The Case Against Social Media is a new documentary that examines the tragic consequences that the algorithms of Big Tech companies can have on children and teens. This is our extended conversation with director Matthew O’Neill and a lawyer featured in the film, Laura Marquez-Garrett, from the Social Media Victims Law Center in Seattle.

Transcript
This is a rush transcript. Copy may not be in its final form.

AMY GOODMAN: This is Democracy Now!, democracynow.org, The War and Peace Report. I’m Amy Goodman, as we continue with Part 2 of our conversation about a new documentary being released by Jolt today called Can’t Look Away. The co-director is Matthew O’Neill. Laura Marquez-Garrett is one of those featured in the film. She’s a lawyer with a somewhat new group that’s called Social Media Victims Law Center. It represents 4,000 children at this point and growing.

Matt, we started Part 1 by my asking you why you did this film. It is an astounding story. When you think back to Inauguration Day, it wasn’t the Cabinet nominees that were front and center of Trump’s inauguration; it was the Big Tech CEOs, I mean, people like Mark Zuckerberg, Meta; Jeff Bezos, Amazon; Sundar Pichai, Google — all of these guys were front and center. Why? What do they have to gain? And how does that relate to the suffering of children?

MATTHEW O’NEILL: Well, if you think about it, just a year earlier, a year before inauguration, they were all sitting in front of a bipartisan committee that was working to hold them to account for what’s happening on social media. And a ban — or, sorry, the Kids Online Safety Act, which would help protect children, passed in the Senate with a bipartisan coalition last summer, 91 to 3. There were only three votes against it. So there’s bipartisan willingness to pass reform and change the way social media works.

So, when you see all of those social media titans and Big Tech titans standing there on Inauguration Day, they have an incredible amount of interest about what happens in Washington right now, what regulation does or does not pass, and, very specifically — Laura is best situated to talk about this — but how Section 230 of the Federal Communications Act of 1996 continues to shape policy today and protect these companies.

AMY GOODMAN: So, what is, Laura, Section 230?

LAURA MARQUEZ-GARRETT: Yeah, so, Section 230 is a law that was passed in 1996, I believe, called the Communications Decency Act, right? So, the title, Communications Decency Act, or the Good Samaritan law, it essentially just says that these platforms, or internet service providers, rather, cannot be held responsible for what third parties are doing. And, of course, that was 1996, so the context there is you have a framework that is passive, a passive intermediary.

These social media companies create a new framework, new products. They don’t tell us what they’re doing. There’s no regulation, transparency, oversight — nothing. And ultimately, the products that our children and we are using today are not the ones that were in existence or addressed in Section 230. But they are still hiding behind that shield to quite literally say in court, “We have no duty to design a reasonably safe product for children under the laws of the 50 states.”

AMY GOODMAN: We ended the conversation in Part 1 by my asking you: How much do people like Mark Zuckerberg, Jeff Bezos, Sundar Pichai of Google understand? And you said, “Everything.” But can you elaborate further? And then I want to talk about particular cases and how these mega-companies deal with them, the deaths of children.

LAURA MARQUEZ-GARRETT: So, I said they know. Whether or not they understand, I mean, it’s a bit more of a subjective term, and I don’t know, because I don’t know how they can sit there and look at these spreadsheets and look at these recommendations from employees saying, “Hey, SSI, suicide and self-injury, here’s 20 things we could do to make our product safer for children,” and have these CEOs say, “No.”

MATTHEW O’NEILL: And you see it, right?

LAURA MARQUEZ-GARRETT: Yeah.

MATTHEW O’NEILL: The whistleblowers that are featured in the film are important voices. People have come out of these companies saying, “They know, we know we’re doing the wrong thing.” Memos from Arturo Béjar to Mark Zuckerberg and other top brass at Meta have come out in public, showing that this evidence exists.

AMY GOODMAN: Tell us about Arturo Béjar. I mean, very powerful testimony, because he was right in the inner circle. And what happened when he alerted the heads of the company? Because they responded to him every time before right away. And explain what he alerted them to.

MATTHEW O’NEILL: He lays this out in the film, because he made a memo that specifically pointed out exactly how young people were being harmed on the platform, and the overwhelming numbers, the percentages, what was happening each week, trying to call for change. And according to him, there was no response, not anything that happened or changed or even a reaction to his email. He’s been incredibly outspoken. He appears in the film in an insightful way, well worth listening to. But he’s also out there testifying in front of Congress. And it’s people like Arturo, people like Frances Haugen, who was another Facebook whistleblower, who are bringing light to these internal documents that show that it’s like cigarette companies. They’re trying to addict kids, and they know it.

LAURA MARQUEZ-GARRETT: And if I may, there’s a more recent whistleblower, Sarah Wynn-Williams, who wrote the book Careless People.

MATTHEW O’NEILL: Yes.

LAURA MARQUEZ-GARRETT: Chapter 44, it should be mandatory reading for every parent, teacher and doctor. Chapter 44 will tell you — it lays it right out there, right? You literally have Meta developing a technology where they can determine when teen girls, 13 to 17, are at their lowest, their most insecure, their most anxious, their most depressed. What do they do? They go to advertisers and say, “Hey, I got a great new product for you. When that kid is at their lowest, I can hit them with your diet ad, I can hit them with your makeup ad.” And she cites two memos, two meetings, when she finds this out —  right? — when she’s saying, “Hey, this isn’t OK.” And that’s what parents need to understand.

AMY GOODMAN: Wait a second. You talked about Sarah Wynn-Williams — 

LAURA MARQUEZ-GARRETT: Yes.

AMY GOODMAN: — and Careless People.

LAURA MARQUEZ-GARRETT: Yes.

AMY GOODMAN: Is it true she’s not allowed to talk about this book?

LAURA MARQUEZ-GARRETT: So, my understanding is that Meta went in and got a temporary injunction to say that she cannot publicize it, right? But — 

AMY GOODMAN: They can’t stop the publication of the book.

LAURA MARQUEZ-GARRETT: Already been published.

AMY GOODMAN: But the author can’t talk about it. Only everyone else can.

LAURA MARQUEZ-GARRETT: Yes, but I will say — yes, of course, because they don’t want you to see what’s in the book. Chapter 44, don’t forget that chapter. Seven pages, if you’re only going to read it, that’s it. But 100% I will say that the district, or the district court, the MDL, recently determined that she can be deposed for the litigation. So, the attorneys, we will be able to ask her questions, because it’s —

AMY GOODMAN: For your litigation?

LAURA MARQUEZ-GARRETT: For the litigation that’s going on in the country, right? It’s the MDL, the state and federal cases.

AMY GOODMAN: What do you mean, ”MDL”?

LAURA MARQUEZ-GARRETT: Sorry, multidistrict litigation. It’s this mass tort system. And that’s the litigation, right? The attorneys general have lawsuits, individuals have lawsuits, and school districts have lawsuits.

AMY GOODMAN: You mentioned Frances Haugen. I want to go to her for a second. In 2021, folks may remember that the Facebook whistleblower, Frances Haugen, sort of blew the roof off of a lot of this when she testified before Congress. She said that the company repeatedly prioritized profits over safety.

FRANCES HAUGEN: During my time at Facebook, first working as the lead product manager for civic misinformation and later on counterespionage, I saw Facebook repeatedly encounter conflicts between its own profits and our safety. Facebook consistently resolved these conflicts in favor of its own profits. The result has been more division, more harm, more lies, more threats and more combat. In some cases, this dangerous online talk has led to actual violence that harms and even kills people. … It is about Facebook choosing to grow at all costs, becoming an almost trillion-dollar company by buying its profits with our safety.

AMY GOODMAN: That’s Frances Haugen. She was testifying before Congress, a Facebook whistleblower. If you, Laura Marquez-Garrett, can talk about the significance of what she said? She really shattered a lot there.

LAURA MARQUEZ-GARRETT: Yeah. So, what she said and what she did, look, it took an incredible amount of courage to do what she did. And I don’t know that I would be here or many others would be here, if she hadn’t, because we didn’t know what was happening. And so, what she does is she takes screenshots, of not even everything, right? Just a teeny amount of stuff, that is just so deeply troubling, knowing that doing so is going — you know, she’ll never work in the tech industry again. It’s a very — you don’t whistleblow. It’s very frowned upon culturally.

AMY GOODMAN: And the screenshots — 

LAURA MARQUEZ-GARRETT: Yeah.

AMY GOODMAN: — she took were of?

LAURA MARQUEZ-GARRETT: Thousands of documents of all sorts of things, including harms to children. One of them, in fact — and I’ll just give an almost direct quote. It’s about the user recommendation algorithm. It is a direct quote from a Facebook document that says, “In the past, the PYMK” — the algorithm — “has contributed up to 75% of inappropriate adult-minor contact.” That’s a direct quote from a document.

AMY GOODMAN: I want to go to Toney and Brandy Roberts’ 14-year-old daughter, Englyn, who died by suicide after struggling with mental health challenges. Her parents later searched her phone and found an Instagram video that depicted the very form of self-strangulation that she performed on herself. This is a clip from a 60 Minutes report.

SHARYN ALFONSI: As the pandemic played out, Englyn wrote about struggles with self-worth, relationships and mental health. One August night in 2020, just a few hours after Toney and Brandy kissed their 14-year-old smiling daughter goodnight, Brandy received a text from a parent of one of Englyn’s friends who was worried about Englyn and suggested they check on her.

TONEY ROBERTS: We went upstairs, and we checked, and her door was locked. And that was kind of odd. I took the key from the top, and we opened the door. And no Englyn. And when I turned around is when I found her. When you find your child hanging and you are in that moment in disbelief, it’s just no way, not our baby, not our child. And then, ultimately, I fault myself.

SHARYN ALFONSI: Why do you fault yourself?

TONEY ROBERTS: Because I’m Dad, supposed to know.

SHARYN ALFONSI: Prior to that night, you had no idea that she was depressed.

TONEY ROBERTS: Not — not even close.

SHARYN ALFONSI: He found an Instagram post sent to Englyn from a friend.

TONEY ROBERTS: There was a video. And that video was a lady on Instagram pretending to hang herself. And that’s ultimately what our child did. Because you ask yourself: How did she come up with this idea? And then, when I did the research, there it was. She saw it on Instagram. It was on her phone.

BRANDY ROBERTS: If that video wasn’t sent to her, because she copied it, she wouldn’t have —

TONEY ROBERTS: She wouldn’t have had an idea.

BRANDY ROBERTS: — had a way of knowing how to do that certain way of hanging yourself.

SHARYN ALFONSI: Nearly a year and a half after Englyn’s death, that hanging video was still circulating on Instagram with at least 1,500 views.

AMY GOODMAN: Wow. That is a powerful interview 60 Minutes did with Toney and Brandy Roberts about the suicide of their 14-year-old daughter Englyn. In the documentary Can’t Look Away, you go deeply into her case. You interview her parents. You follow them as they try to seek legal redress. And I want to ask you, Laura Marquez-Garrett, what redress do they have?

LAURA MARQUEZ-GARRETT: That has yet to be determined. That’s what we’re fighting for, right?

And one thing I want to add briefly on that is what we get on the back end. So, I’ve seen this young woman’s data from Instagram. I’ve seen — we get medical records. We get school records. It’s easy for parents to say, “Well, it won’t be me. It won’t be my kid. It’s not” — this is a child who had no signs and symptoms, had no mental health issues, had no risk factors, had — you know, and when you see the data, that’s where the proof is, right? That’s where the — when you see the addiction, when you see the usage, the usage at night, right? Even on this, we track — we can track the usage somewhat, just a sliver, and you will see that these kids are using when their parents don’t know, right? They’re using at school. They’re using at night. They cannot look away. And that sleep deprivation, combined with — right? — the content, which, yes, it was something a friend had sent, but this is content that was being sent to these young people, generally speaking, from Instagram.

And my last point is, it’s not only that that video is up. Toney finally got one of the videos to be taken down, and then he searched and found 10 others. It’s not just one video, 10 others with the similar names, right? So it’s not like they’re hiding what they’re doing. Meta just doesn’t care. They have the metadata on the back end. They have the videos. They have the capability to be like, “Oh, we know this is harming children. It’s been reported. We see it right here. But we don’t take it down.”

AMY GOODMAN: So, how do you restrict it?

LAURA MARQUEZ-GARRETT: Restrict what?

AMY GOODMAN: You’re saying, “Just take it down.”

LAURA MARQUEZ-GARRETT: Well, you don’t. I mean, that’s the problem, is you don’t restrict it, right?

AMY GOODMAN: You’ve got two things. You’ve got the facts. You’ve got the videos, for example. You can take them down. But you’ve also got the algorithm.

LAURA MARQUEZ-GARRETT: Well, and you have the — well, and you have the users, though, right? So, you have the users; you can take those accounts down. You have the algorithm, which is their programming for engagement over safety. And so, that is a conscious choice. When you go on to a traditional search engine algorithm and you look for something, that’s what you get. When you go onto social media and you look for something, Mason Edens being a great example, you know, motivational speeches, inspirational quotes, “nobody loves you and ever will,” right? It’s they are not showing us what we want to see, but what we can’t look away from. And think about that from a consumer choice perspective. We need to get that control back.

AMY GOODMAN: How does this compare? How does Big Tech and the addiction of children, Matt O’Neill, director of — co-director of Can’t Look Away, compare to Big Tobacco?

MATTHEW O’NEILL: I think my co-director, Perri Peltz, has called it multiple times a public health crisis. This is something that our children are facing, the same way when I was a kid in the early ’90s and I was attracted to Joe Camel. Our kids cannot escape.

AMY GOODMAN: And now, that’s not minor, because Joe Camel was a cartoon character meant to addict you, meant to track you.

MATTHEW O’NEILL: Exactly, designed to entrap children. And I think that’s a big part of what we need to address here. And when you hear the pain in Toney’s voice there and Brandy’s voice about the loss of their daughter Englyn, I think that’s a really important thing to emphasize, because what you see in Can’t Look Away, the film, is how these families are continuing forward. They’re transforming an unfathomable loss, and they’re putting it towards activism. You hear “lawsuit,” and you think of remuneration. What they’re asking for, there’s a financial element, but they’re asking for change. And they’re demanding change from these companies, first and foremost, so that no other kids have to die.

LAURA MARQUEZ-GARRETT: No amount of money will ever bring their children back. And every time they tell these stories, it breaks their heart again. And they do it to protect my children, to protect everyone’s children.

AMY GOODMAN: Last year, the former U.S. Surgeon General Vivek Murthy called for Congress to require warning labels on social media platforms, warning that use can harm teenagers’ mental health. Previously, he had issued an advisory warning of social media’s profound risk of harm for young people, calling for immediate action from lawmakers, tech companies and parents to keep kids safe. Murthy said young people’s brains are especially vulnerable to the detrimental effects of peer pressure and constant comparison. That constant comparison, where does that fit into social media platforms?

MATTHEW O’NEILL: I can’t imagine what it’s like to be a child today and to be able to see where all of your peers are on Snapchat, to be constantly comparing yourself to others on Instagram. It’s an entirely different experience of growing up. But sometimes that idea of comparison or social interaction dominates the conversation, when we need to focus on the algorithm and what’s being pushed to children, because, to a certain degree, if my 13-year-old, or almost 13-year-old, wants to interact with his friends, we’re not anti-technology. We’re not even anti-social media. This is a way that people can connect and make change and do good things in the world. What we’re talking about is keeping children safe and making sure they aren’t fed harmful content.

LAURA MARQUEZ-GARRETT: And also, I want to say, in addition to the algorithm, there are specific features. Look, Meta has this like button. And a couple years back, they ran this test called Project Daisy, where they determined that if they hid the like button, if they gave users that setting and it was hidden, it would help improve teen girls’ mental health. And ultimately, then you see them saying, “Yeah, but advertisers don’t like it, because you can’t see the engagement, and influencers don’t like it.” So you know what they do? They don’t launch. They even have all of the press set up to launch. They don’t launch, because they don’t care. It’s money first, profits over people.

AMY GOODMAN: I want to go to what Australia did.

LAURA MARQUEZ-GARRETT: Yeah.

AMY GOODMAN: Late last year, Australia passed legislation banning social media use for children under 16 years old. This isn’t bell to bell, what states are doing increasingly in this country, where kids can’t have their phones at school. They banned social media use for kids under 16. Prime Minister Anthony Albanese touted the first-of-its-kind law as he spoke from Canberra.

PRIME MINISTER ANTHONY ALBANESE: World-leading action to make sure social media companies meet their social responsibility. Social media is doing harm to our children. … Platforms now have a social responsibility to ensure the safety of our kids is a priority for them.

AMY GOODMAN: Platforms like TikTok, Facebook, Snapchat, Reddit, X and Instagram could be fined up to about $33 million in U.S. dollars if they systematically fail to prevent users younger than 16 from having accounts. It’s unclear if the ban will affect ties between the U.S. and Australia. Elon Musk, of course, richest man in the world, close Trump ally, blasted the ban as a backdoor way for the Australian government to control internet access. In response, the Australian prime minister said of Musk, quote, “He has an agenda.” What would you say, Laura?

LAURA MARQUEZ-GARRETT: This is not about the internet. It’s about health-harming products which we refer to as social media products. And what I would say for parents, for anyone watching, you know, when my children turn 18, I’m not going to say, “Happy birthday! Here’s a pack of cigarettes,” right? I may not be able to stop them, but I’m not going to encourage them. I’m not going to allow it, because what I’ve seen in the last three years, on the back end, the data, the records, the children, hundreds of children I’ve spoken to, and I’ve heard their stories, I am not going to put my children in harm’s way.

AMY GOODMAN: Tell us the story of Jordan DeMay.

LAURA MARQUEZ-GARRETT: Jordan DeMay, right? Star football player, again, no risk factors, no emotional, mental health issues reported. And one night, one night, he’s preyed upon by a predator on Instagram. “Sextortion” is how it’s referred to. This person convinces him, poses as a young woman showing interest, gets an explicit photo, and then, the moment that photo is exchanged, says, “OK, now I’m going to show this to everyone you know.”

And let’s talk product features. Instagram does something that not every platform does. It makes your follower and following list visible. For years, people have begged them, “Please, don’t make it visible.” That’s an engagement tool. It’s not necessary to the operation of the platform or even communication. In these sextortion cases, the difference between a platform where a predator can get an explicit photo and a platform where a predator can get an explicit photo and then say, “I’ve screenshot every member of your family” —

AMY GOODMAN: So, he thinks he’s a girl?

LAURA MARQUEZ-GARRETT: That’s right, the predators.

AMY GOODMAN: Rather, Jordan thinks the person is a girl — 

LAURA MARQUEZ-GARRETT: Yeah.

AMY GOODMAN: — and she’s begging him —

LAURA MARQUEZ-GARRETT: Yeah.

AMY GOODMAN: — for naked pictures.

LAURA MARQUEZ-GARRETT: Yeah.

AMY GOODMAN: They’ll trade naked pictures.

LAURA MARQUEZ-GARRETT: And there’s — they send — and she’ll, the predators will, send videos. They have all this stuff because of social media. They will prey on these children, typically high school athletes, which sometimes, by the way, they find through high school Instagram pages. You can collect that data very quickly. And these children are sleep-deprived. And in that moment, when they say, “I’ve got every friend, family member, co-worker, classmate, and I’m about to send it to them” — I’ve even seen cases —

AMY GOODMAN: This naked picture you sent of yourself.

LAURA MARQUEZ-GARRETT: Yes. That’s the moment these children feel like they have no way out, right? That’s a feature. That’s the follower and following list that Instagram makes publicly viewable.

AMY GOODMAN: And Jordan actually writes — 

LAURA MARQUEZ-GARRETT: Yes.

AMY GOODMAN: — “I’m killing myself.”

LAURA MARQUEZ-GARRETT: Yes.

AMY GOODMAN: And the response was?

LAURA MARQUEZ-GARRETT: “Do it, or I’ll make you do it. I hope you do, or I’ll make you do it.” We have one case where a young man, it was about 27 minutes from start to finish. He was studying for his DMV test. And then this, what appears to be a beautiful, young woman approaches, also an athlete, high school athlete, by the way, star athlete, right? Star athletes are used to being approached by people, so it doesn’t seem unusual. And again, it’s somebody who then, through Instagram, has the follower and following list and says, “All right, now I’m going to send it to everyone you know. Give me $1,000. You have 20 minutes, or it goes to everyone you know.” It’s not — this is their frontal lobes not being developed. It is an impulse decision. It’s not well thought-out. It’s not well reasoned. It is a child who is desperate and afraid and about to be globally humiliated. And if Instagram would simply hide those features for all users, hide the follower and following lists, those numbers would go down.

AMY GOODMAN: What would make them?

LAURA MARQUEZ-GARRETT: That’s a great question. Not dead children, apparently.

AMY GOODMAN: I want to go to Facebook’s founding president, Sean Parker. This is at an Axios event in Philadelphia in 2017. Sean Parker said the site was deliberately designed to hook users.

SEAN PARKER: That thought process was all about: How do we consume as much of your time and conscious attention as possible? And that means that we need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever. And that’s going to get you to contribute more content. And that’s going to get you, you know, more likes and comments. And it’s a valid — it’s a social validation feedback loop, that it’s like a — I mean, it’s exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology. … It literally changes your relationship with society, with each other, with — you know, it probably interferes with productivity in weird ways. God only knows what it’s doing to our children’s brains.

AMY GOODMAN: So, Matt O’Neill, this is not a whistleblower. This is Facebook’s founding president, Sean Parker, explaining, in an event with Axios back in 2017, how the whole system works. If you can talk more about this? And also — and maybe this is a question for Laura — is this most effectively dealt with at the state level or the federal level?

MATTHEW O’NEILL: The phrase that sticks with me from what Sean was saying there is “exploiting a vulnerability.” And we’re talking about big companies exploiting vulnerabilities in children. And that’s what people need to address. And our children are not safe unless something changes. We think we can put them to bed at night and tuck them in, and they’ll be safe, but instead, they have access to drug dealers and sextortionists and all sorts of horrible things that are being fed to them by social media. When we think about change, and I think that Laura can talk about legislation more broadly, but we should also think about our own activities and how we encourage this use and how we use these products ourselves. Are we going to participate as consumers in a system that is exploiting vulnerabilities in children?

AMY GOODMAN: Laura Marquez-Garrett, how much money do these large companies, Google, Meta, all of the different ones, Facebook, etc., pour into fighting any regulation?

LAURA MARQUEZ-GARRETT: Millions. They’re —

AMY GOODMAN: And how much do they profit? For example —

LAURA MARQUEZ-GARRETT: Yeah.

AMY GOODMAN: — during the Trump years, I mean, the watchword is “deregulation.”

LAURA MARQUEZ-GARRETT: There’s a number out there. I don’t know what it is, but I think it’s tens of millions potentially in the last year alone, you know, and it’s — right, and they’re making billions. Every month they delay, they’re making millions. Every year, it’s billions.

I want to add on the dopamine hit, something that I had not realized and I looked it up a year ago. That’s actually the mechanism that makes tobacco addictive. It’s these dopamine hits to the brain. So, now we know what addiction looks like in a 30-year-old. Stick that in a 12-year-old, and it’s deadly. And these companies are doing this by design. In fact, they actually have hired many, many neurologists and psychologists, not for trust and safety teams per se, but product development decks, product development decks with brain scans, right? So, professionals, medical field professionals, that, frankly, at some point we should start talking to them, as well, and be like, “Why? Why are you helping with product development?” These are products that are designed to addict your child. It’s what they do.

AMY GOODMAN: So, as we begin to wrap up, talk about Social Media Victims Law Center in Seattle, exactly what it is.

LAURA MARQUEZ-GARRETT: Yeah.

AMY GOODMAN: In your latest case, I mean, the center is representing some 4,000 kids.

LAURA MARQUEZ-GARRETT: Yeah. So, we are a small firm founded in late 2021 by Matthew Bergman, a longtime plaintiff’s attorney who has, you know, litigated in the asbestos field and sort of held those companies to account. And when Frances Haugen came forward and he saw those documents, he just said to himself, “This is not third-party speech. This is not our children having access. This is companies exploiting our children and pushing this on them, without them making that conscious choice.” And so, he filed the firm’s first complaint in January 2022. I joined in February 2022.

And, you know, you asked about latest case. And honestly, name a harm, name a platform. You know, we have them all. It is a constant flood. It is a constant flood. And the cases we file, they go through a rigorous intake process, right? These are kids that have had medical treatment or have died. These are individuals where we look at the records, we look at the facts, and we can say, “Hey, this is related to social media. This is not a child who” — for example, fentanyl poisoning. We don’t pursue cases against Snapchat where these children were using drugs before Snapchat use started, or where they went to Snapchat to find drugs, right? That’s not what this is. These are kids who went to Snapchat for the funny filters, and what Snapchat gave them, for its own purposes — 

AMY GOODMAN: And explain funny filters to people who don’t use Snapchat.

LAURA MARQUEZ-GARRETT: Yeah, well, and I actually love that you ask that question, because parents need to know. Don’t play with the funny filters with your kids. Snapchat is known for these. You take this photo, and you can have the silly puppy face and the — all the funny filters.

AMY GOODMAN: All these cartoon faces.

LAURA MARQUEZ-GARRETT: Yeah, the poop emoji, right? And kids love it, and it looks like fun. And if that’s all it did, not a big problem. But so, the problem is — before I did this work, I’ve played with the funny filters with my kids — it normalizes it, and it turns Snapchat into something that feels fun and safe and trusted. And when they turn 12 or 13 or 14 or 16 and they get that Snapchat, it is not going to be about the funny filters. It is going to target them. It is going to — they’re going to start getting direct messages or snaps from predators, particularly if they’re young children, not adults. As an adult, open an account. You probably won’t get a lot of that. As a child, you will. It is programming. It is design.

AMY GOODMAN: So, your organization is the Social Media Victims Law Center. And, Matt, your film is Can’t Look Away, that features Laura and also many of the child victims. Talk about how you’re distributing this film, Can’t Look Away, because I can’t imagine many streaming platforms, where you’re raising so many questions about regulating the streaming platforms, not to mention the social media products, want this film.

MATTHEW O’NEILL: So, I think this film has been, from the beginning, supported by Bloomberg News and based in the journalism of Bloomberg News, which is incredible. And we’re launching today on —

AMY GOODMAN: And the reporter on whose work this is based, Olivia Carville —

MATTHEW O’NEILL: Olivia Carville.

AMY GOODMAN: — is getting a George Polk Award.

MATTHEW O’NEILL: Yes, Olivia. I think right now she may be at the George Polk breakfast. But Olivia’s reporting and the rigor of the Bloomberg News newsroom helped set the stage for what we did journalism-wise in this. But the independent distribution we’re pursuing right now is with a group called Jolt.Film, because independent cinema, these films that are taking on entrenched powers, may not be popular with the entrenched powers themselves. So, this is a way, in Jolt.Film, that you can get this film. You can share it with friends. You can spread the word. There’s an education guide, and there’s also a resource guide, because some of the difficult topics that are in this film are things that need to be addressed carefully. So there are resources there for you and your children. And we’re distributing independently, because there needs to be a place for films like this to reach audiences.

AMY GOODMAN: I want to thank you both for being with us. Can’t Look Away is the new documentary. It’s co-directed by Matthew O’Neill and Perri Peltz. Laura Marquez-Garrett is a lawyer with the Social Media Victims Law Center. She is featured in this film. The film, again, streaming on Jolt and opening at New York City’s DCTV Firehouse Cinema.

To see Part 1 of our discussion, go to democracynow.org. I’m Amy Goodman. Thanks so much for joining us.

The original content of this program is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License. Please attribute legal copies of this work to democracynow.org. Some of the work(s) that this program incorporates, however, may be separately licensed. For further information or additional permissions, contact us.
