After the Center for Countering Digital Hate reported that hate speech has soared on the website formerly known as Twitter, now rebranded as “X,” Elon Musk responded by filing a lawsuit against the center over the research, calling the group “evil” and its CEO Imran Ahmed a “rat.” X accuses the watchdog group of unlawfully accessing data to “falsely claim it had statistical support showing the platform is overwhelmed with harmful content.” This comes as Musk has laid off about 80% of the workforce at X, including a large number of content moderators, and shut down its Trust and Safety Council. “When there is hate and disinformation being algorithmically amplified into billions of timelines, it’s perfectly right that people that oppose the spread, the production and distribution of hate seek to research it and seek to put that out into the public sphere,” says Ahmed. While Musk calls himself a “free speech absolutist,” silencing critics is his “go-to tactic to avoid accountability,” says Nora Benavidez, senior counsel and director of Digital Justice and Civil Rights at Free Press.
AMY GOODMAN: This is Democracy Now!, democracynow.org, The War and Peace Report. I’m Amy Goodman, with Nermeen Shaikh.
Hate speech, racism and lies soared on the website formerly known as Twitter after it was taken over by billionaire Elon Musk last year. That’s according to the Center for Countering Digital Hate. Musk has responded to the findings by filing a lawsuit against the center over the research. Musk, who recently changed the name of Twitter to the letter “X,” accuses the British [sic] watchdog group of unlawfully accessing data to, quote, “falsely claim it had statistical support showing the platform is overwhelmed with harmful content.”
Twitter gives its Twitter Blue subscribers, who pay to use the platform, “prioritized rankings in conversations and search.” The Center for Countering Digital Hate looked at 100 different paid Twitter Blue accounts and found the company failed to act on 99% of hate posts they made. This comes as Musk has laid off about 80% of the X workforce, including a large number of content moderators, and shut down its Trust and Safety Council. Meanwhile, Musk, who calls himself a “free speech absolutist,” has reinstated the accounts of numerous white supremacists, bigots and conspiracy theorists. Last week, he reinstated rapper Kanye West, now known as Ye, who was removed for a series of antisemitic rants.
For more, we’re joined by two guests. Nora Benavidez is senior counsel and director of Digital Justice and Civil Rights at Free Press. And Imran Ahmed is CEO for the Center for Countering Digital Hate, which Musk is suing.
We welcome you both to Democracy Now! Thank you so much for being with us. Imran, we’re going to begin with you, as Musk is suing you. Can you explain what you found, and then your response to the lawsuit?
IMRAN AHMED: Well, thank you for having me on. And just to clarify, we’re not a British watchdog. We’re a U.S. 501(c)(3) headquartered in Washington, D.C. We do have a British office, as well. And clearly, I’m British.
The work that we’ve done since the takeover by Elon Musk is to look at what happened when he took over: what were the changes in the scale of hate and disinformation on his platform, and also in their moderation of that hate and disinformation? All platforms have community standards. Those standards are our responsibilities as users, and also, therefore, a reciprocal right — we expect others to abide by them, and we expect the platform to enforce those rules. The question is how effectively they are being enforced.
Our first piece of work that you mentioned there was one that looked at the increase in the use of hate speech on that platform, so specifically racial slurs. And what we found is a substantial increase in the use of hate speech on that platform in the weeks and months following Mr. Musk’s takeover.
There’s other research that we’ve done to test their moderation capabilities. What we’ve done is we’ve literally gone out and reported hate on the platform, extreme hate, like, you know, the sort of stuff saying, “Go out and shoot gay people,” or the kind of really, really appalling and abominable antisemitic conspiracy theories that you were just referencing in the most recent package that you had on. And then we went back and we audited what action they took. Now, with Twitter Blue users, 99 times out of 100, they took no action on extreme hate. And why? Because, you know, one can theorize, very simply, that for these $100-a-year accounts, these $8-a-month accounts, they’re simply not willing to enforce the rules for fear of losing that revenue stream.
It just shows that under Mr. Musk, the platform has become more toxic. It’s got more disinformation, more hatred on there. And nine months after our initial piece of research, he sued us about it.
NERMEEN SHAIKH: And, Imran, I want to ask you about the piece that you wrote for MSNBC, headlined “X misses the mark in threatening to sue my group for documenting online hate speech.” In the article, you speak specifically about the case of the artist formerly known as Kanye West, who now goes by Ye, I guess, Y-E. He had been taken off the platform but has since, very recently, returned. Explain why he was deplatformed and then why he has returned.
IMRAN AHMED: Well, he was deplatformed for extreme antisemitic content that he posted. And, of course, you know, there’s been a lot of coverage about his particular opinions on Jewish people and his praise for Hitler, etc.
But why he was let back on is part of a wave of the replatforming of, giving back accounts to people who had been kicked off by either the old administration or the current administration — which, of course, does enforce its rules sometimes — because, frankly, controversy drives attention. I mean, Musk has said many times that Twitter will survive if people are paying attention to what’s happening on there. And the best way to do that is stuff that really raises our ire, in his view, so hate, disinformation, stuff that makes us go, “That’s unacceptable,” get into arguments with people, go back and check whether or not people agree with us or someone else. Really, he’s looking for eyeballs. And he’s willing to use extreme controversy, even — the algorithms, of course, funnel the fringes, funnel the most controversial content, the most engaged-with content, into more timelines. So he’s turning it into a sort of a nonstop car crash, with the hope of getting eyeballs so he can place ads on those, on that content. And that is really the business model. It’s incredibly cynical.
The problem for him is, advertisers don’t want their content next to that. And he’s lost tens of millions of dollars as a result. Now, that’s what he’s put in his lawsuit against us. And he’s blaming us directly for having caused the loss of those hundreds of millions of dollars — those tens of millions of dollars, because we documented the hate on the platform. You know, my contention is, all we do is hold up a mirror to Mr. Musk and say, “Do you like the reflection you see in there?” And rather than do what anyone responsible would do, which is say, “There’s a problem. I need to fix that. This is mine,” he said, “I’m going to sue the mirror.”
NERMEEN SHAIKH: Well, Imran, I mean, you’ve said, of course — and he has — Musk, Twitter have lost tens of millions of dollars since he took charge. But despite that, or maybe because of it, Musk says that he wants to expand the platform, now called X, into an everyday app akin to China’s WeChat, which includes not just posting, but also messaging, payments, videos and other forms of media and exchange. If you could comment on that?
IMRAN AHMED: You know, if Mr. Musk has plans to expand his business, I wish him all luck with it. I have no problem with him making money. I have no problem with people doing business. What I have a problem with is that when there is hate and disinformation being algorithmically amplified into billions of timelines, it’s perfectly right that people that oppose the spread, the production and distribution of hate, seek to research it and seek to put that out into the public sphere. That is, after all, our First Amendment right to do so. And Mr. Musk’s attempt to strategically litigate to stop us from publicly participating — so, a SLAPP suit — you know, that is really a — it’s incredibly cynical, coming from someone who purports to be very pro-speech, and to do so by going after civil society.
Look, you’ve got Nora on, as well, who’s, you know, a friend. And our organizations and many others do research on these platforms, do advocate for better experiences for users on those platforms with less hate and disinformation, and, of course, because of the huge human rights and civil rights implications of the spread of hate and disinformation. I don’t want to make the obvious connection to what we saw in Pittsburgh or what we saw in Colorado Springs, but, you know, there is a very strong public interest reason to expose what happens on these platforms. And it is vital, therefore, that CCDH resist this lawsuit. He’s trying to crush a small organization. I mean, he is the world’s richest man, and we are an organization that’s relatively small. We’re a 501(c)(3). We rely entirely on public donations. And we’re going to have to make sure that we don’t allow him to — that we don’t allow this litigation to succeed, because if it does, it will have a broader chilling effect across anyone who wants to comment on the businesses Mr. Musk runs.
AMY GOODMAN: So, Elon Musk calls you a “rat,” Imran Ahmed, and calls your organization “evil.” You have a powerful lawyer representing you in Robbie Kaplan, who represents E. Jean Carroll. Are you getting increased death threats? Is your organization being threatened?
IMRAN AHMED: I mean, if death threats came into the organization, I wouldn’t get to see any against me. But I know that, you know, we are perfectly aware that I have a responsibility to my 20 staff, who I care about, who are intelligent, brilliant advocates for our cause. And we do everything that we need to do to make sure that our staff are safe, psychologically and physically.
What I will say to you is that when a man like that calls you a rat, you know, he knows the language he uses. I mean, he may not be a stable genius, but he is a clever man, and he knows what he’s doing when he calls someone evil and a rat. And I’ll tell you something. You know, we’ve pushed back against that.
The amazing thing for me — and I’m not someone that is particularly needy of praise or affirmation, but so many people, you know, from the actor Mark Ruffalo through to all of our colleagues in civil society through to politicians, have stood up and said, absolutely, this is outrageous, because this is a group that’s set up — I set it up because my colleague, Jo Cox, MP, a 35-year-old mother of two, was assassinated by a far-right terrorist who believed in the “great replacement” theory, in Britain seven years ago. And I knew that social media, and the way that social media platforms were allowing and accelerating the distribution of conspiracy theories that led to my colleague’s death, you know, had a responsibility, and I wanted to make sure that they recognize that responsibility.
I think the irony of this entire situation is that in an attempt to shut down transparency on his platform and accountability, what Mr. Musk has done is create an unimpeachable case and a cacophony of support for federal transparency and accountability legislation in the United States. While the European Union has the Digital Services Act and the United Kingdom has the Online Safety Bill, there is absolutely nothing in the U.S. to ensure that researchers have access to information on these platforms that we know have massive impact on individual psychology. We do a lot of work on the impact on kids, as well — it’s bad, you know? — and on our democracy. It’s unacceptable that they can operate in such an opaque and unaccountable way. And, of course, I think that by winning this case and by defending this case vigorously, we will create that broader case for federal transparency and accountability legislation.
AMY GOODMAN: I want to bring Nora Benavidez into this conversation and talk about Twitter, now known as X, internationally. Can you talk about its record in censoring ethnic minorities in India, dissidents in Turkey, and working with China to censor critics? Elon Musk claims that Twitter’s predecessors did the same thing. Is that true?
NORA BENAVIDEZ: Sure. Well, Amy, it’s wonderful to be here with you, and it’s wonderful to be here with Imran, as well, hearing this story and working closely with the center. It’s been an incredible journey watching. And I think the only response when we get attacked by bullies is to stand up and to come together.
So, I look at this long track record that Elon Musk has when it comes to silencing critics. That’s his go-to tactic to evade accountability. He has done that over and over again. He did it even before he took over Twitter. Back when he was running Tesla, he had a long track record of silencing people. He fired employees who spoke about malfunctions with their cars. He colluded with and partnered with the Chinese government to make sure that any malfunctions at all on listservs would be minimized. And that kind of bad behavior, that silencing of critics, has transferred to Twitter, or X. And now we’ve really seen eight or so months of just a long track record of Elon Musk going after anyone who tries to be critical of something he doesn’t like.
And so, as you say, he has a global track record of this. There was a BBC documentary which explored the targeting by the Indian government of the Muslim minority population. And what he did was systematically work with the Modi government to make sure that posts about that BBC documentary — posts that supported it, that applauded the kind of attention that needed to come to that issue in India — were suppressed, minimized and downgraded on the platform.
He has other examples, of course, where he sort of cherry-picks content. When he doesn’t like what someone is saying, he uses the biggest platform in the world to make sure that people don’t see it. He removes journalists when he doesn’t like what they are saying or the way that they are covering him. He makes sure that competition with Twitter is also minimized. This is sort of a — as Imran said, a cacophony. And the chaos that Elon Musk has created means that it’s hard for us to track it, unless we have the researchers in place, the kinds of tools and the transparency that shine a light on what’s really happening in a company, that Elon Musk has done everything to hide from us, to evade accountability.
AMY GOODMAN: Nora Benavidez, we want to thank you so much for being with us, senior counsel and director of Digital Justice and Civil Rights at Free Press, and Imran Ahmed, CEO of the Center for Countering Digital Hate.
Next up, we’re going to Colorado. The ACLU there has sued the FBI and the Colorado Springs Police Department for illegally spying on a local activist and infiltrating a community organizing hub. We’ll speak to that activist and find out what happened. Stay with us.
AMY GOODMAN: That’s our first guest on Democracy Now! today, Cantor Michael Zoosman, reciting the traditional Jewish memorial prayer for the 11 Tree of Life victims in front of the U.S. Supreme Court.