Alarm is growing over how the world’s richest person, Elon Musk, is changing Twitter after he spent $44 billion to buy the influential social media platform. Musk fired nearly half of Twitter’s workforce in a mass layoff Friday that gutted teams dedicated to human rights, artificial intelligence ethics and combating election misinformation, just days before Tuesday’s midterm election. This comes after he met with over half a dozen civil rights groups amid concerns he will let misinformation and hate speech go unchecked. We speak with leaders from two of those groups: Nora Benavidez of Free Press and Free Press Action Fund, and Rashad Robinson of Color of Change. “Self-regulated companies are unregulated companies,” says Robinson, who along with Benavidez says Musk has exacerbated already toxic conditions at Twitter and failed to see the “real and porous relationship between the online world and this offline real world.” Both groups are urging advertisers to boycott Twitter unless Musk takes dramatic actions to safeguard rights on the platform.
AMY GOODMAN: This is Democracy Now!, democracynow.org. I’m Amy Goodman.
Alarm is growing over how the world’s richest person, Elon Musk, is changing Twitter, after he spent $44 billion to buy the influential social media platform. On Friday, Musk fired nearly half of Twitter’s workforce in a mass layoff that gutted teams dedicated to combating election misinformation, just days before Tuesday’s midterm election. In fact, he fired something like 3,700 workers. Those let go included Twitter’s [civic] integrity specialist, Kevin Sullivan, who led editorial planning for the 2022 midterms and tweeted, quote, “He couldn’t have waited till Wednesday? #Election2022.”
Hundreds of fired Twitter employees on special visas, such as H-1Bs, could be deported. Others, who say Twitter failed to give them adequate notice, have filed a class-action lawsuit.
The U.N. high commissioner for human rights issued an open letter to Musk Saturday, urging him to, quote, “ensure human rights are central to the management of Twitter under your leadership,” unquote.
Meanwhile, after announcing Saturday it would start charging $8 a month for users to have a verification check mark on their profiles, Twitter said it would move the launch to November 9th, after the election. The move came after concerns the new subscription model for verified accounts would allow users to create Twitter handles impersonating political figures or news sources. In fact, some people actually impersonated Musk over the weekend to prove their point.
Elon Musk met last week with over half a dozen civil rights groups amid concerns he’ll let misinformation and hate speech go unchecked. Media Matters, Free Press and dozens more groups urged Twitter’s top advertisers to boycott the platform if proper safety standards are not imposed. In response, General Motors, Volkswagen, Pfizer and General Mills have all paused advertising.
For more, we’re joined by people with two of the groups. Nora Benavidez is senior counsel and director of Digital Justice and Civil Rights at Free Press and Free Press Action Fund, lead author of the new report, “Empty Promises: Inside Big Tech’s Weak Effort to Fight Hate and Lies in 2022.” She’s joining us from the highly contested state of Georgia, from Atlanta. Also with us, Rashad Robinson, president of Color of Change.
Rashad, let’s begin with you. You met with Elon Musk. I assume this was a virtual meeting. Can you talk about who was there and what you demanded and what he promised?
RASHAD ROBINSON: So, along with other leaders from the Stop Hate for Profit coalition, the coalition that led the $7 billion boycott of Facebook in 2020, focused once again on the issues of disinformation and policy at the Facebook platform, we met with him. And so, it included folks like Derrick Johnson of the NAACP; Nora’s colleague Jessica González, the head of Free Press; leaders from The Asian American Foundation, LULAC, ADL — and so a mix of organizations and leaders. And so, we met with Elon Musk.
We came in with very — with three very focused — three very focused asks that were connected to this upcoming election. One was to not deplatform any of the folks that had been — not to replatform any of the folks that had been deplatformed, and not to replatform them particularly before the election, and, after the election, to have a really clear and transparent policy around how they were going to do it; to keep in place the election integrity unit and the election integrity infrastructure through the election and through certification; and to be more transparent and clear about this content moderation council he’s been talking about, and to be transparent both about the policies of it and its level of power and who would serve on it.
And he agreed to each one of those demands on the call — actually, surprisingly, basically said he agreed with everything that we said. We told Mr. Musk that he had to actually say this publicly, if we were going to be able to say anything about this meeting, in a way that really spoke to the fact that he made these agreements. About 1:30 in the morning the next day, he tweeted out, tagging the folks who were in the meeting, including myself, in a tweet, agreeing to these demands.
And it wasn’t 24 hours later, Amy, that he began to — we began to hear about the firing. We began to hear about other policy changes. There’s no way you can keep in place election integrity if you fire and let go the very people who are managing the election integrity work. The sort of changes in policy which are deeply abrupt, which speak to, I think, the larger challenges we have with companies that are self-regulated, which means that they are unregulated, and all the ways in which Mr. Musk has sort of engaged and behaved speaks to sort of a person who maybe watched a Broadway show or has a favorite team, a sports team, and has decided that if they were in charge, if they owned that show or they owned that team, this is how they would change things, this is who they would put in a particular role or put in a particular position.
And that’s what we’re dealing with right now, is someone who does not have the sort of knowledge or expertise to make these decisions. And while that happens with a lot of companies, we are marching towards an election with a huge communications platform that has a deep role in how information is shared and moved, and it will have deep consequences, any of these changes that we’re hearing, but particularly all the people that have been let go that were responsible for some of the issues that face this platform. The thing I will say is that Twitter wasn’t good before. Twitter wasn’t doing everything it needed to do before. Cutting almost half the staff makes everything even more challenging, moving forward.
AMY GOODMAN: Now, of course, we hear that he is asking some of the people he fired to come back. Nora Benavidez, can you weigh in on this issue of the firings? Staff have now filed a class-action suit, arguing it’s illegal for a large corporation like this to carry out that kind of mass firing without warning. There’s also the fact that in one of his first acts as owner of Twitter, he tweeted out conspiracy theories attacking Paul Pelosi, who had been attacked with a hammer in his home and was in intensive care, citing a website that had promoted claims that Hillary Clinton died in the 9/11 attacks and that a body double ran in her place for president in 2016. Musk posted the article in response to a tweet by Hillary Clinton, then deleted it. The significance of all of this? And what control does civil society have over a private corporation like this?
NORA BENAVIDEZ: You know, I think we have to look at the very long track record that Musk has as an erratic CEO. He has taken extreme actions when he dislikes what people say, whether that is on Tesla phone calls, when he rallies his base on Twitter to respond to critics. He has this long trail of ways that he is unable to actually be present and make thoughtful decisions as a leader. The newest actions, in wanting, and then not wanting, and coming back and trying to buy Twitter, all indicate that he is, at best, erratic. And what we’ve seen over the last week, since he actually took Twitter private, has been very disturbing.
I would absolutely agree with Rashad that, you know, Twitter was not good before. Twitter was a toxic environment even before Musk. That’s part of what we look at in our “Empty Promises” report, trying to really identify how Twitter and other major platforms are performing ahead of the midterms. And what we found was that Twitter is in the bottom half of major platforms in protecting users. These are the most basic protections, and Twitter has already failed to provide them.
Since Musk came on, his first move was to let go of some of the most senior executives: the head of safety, the CEO and others. He has then taken actions himself, as you say, to post and, I think, be a superspreader of conspiracy theory. When the assassination attempt on Nancy Pelosi occurred, he was so fast to goad his followers and others on the platform with citations to misleading information. He’s not an everyday person. He has a massive following. And so, to see someone like that, with such notability, and many people who feel he has credibility, be a superspreader of misinformation and conspiracy is deeply troubling.
Then we saw him lay off almost 50% of his staff last week. And he did so with no fanfare, with quite a lackadaisical march toward the new era of Twitter he wanted to usher in. And in letting these teams go, I just want to be really clear about who the people are that he has let go. He has let go teams working on human rights, ethical AI, accessibility — that means the people helping to make the platform more friendly for users with disabilities. He has let go of communications teams, integrity, safety. I mean, the list just keeps going. And so, these teams have been completely gutted.
At this point they are flailing, and we are one day from the midterms. I don’t understand how someone who is the new CEO can, in one breath, say that he is committed to election integrity, committed, as Rashad said, to all of the things that he promised when we met with Musk, and then turn around and take all of these actions. And so, we’ve really thought long and hard: How do we somehow catalyze accountability? How do we take some form of action to change things?
And so, what we’ve done, across dozens of civil rights organizations, is come together. We’re now at over 60 civil rights organizations that have come together with grassroots and corporate support. We launched Stop Toxic Twitter. That’s a campaign where we are urging advertisers to halt their spending on Twitter. When we think about this, it’s really with the moral imperative in mind, hoping these advertisers begin to see that their brands are damaged when they appear next to troubling hate, toxic content and misleading information.
And what we’ve seen, actually, now is a real groundswell of advertisers that are in fact pulling their ad spending from Twitter, whether it’s General Mills, General Motors, Pfizer, Audi, L’Oréal — there are many others now that have followed suit. And there’s sort of a domino effect here, where not only we have our corporate pressure, but there is that grassroots swell of support to say this is a place that, while toxic before, has become only more toxic since Musk took over, and something has to change.
AMY GOODMAN: On Saturday — oh, let me say one thing. You have these H-1B workers, hundreds of them, who now fear deportation because they’ve lost their jobs. Another thing — on Saturday, former Twitter CEO Jack Dorsey apologized for the layoffs in a series of tweets. He wrote, quote, “I own the responsibility for why everyone is in this situation: I grew the company size too quickly. I apologize for that.” Several days after the 2020 presidential election, Dorsey testified to the Senate Judiciary Committee about his view of Twitter’s responsibility to its users.
JACK DORSEY: We are required to help increase the health of the public conversation, while at the same time ensuring that as many people as possible can participate. And in order to do so, we need to make policies so that people feel safe and they feel free to express themselves, to minimize threats of abuse, of harassment, of misleading information, of organized campaigns to artificially amplify or influence a particular conversation. And that policy creation, that enforcement is challenging, but also it is more or less opaque to the public. And that’s where I think we have a gap. We have transparency around our policies. We do not have transparency around how we operate content moderation, the rationale behind it, the reasoning.
AMY GOODMAN: There’s a bigger question here, Nora Benavidez, and that is: Should these corporations be regulating themselves? I mean, this is the current town square for everyone.
NORA BENAVIDEZ: These companies are so large and, frankly, so unchecked in the power that they bring. One of the reasons that we’ve been trying to gather civil and human rights and other civil society groups together is to demand better of these companies. Left to themselves, we have seen that they simply don’t care. There is a very long track record of inaction, and sometimes even a refusal to acknowledge the role these companies play in fomenting violence in the real world. We went to the very brink on January 6th last year. Our democracy barely held on. And as we’ve already spoken about with the assassination attempt on Nancy Pelosi, we know that the perpetrator of that also was incited and inspired by rhetoric online. There is a very real and porous relationship between the online world and this offline real world. And yet these companies, over and over again, will turn their back, whether that is in testimony before Congress, whether it is in their own very bland statements about what they are doing with the elections this year.
They often act as if they are doing enough, and in doing enough, they are eager to protect democracy, to protect users. And what we have found is that it’s quite the opposite. These companies are failing to do even the most basic things for people. They are failing to make sure that their own backend systems and machine learning are not amplifying the worst content. We know that they are black boxes, so opaque that their transparency efforts are really the most meager steps towards some kind of lip service.
And so, we look at what is now the days ahead, both tomorrow the midterms and then the kind of rhetoric that we know will follow in the days after tomorrow. We know that we are going to see hate, conspiracy, lies continue to proliferate. And yet these companies, every election cycle, kind of string together their election integrity efforts, saying that they are doing enough, and yet we often then find evidence later that not only did they not commit to doing certain things, but that even their promises have been hollow. That’s really why we’ve been trying to pursue a much wider type of initiative, building across sectors, whether that is with our advertiser partners, with other human rights leaders and activists, building what is a large movement here.
AMY GOODMAN: I want to go to that issue of the advertisers. Rashad Robinson, Musk tweeted, “Twitter has had a massive drop in revenue, due to activist groups pressuring advertisers, even though nothing has changed with content moderation and we did everything we could to appease the activists,” he said. “Extremely messed up! They’re trying to destroy free speech in America.” Rashad Robinson, president of Color of Change, can you respond?
RASHAD ROBINSON: Well, Musk has actually met with some of the advertisers, met with coalitions that represent the advertisers, has had a number of conversations over the last couple of weeks with advertisers, and with ad agencies, as well. And Musk has not done himself any favors. These companies are looking at where they want to put their brands. They are looking at the stability of Elon Musk and of the company. They were given some of the same promises we, as activists, were given around content moderation, around the election cycle, about making sure their ads are not placed up against white nationalists or disinformation. And Musk has not made good on the promises he’s made to them. And so, while we’ve been pressuring and pushing, I’ve never had an experience — and I’ve run a lot of these campaigns — where advertisers are this clear that they’re not getting what they need, and we don’t have to do the type of pushing.
I will say, just to pick up on what Nora was saying: self-regulated companies are unregulated companies. And while we are doing this advertiser campaign, while we are pushing from the outside, the technology that has so much potential to move us into the future is dragging us into the past. And that is not unfortunate, like a car accident. That is unjust, manufactured through a whole set of choices that our government has made about how companies are regulated. Make no mistake: Our cars are not safe because of the benevolence of the auto industry. They are safe because of the infrastructure and accountability that surround them, because there are people who evaluate and hold companies accountable. And right now, whether it’s the algorithms that are not transparent, whether it’s the business models and decisions, whether it’s these companies getting to decide what the standard is in terms of moderation, in terms of accountability, whether it’s the fact that —
AMY GOODMAN: We have 10 seconds.
RASHAD ROBINSON: — because of laws that exist, they have a level of immunity to liability. We need a new set of engagement from Congress and the White House, because we can’t keep going to billionaires, begging them to protect our civil rights.
AMY GOODMAN: Rashad Robinson, president of Color of Change, Nora Benavidez with Free Press, we thank you both for being with us.
A happy belated birthday to John Hamilton! I’m Amy Goodman. Thanks so much for joining us.