
Guests
- Carole Cadwalladr, award-winning investigative journalist.
We continue our conversation with investigative journalist Carole Cadwalladr about the rise of unregulated tech behemoths like OpenAI, the company behind ChatGPT that has been embroiled in scandals surrounding its generation of easily disseminated false information, and Palantir, the secretive data-mining company that now heavily contracts with the U.S. government to provide mass surveillance and data collection services.
Transcript
AMY GOODMAN: This is Democracy Now!, democracynow.org, The War and Peace Report. I’m Amy Goodman, with Nermeen Shaikh.
NERMEEN SHAIKH: How to Survive the Broligarchy. That’s the name of the new Substack by our guest, award-winning investigative journalist Carole Cadwalladr, a former reporter at The Guardian and The Observer.
AMY GOODMAN: Carole Cadwalladr gained international recognition for her exposé on the Cambridge Analytica data scandal of 2018.
Thank you for staying with us for Part 2 of this conversation. And here we thought we’d just go back in time. Talk about what you discovered at the time. And ultimately, who was held accountable?
CAROLE CADWALLADR: Well, that’s a very good question, to which there is not a good answer.
Basically, what happened is, it was the election, Donald Trump, 2016. I started this investigation that week into what we now understand as the “fake news” ecosystem; that was the term being used at the time. We began hearing this term “fake news,” and these articles, which had been totally made up, had, we discovered, spread across social media. And in that one week after Trump got elected, I did this first deep dive into understanding how these weren’t just some individual articles, but there was actually an entire network.
And in that first article that I did, I first heard the name of this company called Cambridge Analytica. And it was a data company, and it had publicly said that it had worked for the Brexit campaign and for the Trump campaign. And it was a really bizarre and peculiar company, because it wasn’t just a data analytics company. It actually had a 30-year history as a military contractor. It had worked for the U.S. government. It had worked for the U.K. government. It had worked for non-NATO countries around the world. And it was using influence to try and persuade populations; it’s called information operations, or psychological warfare. So, for example, if there’s a village in Afghanistan and you’re trying to get rid of the Taliban, you have a choice: you either bomb it, or you try and persuade the villagers, you drop leaflets. That was the way it used to work.
And then, what happened over time is that you realized that instead of dropping leaflets from airplanes, you could actually target people through technology, through social media and other means. And that methodology was now being used by this company. It started using this in elections. So, it started working in elections in countries all around the world, and then it was selling its services to Western countries. And I stumbled into a rabbit hole, because I then found out that one of the weapons at its disposal was a vast amount of Facebook data, which it had accessed without people’s consent. And it was then using this data to profile voters in order to target them with individual messages.
So, that was the nub of the company. But it was so dark and so murky. And when I wrote an early story about this, Facebook just completely denied it. They were like, “Yes, they might have had some data. They absolutely don’t have it now. This isn’t true.” And Cambridge Analytica said, “No, rubbish, not true at all.” And then they started to threaten to sue The Guardian. And then I found a whistleblower, this guy, Chris Wylie, who had been an employee of the company, and he actually had the receipts. He literally had the receipts showing how they had paid a scientist to gather this Facebook data.
So, that was what we exposed in 2018. And it did have this amazing impact, in that people were really freaked out, I think, that a company could have this kind of power to manipulate them without them knowing. And Mark Zuckerberg was dragged before Congress. That was the first time he appeared before Congress to answer questions. The FTC in America fined Facebook $5 billion, which was a record. There were multiple other penalties. It was shown to be grossly illegal.
AMY GOODMAN: They were being paid by Cambridge Analytica for this information?
CAROLE CADWALLADR: So, Cambridge Analytica employed a psychologist to gather the Facebook data, which it then deployed for political campaigns.
AMY GOODMAN: Steve Bannon was the vice president?
CAROLE CADWALLADR: Bannon was the vice president of this company, yeah, exactly. But nothing happened. So, there was this massive scandal. It was shown to be completely illegal; there were record penalties. Nothing actually changed: no privacy legislation, no other kind of legislation. No executive from the company was actually held accountable.
There are still suits going on. In two weeks’ time, there is a really fascinating lawsuit coming to the Supreme Court, in which shareholders of Facebook are still suing the company, because they say that the company lied to them, that it knew about this data breach well before and did nothing about it.
So, the consequences go on, but, essentially, what we saw, and the thing which I really took away from it, is that these Silicon Valley companies have total impunity. Even when the law worked in the way that it should have done, there were no consequences, and the leaders of these tech companies have been free to pursue their ambitions unimpeded. And that is the situation we’re now in, on steroids.
If you look at a company like OpenAI, that’s the methodology, right? So, there have been lawsuits about what OpenAI is doing, which is stealing content. It’s stealing people’s work from across the web; it’s stealing entire organizations’ back catalogs. And organizations such as The New York Times are suing them. But that’s a really long process. It takes ages. And in the meantime, they’re just continuing. They call it “move fast and break things.” I hate that phrase. We have to retire it, because it’s the law which they’re breaking. And now it’s the Constitution. That methodology has come into the U.S. government.
AMY GOODMAN: So, no one was held to account, but you —
CAROLE CADWALLADR: Oh yeah.
AMY GOODMAN: — were put on trial.
CAROLE CADWALLADR: Yeah. So, I exposed the whole thing. And one of the things I exposed in this story was how the Brexit campaigns in the U.K. had broken electoral laws. They were found to have broken data laws. And then I uncovered that the funder of a Brexit campaign had an undisclosed relationship with the Russian ambassador in London, had had a series of meetings. And I wrote about this. It was a long-running investigation; I’d been on it for two years. But at some point, he decided to come after me. And he came after me personally.
AMY GOODMAN: He was?
CAROLE CADWALLADR: He’s a man called Arron Banks. He’s the funder, and he’s a close associate of Nigel Farage, the far-right Brexit leader in the U.K., who’s also close to Trump. So, I stood trial in the High Court, because I was sued personally. They didn’t go after The Guardian, which had published the words originally. They didn’t go after TED, which was where I gave this talk. And yeah, so it was a very, very punishing experience.
But I think the key thing here is that this is a playbook. We’ve seen this in other authoritarian countries. Other journalists have been targeted in this way. This is coming for the U.S. right now, OK? So, we’ve seen Trump going after news organizations. We’ve seen that across the board. And we’ve seen news organizations obeying in advance. The next stage is where he and his allies are going to come after individual journalists. And I really hope that news organizations in America do a better job of defending their journalists and their journalism than The Guardian did in my case, because it is really important that we have solidarity, that we don’t abandon people. In my case, it was because I was officially on a freelance contract. It is really important that the leaders of news organizations defend their journalism, defend their journalists, protect their journalists, and do so for the public as well. Local journalists in communities are perhaps on the front line of this, and they’re the most vulnerable, because they’re living where they’re reporting, and they really need people’s protection and attention and support.
NERMEEN SHAIKH: Well, I want to ask also about the kinds of attacks that were directed at you, as opposed to, as you said, The Guardian or The Observer. You’ve called it, in fact, a digital witch hunt, saying that your biggest crime was reporting as a woman. So, if you could talk about that? And then, second, just to go back to the Cambridge Analytica story specifically: you said that Facebook was fined $5 billion. But also, Cambridge Analytica went bankrupt so soon after this exposé. So, what do you take from that about the possible consequences of this kind of, well, gross illegality, and what might happen if such a thing were to occur with Silicon Valley companies now?
CAROLE CADWALLADR: The first point of that is that I wasn’t just targeted with a lawsuit. I also became the target of a massive, coordinated online campaign of abuse and harassment, of vicious, misogynistic language and violence, day after day after day, which has an impact. It has an impact on the way that I’m perceived. It has an impact upon me personally. And because the lawsuit silenced me, I couldn’t actually respond to any of this. I refer to it as being caught in a washing machine, because every step of the litigation unleashed another torrent of articles and then abuse. And I’ve made really good friends in other countries who this has also happened to, and it’s generally female journalists; they’re the easiest to go after. So, Maria Ressa, the amazing journalist in the Philippines, and Rana Ayyub in India, they’ve both experienced much worse versions of this.
But the one thing I would say about it is that it gave me this really visceral understanding of the power of these platforms and their unaccountability. When you get targeted personally, you really start to understand how these tech platforms work. When a very violent, inciting video of me was put up, there was no way for me to get it taken down off the platform, even though if somebody had done that in the street, they would be arrested. So, I really started to understand this, from the personal level to the bigger level. And at the bigger level, as you say, Cambridge Analytica couldn’t survive the scandal. It shut down. But Facebook could, because they have billions and billions of dollars. They have armies of lobbyists. They have, I think, thousands of lawyers acting on their cases. And they get away with it.
And that is why we are where we are now. There are some really important cases which are now coming to trial in the U.S., antitrust cases, and we just have to hope that the courts do hold the line, do apply the law. We will have to wait and see. But I think, for all of us, as citizens, as consumers of these products, we have allowed ourselves to be gulled into a sort of place of powerlessness, because there were great things here which we enjoyed using and which were really helpful. But we are now entering a different era. We now have to understand that these companies are allied with an authoritarian government, regime, whatever you want to call it. That is what is happening in the United States now. And so, everything that you are posting on Instagram, everything that you are putting out across social media, is now a data point. It’s personal information that can become part of this silo, which is going to be feeding these massive databases.
NERMEEN SHAIKH: I mean, if you could talk about the implications of that, the fact that all of this information is handed over voluntarily? Facebook now has 3 billion users. They’re individual people who have agency. They might be influenced, obviously, but nevertheless, it’s their choice. They put up a zillion pictures. Instagram is all images. So, what are the implications of that? This is not, literally, coercive by any means.
CAROLE CADWALLADR: Exactly, exactly. It’s the economic model of how these companies work. But just to be really clear about why this is such a particularly dangerous moment: you have all of the data that the government holds on you. Think about that: your Social Security, your taxes, everything the Treasury has, all of these departments. Then you have the huge amounts of data which these data brokers hold on you. This is commercially available data in the U.S. There are none of the safeguards on it that there are in other countries. And —
AMY GOODMAN: Like things you buy.
CAROLE CADWALLADR: Everything you buy. I mean, astonishing detail of every aspect of your life. That is being gathered, and it’s sold to advertisers. That’s what the business model is. But that same information could be repurposed for other uses. And then you have all of the private information about you which is out across social media. And what we know is that Palantir, which is the all-seeing, all-encompassing eye, is taking these different silos of data and merging them. And that is the thing. And then it’s applying generative AI to that, and then it’s making assumptions about you from it. It’s making inferences. And it might not be right. You might have the same name as somebody who is a suspected terrorist being monitored by the CIA, and that gets mixed up in the system. That’s just one example of it.
AMY GOODMAN: When you talk about generative AI, think about talking to an audience that is not familiar with artificial intelligence at all, that’s trying to keep up with everything you’re saying. They know they shop online. They buy things. They get their medications online, or they go into a drugstore and buy them. They post pictures of their parents, their children, their grandchildren, whatever. How is this all put together?
CAROLE CADWALLADR: You know, I wish I knew the details of that, but I don’t, because Palantir is a really, really shadowy military contractor.
AMY GOODMAN: And because we talked about Palantir in Part 1, just again, reprise.
CAROLE CADWALLADR: Yes.
AMY GOODMAN: Palantir owned by Peter Thiel. Explain.
CAROLE CADWALLADR: Yeah, so, Palantir is a company that started out as a defense contractor. It claimed that it was the company that helped locate Osama bin Laden. It claims that; there’s no evidence to suggest that’s actually the case. But it created this mythology about itself, that it could use data to find and identify America’s enemies. And from there, it then sold itself to other industries.
AMY GOODMAN: Military contractor.
CAROLE CADWALLADR: But most distressingly, from my point of view — I live in Britain. I live in London. We have a supposedly left-wing Labour government. And we are in a situation in which Palantir is now contracted into our National Health Service. Its software, Foundry — Palantir also says that it’s just really good at organizing data. It’s just —
AMY GOODMAN: It’s a military contractor.
CAROLE CADWALLADR: Yeah, it’s a military contractor. It works for the U.S. government and other governments. As we said, it’s actually working for Israel at the moment, in the conflict there. But it’s also at the heart of our National Health Service, handling patient data. I mean —
NERMEEN SHAIKH: And then, when the prime minister, Keir Starmer, came to D.C., as you’ve pointed out, the only person he met, apart from Trump, was the head of the AI company Palantir.
CAROLE CADWALLADR: And most distressingly, again, here, for me, is that our ambassador to Washington, Lord Peter Mandelson, who has worked for 20 years as a lobbyist for an extraordinary range of companies, including many Russian ones, he was a lobbyist for Palantir. And he was the person who was posted to Washington, and he is the person who brokered this meeting between our prime minister and this company. And there’s just a —
AMY GOODMAN: So, Starmer met with who?
CAROLE CADWALLADR: He met with the leadership of Palantir in Washington. And there is just this extraordinary naivety. It goes to our naivety about who these companies are and what their intentions are. And just as we got gulled into posting all of our photos across social media, because it gave us something, in the same way, for the British government, it gives them something. It’s organizational software, which Palantir says is going to transform our massive and unruly National Health Service. The government is not thinking about what the consequences of that are.
So, just to give you one example, which I’ve talked about: I say, don’t post photographs of your children on social media. Please stop it. They haven’t consented to this. Children are people, too, and they haven’t given their consent. Anybody who’s under the age of 18, all right, whose parents or relatives or friends are posting photographs of them, those children’s faces are being data-harvested right now. And they are going to be living for a much longer time than us. They’re going to be living with the consequences for much longer. I really would say to American parents, and parents in other countries: Do you want your child’s face to be taken into some database, where it’s just another data point to be used in some way? I really don’t think you do. And the thing is, when you put data out into the world, you can’t then take it back. It’s just out there. It’s been harvested. It’s gone. So I think we all have to be much, much more thoughtful about what about ourselves we are putting out.
AMY GOODMAN: I mean, this may be just a simplistic question, but so what if that face is out there? Explain what you mean by data harvesting and following that face through time.
CAROLE CADWALLADR: So, I think we need to look at other countries. If you look at China, which we know is an authoritarian society, we know that the Chinese government harvested data from all of its citizens. In fact, it went after the most vulnerable population first: the Uyghurs. And facial recognition technology is at the heart of their system of how they surveil and control their population. All right?
And look, there is a company in America called Clearview AI, which Peter Thiel, the owner of Palantir, also invested in, and they are the experts in facial recognition technology. They are already working for ICE. They are already using that technology to go after immigrants, in order to target them and deport them. The thing you have to remember is that if they’re using this technology to go after immigrants, they will then use that technology to go after citizens, too. You go after the most vulnerable population, with the least rights, first. So, with everything that you see happening right now to immigrants in this country — removing them unconstitutionally to terrifying concentration camps in other countries; that’s what that prison in El Salvador is — think about what the next stage in this process is. Think about whether you want the government to have your face in that facial recognition technology.
NERMEEN SHAIKH: But, I mean, is there a distinction? Because you brought up the example of China, and that’s a very good example. All of that information is not in private hands; it’s entirely in state hands. So, does that make a difference? In other words, in China, people don’t have the option: you’re registered, and then the state follows you, whatever you do and wherever you go. That is not the case here in the U.S., at least not yet. And then, the second point is that the examples that you’ve given are, of course, already authoritarian states, and that’s not the case here. It has not been an authoritarian state. So, do you think that there are more restraints, potential restraints, on this kind of overreach of power, in a way that does not exist, for instance, in Russia or China?
CAROLE CADWALLADR: There are no restraints on this use of this data, because, apart from some legislation in California, you have no legislation that protects your personal data. You don’t have it. You had a decade to sort this out. You didn’t.
And as for the idea that there is a distinction because in China it’s the state: what we saw at the inauguration, that image of the tech bros behind Trump on the dais, that was a sign. That was a signal, OK? “These are my people. I control these people now. They are now part of my administration, and they are going to do what I need them to do, if necessary.” And they did. Mark Zuckerberg did that great video where he went out and was like, “Hey, we’re no longer going to be doing any content moderation. We’re going to let it fly.” So, they are already doing what Donald Trump demands. And that is why you have to understand that these companies are not your friend. They are now part of this infrastructure.
AMY GOODMAN: Can you explain what artificial intelligence is and how that fits into this picture, the data mining, what happens to it?
CAROLE CADWALLADR: So, what we’re seeing at the moment with ChatGPT: it’s a prediction engine. You put vast quantities of data together, and it can make predictions based upon the patterns it has found in that data. And we’re treating it like it’s the oracle, like it’s the word of God. It is not. Think of it this way: social media was already AI, algorithms. That is what machine learning and artificial intelligence already gave us, which is a world of division and a teen mental health epidemic. There are so many downstream consequences we’ve already seen from social media. And now that is happening at speed, at scale, with generative AI. And the bill, the “big, beautiful bill,” is now saying states are going to have no power to legislate or regulate what is a totally dangerous, untested technology, one that is going to be used both at the country and state level and at the really personal level. You look at the impact that these social media algorithms have already had, especially on children, especially on teenagers. Amplify that —
AMY GOODMAN: Example of that, of the effects on —
CAROLE CADWALLADR: Well, one famous example of that — and this goes back to how nobody ever gets held to account for anything — is that The Wall Street Journal published a story, last year I think, about how Instagram was connecting pedophiles to young children. The algorithm was actually matching them together. And you would think at that point that that might have given pause to either Congress or the social media leadership. But this is the impunity that I’m talking about. And that is why we’ve accelerated to this situation that we’re now in. And AI is that on steroids. Everything that’s happening now is that on steroids.
NERMEEN SHAIKH: I just want to say — and this is very late in the day — last night I used ChatGPT for the first time, in preparation for this interview. And I asked it —
CAROLE CADWALLADR: This is a — this is a landmark. This is —
NERMEEN SHAIKH: Yeah, and I asked it to write a paragraph in the style of some of my favorite authors. But all of those authors write in another language, in German and Hungarian. So, I’m asking it in English, and asking it to reproduce or write something like these authors in English. And, I mean, it’s extraordinary. It captures not only extremely complex sentence structures, but also the sense — not just the language, but also the kind of aura of the style. So, you also —
CAROLE CADWALLADR: I did bring —
NERMEEN SHAIKH: — asked ChatGPT to write your TED Talk, right?
CAROLE CADWALLADR: TED Talk, yeah.
NERMEEN SHAIKH: And then you wrote your TED Talk. So, if you could explain what you see as the distinction between what you ultimately wrote and what ChatGPT produced? But also, then, the broader question that you raise, which is: How can people be compensated or recognized for their information, their books, their art, etc., that is available online and therefore being used by artificial intelligence? Or do you think that’s not possible or likely?
CAROLE CADWALLADR: So, this was such a fascinating exercise, because I wrote this TED Talk, and then I had this idea at the end: “Oh, I know, I’ll see what ChatGPT would produce if I asked it to write a TED Talk in the style of Carole Cadwalladr.” Anyway, it pumped it out in seconds. And I was like, “Wow! That’s, actually, kind of the general arc of what I talked about.”
AMY GOODMAN: And did it attack itself?
CAROLE CADWALLADR: No. Well, exactly. No, it didn’t, because, as I say in the talk, it’s dumb, it’s stupid. It doesn’t know that I’m actually going to turn to Sam Altman, who’s another TED speaker, and say, “This is not your work. This is not your data. This is my IP. I didn’t consent. How dare you?”
And I made the point that, at the most basic level, this is just theft. OK? It’s going into somebody’s house, stealing their furniture, selling it, and taking the profit from it. That’s what generative AI is, because everything that these models are based on is somebody else’s work, somebody else’s words, somebody else’s creative output. And that’s not just words. It’s music. It’s art. Everything is being fed into this engine. It is just theft.
But it’s actually more than theft, because this is where we come back to this idea around data. Years ago, a veteran female tech journalist said to me, “You know the problem with the word ‘data’? It’s so boring. It’s so dry. It doesn’t convert. I had an editor who said to me, ‘You have to think of it as naked selfies. It’s you. It gives this really complete picture of you.’” And that’s what I mean when I say they took this without consent. It’s an act of violence. And this is where I’ve used this word, which is controversial: I call them data rapists, because it’s an act of violence, and it’s an act against the person.
And this is where I come to this idea, which is being formulated by much cleverer people than me: our data rights are really fundamental human rights. We have to think of them in that way. A violation of our data in this way is a violation of the person. And that’s why it’s so important to understand the really far-reaching impacts of this and to defend against them now. And we have to change our conception of ourselves in this online space.
NERMEEN SHAIKH: But what would you say to people who, even as they express concerns about the rapid development of artificial intelligence and the absence of sufficient oversight of it, still maintain that in certain sectors, in particular the healthcare industry, artificial intelligence has the capacity to completely revolutionize medicine, and that therefore one has to not only maintain but develop more and better ways of using AI in these fields?
CAROLE CADWALLADR: I think there are. I mean, I think that’s a really valid argument. And I think that’s why regulated, ethical, human-centered AI, built by responsible companies whom we can trust, does have enormous potential. That is not where we are. We are at the opposite of that. We have shadowy companies owned by people we cannot trust, because we can look at their past behavior. We can see there is no ethical thought about what the possible consequences are. We, as humans, are not at the center of it. We’re not being asked to consent. And that’s the place that we’re in. And what the bros are doing, what Sam Altman and these other people are doing, is going into Congress, and it’s worked. They’re saying, “We can’t have any regulation, because that’s going to hold up progress. And if you hold up progress, China is going to win.” This is exactly the argument they used with social media and Facebook. This is the same one, and it’s bullshit.
AMY GOODMAN: Can you talk about the relationship between Altman, Thiel, Elon Musk, President Trump and his family? Because you’re talking about how all of these people are deeply involved with this data mining, with artificial intelligence, and are personally profiting off of it.
CAROLE CADWALLADR: Yeah. So, it’s really complicated. They’re all massively ambitious, egotistical, narcissistic men in their own right, with their own ambitions. On some of their interests, they align; on others, they don’t. So, for example, DOGE was the organization which brought Palantir into the U.S. government; that’s what The New York Times reported. DOGE being headed by Elon Musk, and Palantir by Peter Thiel, those two have a really long and complicated relationship. As I said, they were business partners back in the day, at PayPal. And they’re rivals in some respects, but in others, their interests come together. They’ve also come together because they’re bidding for the Golden Dome contract, which President Trump announced — this massive, multibillion-dollar defense infrastructure. Meanwhile, Sam Altman and Elon Musk are huge competitive rivals, though they did work together. So, you can see there are all these different fractures and rivalries. And at the heart of it is this super macho, very aggressive, very ambitious culture: wanting to be first, wanting to be biggest, wanting to be best.
AMY GOODMAN: So, how do you survive the broligarchy? Because you also talk about the resistance. I mean, they will use examples like: you put in millions of MRIs, and you can develop very fine predictions around cancer. They’re going to use that example whenever anyone raises concerns and says states should be able to regulate on their own. They say, “We can’t deal with 50 different entities regulating our advances in cancer research.” But, in fact, that is not actually the motivation.
CAROLE CADWALLADR: I’ve got a friend, and he’s really cross with me. He says, “Carole, it’s not how to survive the broligarchy. It’s how to beat the broligarchy.” And I’m like, “I know. This is really true. We have to change the messaging about this.”
You know, one of the key things, one of the really helpful things for me about being targeted and persecuted and harassed and put on trial, at existential risk of losing my flat and my career and my financial livelihood, is that I experienced that powerlessness. And what I discovered is that I wasn’t powerless. Actually, I was really powerful. That was why I was targeted. And that’s the thing: realizing that we are more powerful than these guys. There are more of us. We have values. We have ethics. We know what we stand for. And we believe in fundamental human rights. They don’t.
And so, that is why I think there’s partly a clarity to this, which is we have to say, “We don’t stand for this. This is not who we are. This is not what we want. These companies should not be doing this at the heart of our government.” And we should respond with our own behavior as well. But the first step is to recognize what is happening, to call it out, to tell other people, to be aware, and to take what actions we can in real time, because this is consolidating now.
And in terms of the Silicon Valley companies: this AI is not inevitable in this form, but it is going to be coming for your job. They’re selling this software into companies for the sake of efficiency, all right? So, I’m just going to give you one really concrete example of this, OK? At my news organization, The Guardian, they were selling off my section of it, The Observer. And we, the journalists, went on strike — the first time in 50 years — because 97% of the journalists were like, “We are not OK with this. This is a way of getting rid of journalists, in disguise.” We went on strike. And what we discovered was that during our strike, the management used a tool based on ChatGPT, which had been trained on our work, to write headlines for articles to keep the newspaper going. So, they were using robots to break a legally protected industrial dispute. And that’s a tiny, tiny example. But this thing of being replaced by automated software, by robots, which have taken our work without our consent in this illegal fashion in order to now replace us — we should not be OK with any of this.
NERMEEN SHAIKH: So, you mentioned that we should take what actions we can in the moment. I just want to go to the end of the last piece that you wrote for The Observer, which was headlined “It’s not too late to stop Trump and the tech broligarchy from controlling our lives, but we must act now.” Towards the conclusion of the piece, you write, “These vast data-harvesting tech monopolies that control our online world were never inevitable. And there is another way. We can go back to the future, to the democratic, inspiring, non-corporatised web that Wales proved was possible.” So, if you could elaborate on what you mean by that, and whether, in fact, and how it’s possible now?
CAROLE CADWALLADR: So, one of the people who are most horrified about what the internet has become, about what the World Wide Web has become, is Tim Berners-Lee, the man who invented it. And the thing is, this was a great, glorious, democratizing invention that enabled people to create things without the normal gatekeepers, to do all sorts of things, and for different people to come together.
So, Wikipedia is the great example of this. It was such an inspiring, incredible project by Jimmy Wales. It shouldn’t be possible that just random people can write an encyclopedia — like, how would that even work? And yet, it is actually very robust and rigorous and has very high-quality information in it — now, obviously, under attack from Trump. And Wikipedia never became a Silicon Valley corporation in the same way. It’s a nonprofit. It’s a foundation. It’s for the human good. It’s for the public good.
And this is capitalism. I mean, it’s this particularly predatory capitalistic model that has given us these companies. And it’s capitalism which hasn’t been, and is not being, subjected to the laws which grew up in America to ensure that capitalists were kept within checks and balances, the rule of law. Is that too complicated? Does that make sense? Yeah, no, no.
I mean, so there is another way, and there are beautiful alternatives. So, one of the alternatives: WhatsApp is owned by Mark Zuckerberg. It’s part of Meta; it’s adding to his billions. But there is Signal, which has exactly the same functionality as WhatsApp but is owned by a foundation. And it has privacy protections built into the entire system. So —
NERMEEN SHAIKH: But that’s what WhatsApp was supposed to be initially, too, because it wasn’t — it wasn’t part of Mark Zuckerberg’s empire.
CAROLE CADWALLADR: I know. I know. I know. But that’s something you can do: everybody who I communicate with, I encourage to be on Signal. And I enforce it in lots of the work things that I do. I mean, this is interesting, because I should make more of an effort to get the rest of my family, if they’re still on WhatsApp, off it. But that’s a small thing in your daily life that you can do.
But we know there are great innovations. Bluesky is another example of that. There’s a mission out there now by some people I know who are trying to buy Bluesky and put it in a foundation, so that the same thing doesn’t happen that we saw happen with Twitter. Twitter was this great tool for a long time, and it was amazing, and activists used it, and journalists used it, but then it was bought by Elon Musk, and it was turned into what is now a global propaganda tool. This is a process which all these different platforms go through.
There is another way. And that’s where we, as citizens, as a community, can come together and create a better information environment. And the first step in that is understanding why we need it. So, the job that you guys are doing is so vital in this, because you’re bringing this understanding to your audience, who are very sophisticated in understanding other forms of power — state power, military power — and who now have to understand and pay as much attention to this one.
AMY GOODMAN: We want to thank you so much for being with us, Carole Cadwalladr, award-winning investigative journalist. Her Substack is How to Survive the Broligarchy. She gained international recognition for her exposé of the Facebook-Cambridge Analytica data scandal years ago. At the time, she worked for The Guardian, and she was later brought to trial in a defamation suit brought by billionaire businessman and political donor Arron Banks, best known for his role as co-founder of the 2016 Brexit campaign Leave.EU. Part of the lawsuit’s claims stemmed from Carole’s 2019 TED Talk, “Facebook’s role in Brexit — and the threat to democracy,” and a Twitter post linking to the TED Talk. Carole Cadwalladr now runs the Substack, again, How to Survive the Broligarchy, and is starting a new news organization with a group of women. Is that right?
CAROLE CADWALLADR: That’s right. And there’s also a nonprofit which I run called The Citizens, which is very much about building a community of people to stand up to these tech platforms. They’re also on Substack, at Citizens Reunited.
AMY GOODMAN: Well, thank you so much for spending this time. And to see Part 1 of our conversation, go to democracynow.org. I’m Amy Goodman, with Nermeen Shaikh. Thanks so much for joining us.