
“The Great Hack”: Big Data Firms Helped Sway the 2016 Election. Could It Happen Again in 2020?


Image Credit: Netflix

The documentary “The Great Hack,” which was shortlisted for the Oscars, explores how the data firm Cambridge Analytica came to symbolize the dark side of social media in the wake of the 2016 U.S. presidential election. Cambridge Analytica collapsed in May 2018 after The Observer newspaper revealed the company had harvested some 87 million Facebook profiles without the users’ knowledge or consent. Cambridge Analytica then used the data to sway voters to support President Trump during the 2016 campaign. We speak with “The Great Hack” co-directors Karim Amer and Jehane Noujaim, as well as Cambridge Analytica whistleblower Brittany Kaiser.

Transcript
This is a rush transcript. Copy may not be in its final form.

AMY GOODMAN: This is Democracy Now!, democracynow.org, The War and Peace Report. I’m Amy Goodman. We’re spending the hour looking at Cambridge Analytica, the British company that was co-founded by the well-known President Trump special adviser Steve Bannon. He was vice president of Cambridge Analytica, which came out of a military contracting group and now is accused of having been involved, essentially, with PSYOPs, using tens of millions of people’s information that it harvested from Facebook.

Our guests are the award-winning filmmakers Karim Amer and Jehane Noujaim, who are directors of The Great Hack, which has just been nominated for a BAFTA, the Oscar equivalent in Britain, and has been shortlisted for an Oscar, and Brittany Kaiser, a whistleblower who worked at Cambridge Analytica for more than three years. In Washington, we’re joined by Emma Briant, who has been looking at Cambridge Analytica for years.

Let me ask you, Jehane, why you decided to spend years of your life looking at Cambridge Analytica and making this film, The Great Hack, and why you called it that.

JEHANE NOUJAIM: Well, this has, for me, been a film, I feel, 20 years in the making. Twenty years ago, I made a film with Chris Hegedus and D.A. Pennebaker called Startup.com, which was about the beginning of the dotcom world, where people wanted to be God online and start internet companies and make millions and millions of dollars. And fast-forward 20 years, and they are God online.

But I’ve always been obsessed with how we get our information. And soon after Startup.com, I made Control Room, which looked at the Iraq War: depending on whether you were watching Al Jazeera or Fox News or CNN, you had a completely different understanding of reality on the ground. And I had this question of how — if you don’t have some kind of shared understanding of truth, how can there be nuanced conversation and discussion, which is what’s necessary for a democracy to function?

Flash-forward to us making The Square, and at that time social media was a tool for change, positive change. I mean —

AMY GOODMAN: You’re talking Tahrir Square in Egypt.

JEHANE NOUJAIM: Tahrir Square. This is where we met, and we made the film, The Square. And at that time, even when I was arrested, Twitter was used to find me. So, it was a very positive tool for change.

Then we see the pendulum kind of swing in the other direction, and we see that social media can be used in a very different way. And it was used by the Army, and then we started to see it being used in the Brexit campaign and the Trump campaign. And we started to hear this word “hack” and the hacking of these elections. But what we realized was that the real hack that we needed to look at was the hack of the mind and what is happening inside our newsfeeds and what happens when people are creating their own truths because they’re being microtargeted.

And we started to look at this company Cambridge Analytica. And what we found there was fascinating, because we realized that there was this invisible story that’s happening inside our computer screens, inside our heads, which is leading to everybody having a completely different understanding of reality based on their newsfeeds.

And that’s when we met Brittany Kaiser, who at the time was just about to become a whistleblower and come out about what she knew. She was basically saying, “I’m going to Thailand, and if you want to talk to me, then come meet me here. I can’t tell you exactly where it’s going to be, where I’m going, but land at this airport.” We met her there, and that’s where the film began.

AMY GOODMAN: Well, I want to go to a clip of The Great Hack, your film. In this, Brittany Kaiser explains the concept of the persuadables.

BRITTANY KAISER: Remember those Facebook quizzes that we used to form personality models for all voters in the U.S.? The truth is, we didn’t target every American voter equally. The bulk of our resources went into targeting those whose minds we thought we could change. We called them the persuadables.

They’re everywhere in the country, but the persuadables that mattered were the ones in swing states like Michigan, Wisconsin, Pennsylvania and Florida. Now, each of these states was broken down by precinct. So you can say there are 22,000 persuadable voters in this precinct, and if we targeted enough persuadable people in the right precincts, then those states would turn red instead of blue.

AMY GOODMAN: So, that’s a clip from The Great Hack. Brittany Kaiser, explain further this idea of persuadables.

BRITTANY KAISER: So, you might have heard them referred to as “swing voters.” In brand advertising, they’re called “switchers,” because it’s easy to persuade someone to try something new or to change their mind. So, identifying persuadables is what everybody does in data science for political modeling. Every political consultant in the books is trying to do this, identify the people whose minds can be changed, because quite a lot of people have not made up their mind yet. And when you’re trying to introduce a character as controversial as Donald Trump, the idea was, find the people who could be convinced, even though they had probably never voted for anyone like him before.

AMY GOODMAN: So, talk about your trajectory. I mean, Karim and Jehane, you do this very well in the film, but it is a very unlikely path to a firm that may well have acted illegally in working with Facebook, harvesting all this information that ultimately helped to get Trump elected. But that’s not really where you came from. In the film, I’m looking at pictures of you and Michelle Obama. You were a key figure in President Obama’s social media team in his election campaign.

BRITTANY KAISER: I have always been a political and human rights activist. That’s where I came from, so it was really easy to snap back into that kind of work. I actually was in the third year of my Ph.D., writing about prevention of genocide, war crimes and crimes against humanity, when I first met the former CEO of Cambridge Analytica, Alexander Nix. My Ph.D. ended up being about how you could get real-time information, so how you could use big data systems, in order to build early-warning systems to give people who make decisions, like the decision that was just made about Iran — give them real-time information so that they can prevent war before it happens. Unfortunately, no one at my law school could teach me anything about predictive algorithms, so I joined this company part-time in order to start to learn how these early-warning systems could possibly be built.

AMY GOODMAN: Well, explain. Explain your meeting with Alexander Nix, who is the head — came from the defense contractor — right? — SCL, and then was the head of Cambridge Analytica, who said, “Let me get you drunk and steal your secrets.”

BRITTANY KAISER: Yes, he did. Not that becoming, but he has always been an incredibly good salesman. In one of my first meetings with him, he showed me a contract that the company had with NATO in order to identify young people in the United Kingdom who were vulnerable to being recruited into ISIS, and running counterpropaganda communications to keep them at home safe with their families instead of sneaking themselves into Syria. So, obviously, that type of work was incredibly attractive to me. And I thought, “Hey, data can really be used for good and for human rights impact. This is something I really want to learn how to do.”

AMY GOODMAN: But soon you were on your way to the United States with Alexander Nix, meeting with Corey Lewandowski, who at the time was the campaign manager for Donald Trump. When did those red flags go up for you?

BRITTANY KAISER: There were red flags here and there, especially when I would call our lawyers, who were actually Giuliani’s firm at the time, in order to ask for advice on what I could and could not do with certain data projects. And I always got told, “Hey, you’re creating too many invoices.”

But what really landed the plane for me was, a month after Donald Trump’s election, everybody at Cambridge Analytica who had worked both on the Trump campaign and on the Trump super PAC, which ran the “Defeat Crooked Hillary” campaign — they gave us a two-day-long debrief, which I write about in detail in my book Targeted, about what they did. They showed us how much data they collected, how they modeled it, how they identified people as individuals that could be convinced not to vote, and the types of disinformation that they sent these people in order to change their minds. It was the most horrific two days of my life.

AMY GOODMAN: So what did you do after that?

BRITTANY KAISER: I spent a while trying to figure out if there was still anything I could salvage from what I learned there. Was it still possible to use these tools for good? And when I realized that the company had gone way too far in the wrong direction, I started working with journalists in order to go through and figure out what I had in my documents that could possibly assist in saving democracy in the future.

AMY GOODMAN: You testified before the British Parliament. You were subpoenaed by Robert Mueller. You’ve been involved in a lot of information giving during these investigations. In an odd way, would you describe yourself as a persuadable?

BRITTANY KAISER: Definitely. And that’s actually a story that is very prevalent in my book. Most people don’t like to think that they are persuadable. We all like to think that we can’t be manipulated. But, trust me, we’re not as digitally literate as we like to think that we are. That’s why I released the Hindsight Files, because I want everyone to realize how easy it is for us to be manipulated and that we need to be aware in order to protect ourselves.

AMY GOODMAN: Karim Amer, for people who are still sitting here and going, “Cambridge Analytica, Facebook, what does this have to do with each other, Zuckerberg testifying before Congress?” explain then what was the magic sauce. What happened here between these two companies? What did Cambridge Analytica do? And I’ll also ask Brittany this question. And what did Facebook understand was done? And what did they do about it?

KARIM AMER: Well, I think, you know, the situation that we find ourselves in is one in which all of our behavior, which is essentially what data is, recordable human behavior, is constantly being tracked and gathered. And that’s part of the deal or the devil’s bargain we’ve made in this new economy, where we give our data up, and in return we get services. Now, most of us go ahead and we sign these terms and conditions that we don’t really read or don’t really understand what they’re about. What we’ve realized is that we’re giving up a certain level of autonomy that we may not have understood the implications of.

Now, what is that autonomy? What that is, it is insight into everything you do all the time. It’s something that’s tracking you, from the most public of spaces, when you’re posting, to the most intimate of spaces, when you’re watching porn or when you’re messaging somebody or when you’re staring at photos of a loved one or someone else. All of that behavior is constantly being tracked.

And it is used to create essentially a voodoo doll of you that can predict your behavior with quite a lot of accuracy. The proof of that is that that is the business model of Facebook and Google. It is about predicting your behavior and selling access to that prediction. Think of it as being in a casino that is constantly running, trying to make bets on what you’re going to do next. Now, that casino access is being sold in real time to all kinds of brands around the world.

What Cambridge identified was that they could take voter data, and they could take personality data, and they could map them together and create the most accurate profiles. That’s why they bragged about having 5,000 data points about every U.S. voter, which was one of their unique offerings. And with that insight, they realized that if you knew which districts you had to target and how to target the key people in those districts with the perfect messaging, you had the greatest chance of success.

And where that leaves us is, we used to live in a world where a political leader had to write one big, one great story to inspire a great people to go on to a great cause. Now we live in a world where a politician can customize a story to every single individual voter and do it in a way which is operating in darkness, without transparency.

What do we mean by darkness? We mean that to this day we still do not know what ads were placed on Facebook in 2016, who was targeted, who paid for those ads, how it was conducted, whether these ads were paid for by a foreign country or not, what happened, what didn’t happen. And I think we deserve to know. Why? Because we’ve seen that this has become a place of weaponized information, that can be used not only to promote amazing ideas, but to convince people not to vote, which is active voter suppression.

And what’s troubling is that this is an information crime. Whether it’s legal or illegal doesn’t matter, in my opinion, because many things in our country’s history were legal once before, including slavery, and yet we’ve realized that it was not OK for them to be legal. So, at the current moment, Facebook is a crime scene. Facebook has the answers. Facebook knows what happened to our democracy. And yet it is still unwilling to participate in giving us the evidence we need.

AMY GOODMAN: I want to turn to another one of your clips, Jehane and Karim, one of the clips of The Great Hack. This is one of the main subjects of the documentary, professor David Carroll.

DAVID CARROLL: I was teaching digital media and developing apps, so I knew that the data from our online activity wasn’t just evaporating. And as I dug deeper, I realized these digital traces of ourselves are being mined into a trillion-dollar-a-year industry. We are now the commodity. But we were so in love with the gift of this free connectivity that no one bothered to read the terms and conditions.

AMY GOODMAN: David Carroll is featured in this documentary, The Great Hack. In 2018, we spoke to David Carroll, associate professor of media design at Parsons School of Design, here at Democracy Now! He had filed a full disclosure claim against Cambridge Analytica in the U.K. I asked him what he was demanding. A clip of this also appears in The Great Hack.

DAVID CARROLL: A full disclosure. So, where did they get our data? How did they process it? Who did they share it with? And do we have a right to opt out? So, the basic rights that I think a lot of people would like to have, and the basic questions that a lot of people are asking.

AMY GOODMAN: And we’re going to find out just what happened with these demands, these questions he had, when he took them to Britain to take on Cambridge Analytica, and how he, as well as Brittany Kaiser and others, took Cambridge Analytica down. This is Democracy Now! Stay with us.

