In Part 2 of our conversation with Shoshana Zuboff, author of “The Age of Surveillance Capitalism,” we discuss the Facebook Oversight Board, the role of private data brokers selling private information collected on internet users to the U.S. government and others, and why she has argued, “We can have democracy, or we can have a surveillance society, but we cannot have both.”
AMY GOODMAN: This is Democracy Now!, democracynow.org, The Quarantine Report. I’m Amy Goodman, with Nermeen Shaikh, as we bring you Part 2 of our conversation with Shoshana Zuboff, author of The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power.
You recently wrote a piece for The New York Times titled “The Coup We Are Not Talking About.” You write, “We can have democracy, or we can have a surveillance society, but we cannot have both.”
In Part 1 of our discussion, Professor Zuboff, we talked about Facebook’s Oversight Board, appointed by Facebook, to decide whether former President Trump can come back on Facebook. They said, “We’re giving Facebook another six months,” but they upheld the suspension for now. Now I want to go to this very point that you and others have raised. Again, you write, “We can have democracy, or we can have a surveillance society, but we cannot have both.” Explain.
SHOSHANA ZUBOFF: Well, as I began to discuss before, these internet companies, that, you know, began as little startups, with really smart folks running them, learned early on that the way they could finally monetize their very interesting new services was to secretly extract behavioral data from human behavior anywhere they could find it online. And later, you know, they gave us little pocket computers called smartphones, so we could go out into the world, and that way, they could get a lot more data.
And what they learned was that by extracting predictive information from all this data that they could mine from our activity, they could analyze it, they could predict our future behavior. That became the product that they could sell to advertisers to target who is most likely to click on their ads and click through to the product or the service and whatever. That was the foundation of targeted online advertising.
Now, this whole process of, you know, unilaterally invading our experience, turning it into data, extraordinarily, then claiming those data as their private property for manufacture and for sales, this has turned into a dominant economic paradigm. It’s not only the tech behemoths. It’s the whole tech sector. It’s every single app. Every app now is designed to maximize data collection from you, from your activity, from your behavior. And, of course, it cuts across every industry now.
So, what’s happened is, in these last 20 years, Amy, nobody wants to admit it, but the fact is what we have seen is the wholesale destruction of privacy in the last 20 years. There is no more privacy. And concomitant to that, you know, if you look at a company like Google or you look at a company like Facebook, these folks have accrued extraordinary power. A man like Mr. Zuckerberg, who has absolute control over Facebook, now essentially has absolute control over critical communications infrastructure that involves billions of people. And there is no countervailing force, no countervailing power, when it comes to Mr. Zuckerberg and his decision rights, just as was the case, you know, when we were fighting John D. Rockefeller more than a century ago.
So, we are in a crazy situation. And when you ask the question, “How did we get here? How did these companies — you know, how did we allow them to develop so much power over the central aspects of our society?” the answer is that democracy did nothing. Democracy failed to intervene. Democracy was essentially feeding its own need for human-generated data, data that it couldn’t collect under the Constitution and under the public laws of our country, so it relied on these companies to collect it instead. And, you know, in other respects, our lawmakers were asleep at the switch and allowed this power to grow unaccountably, totalistically, before they realized the threat it was to their own power, because the long game here for surveillance capitalism is to substitute computational governance, governance by algorithms that they control for the sake of their commercial objectives, to substitute that for democratic governance. That is the long game. And there is evidence of that everywhere we look.
In the meantime, this informational chaos that we have been experiencing, misinformation, disinformation, the absence of a public sphere online, because everything online now, all communications online, are essentially owned and operated by surveillance capitalism. And Facebook is at the center of that, because it’s the largest social media space. So, we have this communication space that is no longer a public sphere. A public sphere is the essential context when we talk about free speech. The idea that, you know, Justice Holmes and others have elaborated over the last 100 years is that, you know, free speech is so essential, because in a public sphere, when you have free speech, the best ideas naturally rise to the top. The best ideas naturally get circulated to the most people, and therefore adopted by the most people. So, it’s a kind of good version of survival of the fittest, you know, a positive version of that image.
But the fact is that without a public sphere, we’re stuck inside Facebook. Facebook is an economic machine. And as I mentioned before, it has to get the maximum amount of behavioral data to create great predictions to have lucrative targeting to sell. How does it do that? Well, to get the maximum amount of data you’ve got to get the maximum amount of extraction. And a few years ago, experts, academic researchers, understood that false, inflammatory information circulates more widely and is more widely engaged with than true, good, normal, commonsense information. It’s a little bit like you’re, you know, driving down the highway. You don’t stop to look at a beautiful tree; you stop to look at a horrible accident and back up traffic for 20 miles behind you. So, this is how we’re wired. Let’s just say that for now.
What Facebook learned to do — and others do it, too — is to engineer their machine operations to take advantage of this. So, the worst content, the most inflammatory, destructive, dangerous, false, bizarre, crazy content, the kind of content that in a public sphere it would arise at the fringe and it would stay at the fringe, in a normal society — it’s crazy stuff. There are always going to be people spouting crazy stuff, but they don’t — their ideas don’t rise to the top. They stay at the margins where they belong. You don’t need censorship to achieve that. You just need proper, normal, healthy social dynamics. But inside Facebook, we don’t have that. We have this economic machine that is artificially selecting for amplification and allowing it to take over the center of this private square. And therefore, the craziest stuff overwhelms the good stuff. We have seen this most tragically — we’re all focused on, you know, January 6 and the tragedies and outrageous challenge to democracy and the peaceful transition of power on that day, as well as the horrible death and destruction.
But honestly, Amy, we have to look at the larger picture here, especially COVID disinformation. Columbia did a study. It came out late in the summer of 2020, when 217,000 American daughters and sons had died of COVID. And the question they asked was: How many of those deaths were avoidable? And the answer they came up with, through a very stringent analysis, was between 130,000 and 210,000 of those deaths could have been avoided, Amy. They were unnecessary. This is mass murder.
So, now we say, “Well, what made them unnecessary?” They did an analysis, and they came up with four key reasons for these avoidable deaths. And each of those reasons is related back to disinformation — the discrediting of public health authorities, the discrediting of wearing masks and so forth — four key reasons that produced behaviors that caused all of these avoidable deaths.
Now, let’s put that next to research that came out of Cornell around the same time. And without going into all the details, the bottom line is that Cornell researchers determined that 40% of COVID disinformation that was circulating was produced by one dangerous influencer, and his name was Donald Trump. All right? So, the idea that Mr. Trump should have free access to the global information bloodstream, on Facebook or anywhere else, is an abhorrent, obscene thought. The man has shown his complete inhuman disinterest in life itself, as well as in the very functioning of our democracy, a democracy whose Constitution he swore to uphold. So, president or no, does he deserve to be at the center of the global information bloodstream? Does he deserve to have the amenities of Facebook’s economic machine, that automatically takes his most bizarre, disfigured, disgusting posts and amplifies them to the world, driving extremism and polarization, not only in the United States, but in every country? No, nobody deserves that, given the kinds of, you know, crimes against information and crimes against humanity that are evident here in Mr. Trump’s callous and inhuman behavior. So —
NERMEEN SHAIKH: Well, Shoshana, it wasn’t —
SHOSHANA ZUBOFF: — this is not a free speech issue. Yes. Please.
NERMEEN SHAIKH: But, you know, Shoshana, it’s not just — as you well know, it’s not just Trump and the disinformation campaign around COVID in the U.S. There was disinformation spread in many different languages all over the world via social media platforms like Facebook. But then Facebook pledged to crack down on this disinformation. What exactly happened, if anything?
SHOSHANA ZUBOFF: Well, we know that Facebook’s version of cracking down is not what you and I would think of as cracking down. Let me tell you what cracking down is. A few months ago, the Australian government was implementing a new code that would have required companies like Facebook and Google, that essentially stole content from the newspaper industry in order to publish it in their spaces, and that hastened the demise of the newspaper industry as we knew it — and these companies, you know, never paid anything for it; on the contrary, they got rich on it. So, here comes Australia saying, “Hey, we’re finally going to make you pay for what you stole 15 years ago.”
Now, Google threatened to shut down search in Australia. But instead of making good on that threat, they did a backstage deal with Rupert Murdoch and his news empire — of course, based in Australia — and found a way forward so that the code would not apply to Google. But what did Facebook do? You want to know how Facebook acts when it really wants to? Facebook simply shut down its pages in Australia that had news links. Rather than negotiate with the Australian government, rather than face having to pay for using those news links and content, they simply vanished their pages.
They could have done that with COVID disinformation, because it was killing people. But they did not. Here’s what they did. They put warnings on it. They put labels on it. And there’s research, including internal Facebook research, showing that the labels meant nothing.
Then, again, late summer 2020, Avaaz comes along with a very substantial study. They looked at 82 websites that had the worst COVID disinformation. And those websites were getting massive engagement. Then they selected the top 10 of those websites, the ones with the most views, the biggest websites with the most disinformation. And they looked at that exclusively for April 2020. And what they saw was that those top 10 bad, toxic, disfigured, crazy websites received 300 million views in April 2020, compared to the top 10 public health websites, like the CDC and the WHO, which received 70 million views that same month.
So, when you — and then they dug down another level, and they looked at what was getting those 300 million engagements. And many of those posts that were getting those engagements already had Facebook warnings on them. And another major subset had no warnings on them, even though the fact-checkers had already alerted Facebook to the fact that this was false information, this was disinformation. Turns out only about 16% of the disinformation that was fact-checked actually got a label on it, and then the labels were ignored. So —
AMY GOODMAN: Professor —
SHOSHANA ZUBOFF: — when Facebook wants to act, when Google wants to act, they can act. They can shut it down. They can pull the plug. They can make sure that information never even gets there. They don’t have to shut it down after the fact, after the horses are out of the barn. They can shut it down before it even gets there.
And you know who forced them to do that? The prime minister of New Zealand. After the horrible Christchurch tragedy, she insisted that they shut down the videos of that horrible tragedy before they got posted. And in New Zealand, they did. We know they can choke off content before it enters the global information bloodstream, if they want to.
But again, they have an economic machine to feed, and they had political appeasement. Always the sword of Damocles hanging over Zuckerberg’s head, and he made a devil’s pact, early on, that he was going to appease Trump. And when you understand that Trump was responsible for 40% of this disinformation, because Trump made a calculation right at the beginning of the pandemic, as I’m sure you know — he made a calculation right then that he could either try to actually govern America and fight the pandemic or he could simply say that the pandemic was over, or was going to be over, or that the vaccines were coming any day now, or that we’d all be back in church on Easter, or whatever, and make a lot of people believe that, which was a lot easier than actually making it true.
AMY GOODMAN: So, Professor Zuboff, I wanted —
SHOSHANA ZUBOFF: And so —
AMY GOODMAN: I wanted to ask you more about the role of private data brokers selling private information collected on internet users to the U.S. government and others. This is Democratic Senator Ron Wyden of Oregon, who sponsored the Fourth Amendment Is Not for Sale Act.
SEN. RON WYDEN: There ought to be transparency so that the American people know what kind of surveillance is being conducted on them. The president of the Senate knows about the important vote we had on that amendment that I offered, the bipartisan amendment with Senator Daines, because we ought to get history on whether the government is spying on the browsing history of the American people. So, this is really a critical and growing concern, because we’re all seeing data brokers and others selling people’s data. And it’s especially important that the American people are told if the government is using legal loopholes to get around the law and the warrant requirement of the Fourth Amendment. So I asked Ms. Haines about circumstances in which the government, instead of getting an order, just goes out and purchases the private records of Americans from these sleazy and unregulated commercial data brokers, who are simply above the law.
AMY GOODMAN: So, that was Senator Ron Wyden in January. CNN recently reported the Biden administration is considering working with private firms to conduct domestic surveillance. The proposal by the Department of Homeland Security would allow it to circumvent laws regulating domestic surveillance by doing it not through the government, but through these private companies. As we begin to wrap up, Professor Zuboff, if you could talk about the significance of this and how surveillance capitalism could be most effectively challenged?
SHOSHANA ZUBOFF: Well, you’re absolutely right to play that bite, that Wyden clip. You know, Senator Wyden is correct. The problem is that what he’s saying is 20 years too late, because these companies — and I’m talking — beginning with Google, Google’s model, right from the start, was surveillance. The only way they could get the behavioral data that they needed was to do it secretly; otherwise, people would rebel, and that would put friction into the system, and they couldn’t afford friction. So, from the start, these were surveillance systems.
But they were encouraged by the United States government in that post-9/11 environment. The NSA wanted to do surveillance that it couldn’t do because of American laws, and instead encouraged the surveillance capabilities of these young companies, these young internet companies sprouting up in Silicon Valley, because they knew that as long as the data was collected in their servers, then they in Washington, they in the intelligence agencies, would have a source.
And to tell you just how deeply institutionalized this is, in 2013, the chief technology officer of the CIA spoke at a public conference, open to — a conference open to the public, mostly attended by tech nerds. And what he said at that conference was, quote, “We have to connect the dots. Since you can’t connect dots you don’t have, it drives us into a mode of fundamentally trying to collect everything and hang on to it forever. We take advantage of the massive information streams that have emerged on the planet. And we thank the internet companies, especially Google, Facebook, YouTube, Twitter, Fitbit and the telecoms for making it possible.”
So, this has been the status quo from the start. And this is one of the most pernicious legacies of the war on terror. I call it surveillance exceptionalism. This was another kind of pact with the devil that our government made, you know, in this moment of extreme pressure for total information awareness.
Now fast-forward to our time. Fast-forward to the third decade of the 21st century. We’ve learned a lot. We’ve learned that these kinds of surveillance systems, even under private capital, are on a collision course with democracy. We know that they undermine human autonomy. They take our decision rights. They have completely destroyed privacy. They’ve blown by the Fourth Amendment. Forget about that. And at the same time, they’re changing democracy from the top down, because, as we’ve seen in the situation of Facebook, these companies now have huge asymmetries of knowledge. The difference between what we can know and what can be known about us is now an abyss. This is a whole new dimension of social inequality. And with that asymmetry of knowledge comes asymmetry of power, what they can do to us. And that comes in with the manipulation and behavior modification, that is called targeting.
So, we’re in a situation now where the public is onto this. If you look at the survey data, we’re experiencing now a complete rupture of faith in the American public toward these companies and toward these sectors. We are seeing, finally, our lawmakers beginning to get the bit in their teeth, beginning to be willing to name this economic model and its roots in surveillance — surveillance economics, surveillance capitalism, with a very specific mode of operation and extremely pernicious consequences for people and democracy. In those congressional hearings that we saw on March 24th of this year, for the first time we saw leaders of Congress naming the economic model and naming the unaccountable power that the tech executives have accrued from these economics of surveillance, and saying, “This is no longer going to stand. We are going to curb this unaccountable power that you are exercising over the American people and, indeed, over the people of the world.” So, I believe we are at a crossroads. And all the elements are here for a turning point.
The way we stop surveillance capitalism, to answer your final point there, is that we have to go after the supply and the demand. And by that, I mean we have to challenge the original sin here, which is the secret, hidden, surveillance-based invasion of our personal space, our personal behavior, our personal experience, lifting out information that is then converted into behavioral data, which they then claim as their private property and then turn it into their operations for manufacture and sales. That extraction, not only is it a parasitic form of economics — it doesn’t create jobs, it doesn’t create prosperity, it doesn’t do all the things that we expect from a healthy capitalism — but it is, as we’ve said, undermining democracy at its roots and in its very structure.
So, the second thing that we do — we go after extraction. That’s supply. And then we go after demand. Because where are these predictions of our behavior sold? Well, as I mentioned, it started in these online targeted advertising markets. What are these markets, really? When you just think about it for a minute, what you realize is that these are markets trading in commodity futures of human behavior. These are human futures markets, that whether it’s insurance companies or retailers or finance companies or health companies or retail companies or the advertising industry, they are all in there bidding on predictions of what we will do soon and later, because that turns out to be extremely lucrative information. And the higher they bid, the greater the pressure on these companies to expand and deepen their extraction operations. So, every app is an extraction tool, every so-called smart device, every so-called personalized service. And we’re living now in this whole, you know, pervasive digital architecture of invasion and extraction.
So, we go after those markets, and we simply make them illegal. Why not? We’ve made markets that trade in human beings illegal. We’ve made markets that trade in human organs illegal, because they have predictably dangerous consequences for people in society. We can do the same thing with markets that trade in human futures, because they drive the surveillance society. And the surveillance society is, day by day, suffocating the very possibility of democracy and the very capabilities within each individual citizen that we rely on to even imagine a healthy, functioning, flourishing democracy.
AMY GOODMAN: Shoshana Zuboff, we want to thank you so much for being with us, professor emerita at Harvard Business School, author of the book The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. To see Part 1 of our discussion, you can go to democracynow.org. I’m Amy Goodman, with Nermeen Shaikh. Thanks so much for joining us.