Following days of silence about the political data firm Cambridge Analytica’s alleged misuse of 50 million Facebook users’ data, CEO Mark Zuckerberg is finally speaking out.
On Wednesday, Zuckerberg took to Facebook to acknowledge what has become a deeply damaging scandal surrounding the company’s ability, or lack thereof, to protect its users’ data. He also announced changes to the platform designed to protect users’ data. The changes come in the wake of reports by The New York Times along with The Guardian and The Observer, alleging that Cambridge Analytica, a vendor to President Trump’s 2016 campaign, and Cambridge’s British counterpart SCL had harvested a trove of Facebook user data from a University of Cambridge researcher, and may have kept it, despite promises to Facebook that all of the data they received from the researcher had been deleted in 2015. By Tuesday, Facebook’s market value had dropped by nearly $50 billion, as members of Congress stepped up their calls for Zuckerberg to come to Washington to testify.
Zuckerberg didn’t mention in his Facebook post why it took him five days to respond to the scandal, but he accepted responsibility for what he called “a breach of trust between Facebook and the people who share their data with us and expect us to protect it.”
“I started Facebook, and at the end of the day I’m responsible for what happens on our platform,” Zuckerberg wrote. “I’m serious about doing what it takes to protect our community. While this specific issue involving Cambridge Analytica should no longer happen with new apps today, that doesn’t change what happened in the past.”
Zuckerberg announced that Facebook plans to audit any apps that were able to access large amounts of information the way the University of Cambridge researcher did. Facebook will ban apps that don’t agree to an audit. “If we find developers that misused personally identifiable information, we will ban them and tell everyone affected by those apps,” Zuckerberg wrote, adding that this includes the people whose data was leaked to Cambridge Analytica. It’s unclear what level of detail people whose data has been abused will receive and exactly how Facebook will alert them. Facebook’s earlier tool for telling users whether they had followed a Russian troll account, for instance, was well hidden.
Facebook also announced plans to introduce new restrictions on user data. App developers, for instance, will no longer be able to access data on users who haven’t used their apps in the last three months. The company will release only users’ names, profile photos, and email addresses to app developers when they sign up. To access posts or other data, developers will have to get further approval from users and also sign a contract. Finally, Facebook is making its tool that allows users to see which apps can access their data more visible, by moving it to the Facebook homepage.
The crisis over Cambridge Analytica’s alleged misdeeds snuck up on Facebook’s leadership, though the company has made several last-minute efforts at damage control. On Friday night, hours before the news broke, Facebook attempted to preempt the bad press by announcing it had suspended Cambridge Analytica and SCL from the platform. It also suspended the accounts of Chris Wylie, the former Cambridge employee who blew the whistle on the companies, and Aleksandr Kogan, the University of Cambridge researcher who provided the data to Cambridge and SCL.
In 2014, SCL commissioned Kogan to conduct what it called a “large scale research project” to psychologically profile Americans. Kogan did this by creating an app that offered users personality quizzes. Some 270,000 people downloaded the app, thereby handing over their own data to Kogan and his client, SCL. But due to a feature called the Social Graph API, which Facebook offered to app developers at the time, Kogan was also able to collect granular data about everyone those users were friends with on Facebook, totaling 50 million users altogether. This was a feature widely used by app developers at the time, including the team behind President Obama’s 2012 campaign. Facebook only completely shut down this capability to access data from users’ friend networks in mid-2015.
The problem is that the data Kogan collected was supposed to be Kogan’s alone. In passing that data on to SCL and Cambridge, he violated Facebook’s terms. What’s more, the companies may not have adhered to Facebook’s demands to delete the data. Multiple sources confirmed to WIRED that the data was visible to a select group of Cambridge employees as recently as early 2017. Both Cambridge and SCL deny these claims, and maintain they deleted the data as soon as Facebook brought it to their attention. “Cambridge Analytica and SCL Elections do not use or hold Facebook data,” a joint statement from the companies read.
The groundswell of outrage and attention following these revelations has been greater than anything Facebook predicted—or has experienced in its long history of data privacy scandals. By Monday, its stock price had nosedived. On Tuesday, Facebook shareholders filed a lawsuit against the company in San Francisco, alleging that Facebook made “materially false and misleading statements” that led to significant losses this week.
Meanwhile, in Washington, a bipartisan group of senators called on Zuckerberg to testify before the Senate Judiciary Committee. And the Federal Trade Commission also opened an investigation into whether Facebook had violated a 2011 consent decree, which required the company to notify users when their data was obtained by unauthorized sources. Members of Congress seemed less than satisfied with Zuckerberg’s post. In a tweet immediately after, US senator Ed Markey (D-Massachusetts) wrote, “You need to come to Congress and testify to this under oath.” Zuckerberg’s statement did not address whether he plans to testify.
Cambridge Analytica and SCL weren’t just any unauthorized sources, either. A series of undercover videos filmed by the British news network Channel 4 News showed Cambridge’s leaders bragging about a variety of dirty tactics they use on behalf of their clients, from spreading fake news to using women to entrap politicians. The executives, including then-CEO Alexander Nix, chief of data Alexander Tayler, and managing director of SCL elections Mark Turnbull, believed they were speaking with a fixer for a wealthy benefactor who wanted to influence Sri Lankan elections. The fixer was actually an undercover reporter.
Cambridge Analytica has released a statement saying the videos are selectively edited and misleading. Nix, who makes some of the most damning claims in the video, said he was merely humoring what he thought was a potential client. “I am aware how this looks, but it is simply not the case,” Nix said in a statement Monday. “I must emphatically state that Cambridge Analytica does not condone or engage in entrapment, bribes or so-called ‘honeytraps,’ … nor does it use untrue material for any purpose.”
A day later, Cambridge Analytica’s board suspended Nix as CEO, putting Tayler in his place, pending further investigation.
As these troubling details came to light, the scandal became about something greater than Facebook’s failure to secure user data; it was about how that data may have been misused by a company that openly brags about unethical, and in some cases illegal, behavior. By Wednesday, Zuckerberg’s silence had grown deafening.
To say this is all messier than Facebook initially knew would be a drastic understatement. Cambridge Analytica is merely an imperfect poster child for more fundamental problems at Facebook—and in the tech industry at large. Yes, Facebook failed to alert its users about the unauthorized access of their data by Cambridge and SCL. But Facebook itself only learned about that access because Cambridge Analytica happened to be a high-profile vendor for the Ted Cruz campaign, and The Guardian wrote about it in December 2015.
The far more pressing problem is that Facebook had no way of knowing how many other times user data was abused and mishandled in the past—now, at least, it’s trying to find out.