Signing up for a Facebook account, or any free online service, comes with an implicit bargain: Use it as much as you want—check your News Feed, like a status, poke a friend—and in return, the company will collect your data, and use it to serve you ads both on Facebook and around the web. But what appears to be a simple exchange has become anything but.
This is not a screed about deleting your Facebook account—although if you want to, here’s how. It’s not a rant about online ads. It is an argument, though, that Facebook has been a poor steward of your data, asking more and more of you without giving you more in return—and often not even bothering to ask. It has repeatedly failed to keep up its side of the deal, and expressed precious little interest in making good.
By now you’ve likely heard of Cambridge Analytica, a company that provided data services to Donald Trump’s 2016 presidential campaign. More recently, it has entered the spotlight for having pilfered the data of 50 million Facebook users. Cambridge obtained that data from a researcher named Aleksandr Kogan, who developed a quiz app in 2013 that collected information not only from the 270,000 people who downloaded the app, but from many of their friends as well. When Kogan passed this information along to Cambridge, it was in violation of the social media company’s terms of service.
In 2014, Facebook cut off the third-party developer access that swept tens of millions of people in that particular net. But while the company says it discovered the incident in 2015, it took until this past weekend, after the publication of two deeply reported stories from The Guardian and The New York Times, for Facebook both to disclose it and to suspend Cambridge and Kogan from its platform.
‘They’re giving us free stuff, but I deserve to know what the bargain is.’
Nuala O’Connor, CDT
And even then, Facebook lacked transparency. In announcing the suspension, the company seemingly downplayed the scope of the issue by noting that the 270,000 users who downloaded Kogan’s app had been affected. It added that “friends who had their privacy settings set to allow it” were also impacted, but failed to note just how significant that impact was. And rather than substantively engage with why Facebook had not offered better safeguards against Cambridge’s actions, the social media company’s executives engaged in a semantic debate on Twitter over whether the incident counted as a “breach.”
In response to an inquiry from WIRED, Facebook pointed to the company’s Friday post about suspending Cambridge Analytica. “In 2014, after hearing feedback from the Facebook community, we made an update to ensure that each person decides what information they want to share about themselves, including their friend list,” wrote deputy general counsel Paul Grewal. “Before you decide to use an app, you can review the permissions the developer is requesting and choose which information to share. You can manage or revoke those permissions at any time.”
But the focus on third-party apps misses the larger issues at play. “Facebook continually pushes the envelope with regards to user privacy,” says Sam Lester, consumer privacy fellow at the Electronic Privacy Information Center. “They have data on almost every American, and they try to extract maximum value out of that data.”
Supporting evidence isn’t hard to come by; in fact, in 2011 the Federal Trade Commission imposed a legally binding consent decree against Facebook over its failure to keep its privacy promises, which critics argue has gone largely unenforced.
At the very least Facebook continues to push the boundaries of consent. In 2014, it implemented a controversial test in which it attempted to manipulate the emotions of its users through News Feed. In 2016, two years after acquiring WhatsApp, Facebook changed the encrypted chat app’s terms of service to reap the phone numbers and various analytics of users with accounts on both services, giving only a 30-day opt-out window. And if you approved its use of face-recognition technology five years ago, Facebook automatically applied that preference to a host of new face-recognition features it rolled out in December, only notifying users last month that they might want to check their settings.
Beyond those more public flare-ups, though, a certain opacity permeates Facebook’s offerings, and many of the company’s users have no idea the extent to which their information is used. “Transparency is fundamental,” says Nuala O’Connor, president and CEO of the Center for Democracy & Technology, a nonprofit focused on online civil liberties. “I don’t want to know the ones and zeroes of every algorithm. That’s an unreasonable burden on the individual. But I do deserve to know the outcome of this data. They’re giving us free stuff, but I deserve to know what the bargain is.”
Facebook didn’t invent data collection, and is far from the only company that partakes. It has to pay the bills somehow. And in a post Monday afternoon, Facebook VP Adam Bosworth argued that protecting your privacy is in the company’s best interests. “Yes developers can receive data that helps them provide better experiences to people, but we don’t make money from that directly and have set this up in a way so that no one’s personal information is sold to businesses,” Bosworth wrote. “If people aren’t having a positive experience connecting with businesses and apps then it all breaks down. This is specifically what I mean when we say our interests are aligned with users when it comes to protecting data.”
One minute you’re filling out an app survey; the next, your answers are informing the psychographically targeted ads of a political campaign. No one signed up for that.
“As with Google, Facebook’s business has been built on data and the ‘contract’ they’ve struck with their users for access to all of their data in exchange for offering free services,” says Jason Kint, CEO of Digital Content Next, a media trade association that focuses in part on improving online ads. “This model has clearly eroded consumer and publisher trust as both have become more wise to the dangers of leaving it to Facebook to use the data how they see fit.”
A Bad Deal
For many, Facebook and its products—Messenger, WhatsApp, Instagram—serve as a utility. (In some countries, Facebook’s Free Basics program effectively means it is the internet.) But it’s worth asking: What has it given you in return?
‘They have data on almost every American, and they try to extract maximum value out of that data.’
Sam Lester, EPIC
Your mileage here will vary. But as Facebook collects more and more data, and offers advertisers more and more tools to monetize it, the benefit to you seems not to have grown in kind. You get an ever-shifting algorithm designed to keep you scrolling, which the company’s own research suggests can leave you “feeling worse afterward.” You get dozens of Russian propagandists flooding millions of News Feeds with high-emotion content designed to undermine US democracy, with slow and incomplete disclosures about the impact. And you get ads for the same pair of shoes—that you already bought—trailing you for months.
In fairness, Facebook does regularly remind users to check their privacy settings. (If you haven’t done so in a while, take the time to now!) And if you don’t like it, you can always just delete your account. But privacy advocates say it’s unfair to put the onus on the user.
“As consumers, social media is the way we interact with public life,” says EPIC’s Lester. “It’s frankly not a reasonable bargain to say that users either need to withdraw from the internet or spend half of their life scouring through privacy policies that nobody understands, that are hidden in ways that are almost impossible to find.”
Asking what Facebook users can do to protect themselves, Lester says, is akin to asking what drivers could do to protect themselves in a car before seat belts became standard.
The good news is, some version of a data privacy seat belt may be in the offing. The European Union’s General Data Protection Regulation will require transparency from companies about what kind of data they collect, and how it will be used. And while no such law seems imminent stateside, the attorney general of Massachusetts announced an investigation into Facebook and Cambridge Analytica that could at least shed more light on what took place. Senator Ron Wyden followed up Monday with a detailed series of questions for Facebook to answer.
In the meantime, Facebook users need to ask themselves very seriously exactly what kind of bargain they’ve struck—and how long they’re willing to put up with Facebook changing the terms.