For those paying attention to privacy scandals and data leaks over the last few years, Facebook has emerged as a leading culprit. From its 20-plus scandals of 2018 (yes, someone actually counted) to an actual spyware app the company paid users to download to downright disastrous security practices exposed earlier this year, the company can’t seem to get much right.
Which is why the recent post in which Facebook CEO Mark Zuckerberg proposes four new ideas to regulate the internet (an op-ed the Washington Post also ran) should come as no surprise. That Zuckerberg might fall short, yet again, in offering meaningful changes in the way his company collects, stores, or analyzes its users’ data is to be expected. Broadly speaking, Zuckerberg’s latest proposals are mostly superficial when compared to the scale and the scope of the problems Facebook is confronting.
But if there is one occurrence as consistent as Facebook’s privacy shortcomings, it may just be the company’s public insistence that it will right the ship — that after all the letdowns, Facebook has finally started to make amends. And it is perhaps this act of ritual insistence on change that makes Zuckerberg’s latest attempt worthy of dissection.
So what does Zuckerberg propose?
To start with, he asks that governments clarify what counts as harmful content online, so Facebook can better take that content down. Then Zuckerberg requests that laws governing political advertising, which are often focused narrowly on elections, be expanded to general political interference.
Third is a proposal to standardize privacy regulations across the world, making it clearer (and therefore easier) for companies like Facebook to apply the stringent standards of European regulations like the General Data Protection Regulation to all users.
Last is the appeal to what’s called “data portability,” which would make it easier for users to move their data from one service to another — similar to how cell phone carriers in the United States allow users to keep their phone numbers when they switch services.
So what’s the problem?
To start with, a good portion of these proposals are already occurring in practice or will soon be mandated by regulators in large parts of the world. The GDPR, for example, already requires data portability in the EU, the second largest economy in the world. What’s more, major regulations in top economies such as Germany, China, and, most recently, Australia are already forcing tech giants to increase their investments in taking down content that might be harmful. Much of what Zuckerberg is proposing is, in short, already under way in one form or another.
More broadly, if Facebook has made significant mistakes in the past with its users’ data, you wouldn’t know the depth of these missteps from Zuckerberg’s post. And yet a key aspect of atonement is sacrifice — a demonstration that one is willing to forgo some future benefit to make up for past sins. These proposals contain none of that. Much of this might, in fact, arguably help Facebook in the long run.
What’s so bad about helping Facebook?
Therein lies the problem. The hard truth is that Facebook’s own interests diverge — in some cases, wildly — from those of its users due to three major predicaments.
First is Facebook’s business model, which rests on the need to keep consumers engaged in its services on the one hand and the need to monetize the data it gathers by targeting those users with new services and advertising on the other.
Time. Attention. Data. If you are a consumer, that’s what Facebook wants out of you. And yet users generally don’t seek out Facebook’s services with an explicit sense of the scale of the data they’re giving up; nor are they fully aware that what tech companies often call “stickiness” is, in practice, more like addiction. Instead, Facebook’s users come to the service looking for meaningful social connections, news, and entertainment. That’s what allows the company to make sweeping claims like its 2017 pledge to create a “social infrastructure…to build a global community that works for all of us.” Language like that masks the core transaction Facebook requires from its users: your time and your data for our services.
Second is Facebook’s scale, which has thrust upon the company an enormous responsibility that even Zuckerberg now admits is unsustainable. As of December, the company boasted 2.32 billion monthly active users, almost one in three people on the planet. That same month, the company employed a mere 35,587 employees, a ratio of roughly one employee for every 65,000 users. How can a company so small effectively govern and protect such a large digital environment? The answer is it cannot. Massive failures — related to cybersecurity, privacy, propaganda, and more — are simply inevitable at these scales.
Last is a cultural problem, which explains Facebook’s consistent but needless privacy missteps. From unwarranted requests for sensitive data to audacious violations of user privacy, Facebook as a whole has simply not prioritized the security or privacy of its users for reasons that can only be ascribed to culture — perhaps a rush to get new features to market, or an overly idealistic sense that the company could do no wrong, to paraphrase its COO Sheryl Sandberg.
These unforced errors, in turn, have steadily eroded the trust it would take for the company to fix either of the first two problems.
To be clear, if we’ve arrived at a collective moment of digital discomfort, Facebook is not alone and should not bear all the blame. While some of these issues are unique to Facebook, every major technology company is struggling with some form of them.
For years, consumers and regulators alike failed to appreciate the central trade-off that many tech giants forced on their users. That’s why, referring to the pro-democracy protests that swept the Arab world in 2010, publications like the New York Times would flatly state the “Egyptian revolution began on Facebook.” Social media companies were hailed as great forces for good, both by consumers and by western governments, many of whom used the same flowery language as Facebook to describe the benefits of social media.
It’s only recently that we’ve begun to appreciate all the risks inherent in our adoption of digital technologies. From the growing range of internet-connected objects to our nearly complete reliance on software in finance, aviation, and many other areas, the range of threats to our privacy and our security have become systemic.
In that light, the predicament Facebook is caught in — its relentless attempts to assure the public that it can realign its business model to its users’ interests — is not Facebook’s problem alone. It’s a symptom of a society that adopted a technology too quickly, without understanding its downsides or its risks, as I’ve written about in HBR and elsewhere.
The trick lies in what Facebook and all of us can do next. For consumers and regulators, the answers are starting to become clear. Carefully craft new legislation that increases the privacy and security standards of all software systems, which will in turn slow the pace of adoption of digital technology. Diminish the vast power of companies like Facebook by limiting their ability to hoard the data they collect and to aggregate the services they provide, contracting their “attack surface,” so to speak, to a level that is manageable. This might mean literally disaggregating Facebook’s services and physically separating businesses like WhatsApp from others like Instagram and more.
Over the long term, Facebook’s business model must evolve to center around trust, which means making user privacy and data security as important as monetization. Without one, Facebook will not be able to sustain the other.
In the short term, however, the company is far from that target. And Zuckerberg and others have made it clear that, despite their growing appeals to governments, the company has yet to truly grapple with the depth of the problems that it — and we — are confronting.
Facebook and its users alike seem destined to continue their attempts to reframe the bargain they’ve made with each other — a process that might drag on until one or many governments more forcibly step in. Zuckerberg’s latest proposals merely form another small episode in a much longer struggle about that reframing.