What to do about Facebook, and what not to do

The company is among the largest collectors of humanity's most private information, one of the planet's most-trafficked sources of news, and it seems to possess the ability, in some degree, to alter public discourse

By Farhad Manjoo
Published: Nov 6, 2021

Facebook employees unveil the new name 'Meta' and its logo on the sign in front of Facebook headquarters on October 28, 2021, in Menlo Park, California, after a much-anticipated rebranding of the social media platform.

Image: Justin Sullivan/Getty Images 

One of the most unsettling revelations in the cache of internal documents leaked by former Facebook employee Frances Haugen has been just how little we know about Facebook, and consequently how unprepared our political culture is to do anything about it, whatever it is.

That’s the first problem in fixing Facebook — there isn’t much agreement about what, exactly, the problem with Facebook is. The left says it’s Facebook’s amplification of hate, extremism and misinformation about, among other things, vaccines and the last presidential election. President Joe Biden put it bluntly this summer: “They’re killing people.”

Former President Donald Trump and others on the right say the opposite: Social media giants are run by liberals bent on silencing opposing views. In a statement last week, Trump called Mark Zuckerberg, Facebook’s founder, “a criminal” who altered “the course of a Presidential Election.”

Beyond concerns about the distortion of domestic politics, there are a number of other questions about Facebook, Instagram and WhatsApp — all of which, Zuckerberg announced last week, are now under a new corporate umbrella called Meta. Is Instagram contributing to anxiety and body-shaming among teenagers? Are Facebook’s outrage-juicing algorithms destabilizing developing countries, where the company employs fewer resources to monitor its platform than it does in its large markets? Is Facebook perpetuating racism through biased algorithms? Is it the cause of global polarization, splitting societies into uncooperative in-groups?


Inherent in these concerns is a broader worry — Facebook’s alarming power. The company is among the largest collectors of humanity’s most private information, one of the planet’s most-trafficked sources of news, and it seems to possess the ability, in some degree, to alter public discourse. Worse, essentially all of Facebook’s power is vested in Zuckerberg alone. This feels intolerable; as the philosopher Kanye West put it, “No one man should have all that power.”

So, what to do about all this? In the past few days I asked more than a dozen experts this question. Here are some of their top ideas, and what I think about them.

Break it up

Under the tech-friendly Obama administration, the Justice Department and the Federal Trade Commission allowed Facebook to swallow up quick-growing potential rivals.

Splitting Facebook into three or more independent companies would undo that regulatory misstep and instantly reduce Zuckerberg’s power over global discourse.

It could also improve the tenor of social media, as the newly independent networks “would compete with each other by differentiating themselves as better and safer products,” said Matt Stoller, director of research at the American Economic Liberties Project, an anti-monopoly advocacy group.

Still, as Stoller notes, a breakup might be a necessary measure, but it’s hardly sufficient; competition notwithstanding, after a split we’d be left with three networks that retain Facebook’s mountainous data and its many corporate pathologies.

The breakup plan also faces steep hurdles. Over the last few decades, American antitrust law has grown fecklessly friendly to corporations, and it is unclear how quickly that drift could be reversed. In June, a federal judge threw out sprawling antitrust cases against Facebook brought by the FTC and 40 states, saying they had failed to prove that Facebook is a social media monopoly.

Place limits on its content

Imposing rules for what Facebook can and cannot publish or amplify has been a hot topic among politicians. Democrats in Congress have introduced proposals to police misinformation on Facebook, while lawmakers in Texas and Florida have attempted to bar social media companies from kicking people off for speech offenses, among them Trump.

As I wrote last week, these policies give me the creeps, since they inevitably involve the government imposing rules on speech. Just about all of them seem to violate the First Amendment.

Yet bizarrely, content rules have become the leading proposals for fixing Facebook; repeal of Section 230 of the Communications Decency Act — which limits tech platforms’ liability for damages stemming from content posted by users — is often described as a panacea.

Among the many ways to address Facebook’s ills, speech rules are the least palatable.

Regulate ‘surveillance capitalism’

Here is a seemingly obvious way to cut Facebook off at the knees: Prohibit it from collecting and saving the data it has on us, thereby severely hampering its primary business, targeted advertising.

The rationale for this is straightforward. Imagine we determine that the societal harms generated by “surveillance capitalism,” Harvard professor Shoshana Zuboff’s aptly creepy label for the ad-tech business, pose a collective danger to public safety. In other industries whose products carry comparable risks — automobiles, pharmaceuticals, financial products — we mitigate harms through heavy regulation; the digital ad industry, meanwhile, faces few limits on its conduct.

So let’s change that. Congress could impose broad rules on how ad behemoths like Facebook and Google collect, save and use personal information. Perhaps more important, it could create a regulatory agency with resources to investigate and enforce the rules.

“At a minimum,” said Roger McNamee, an early Facebook investor who is now one of its most vocal critics, regulators should ban second- and third-party uses of the most intimate data, “such as health, location, browser history and app data.”

Privacy rules are one of the primary ways European regulators have attempted to curb social media’s effects. So why don’t we hear more about this approach in America?

I suspect it’s because this is a bigger-than-Facebook solution. All the tech giants — even Apple, which has criticized the digital ad business’s hunger for private data — make billions of dollars from ads, and there are lots of other companies that have grown dependent on ad targeting. When California attempted to improve consumer privacy, corporate lobbyists pushed to get the rules watered down. I worry that Congress wouldn’t fare much better.

Force it to release internal data

Nathaniel Persily, a professor at Stanford Law School, has a neat way of describing the most basic problem in policing Facebook: “At present,” Persily has written, “we do not know even what we do not know” about social media’s effect on the world.

Persily proposes piercing the black box before we do anything else. He has written draft legislation that would compel large tech platforms to provide to outside researchers a range of data about what users see on the service, how they engage with it, and what information the platform provides to advertisers and governments.

Rashad Robinson, president of the civil rights advocacy group Color of Change, favored another proposed law, the Algorithmic Justice and Online Platform Transparency Act. It would also require platforms to release data about how they collect and use personal information about, among other demographic categories, users’ race, ethnicity, sex, religion, gender identity, sexual orientation and disability status, in order to show whether their systems are being applied in discriminatory ways.

Tech companies savor secrecy, but other than their opposition it’s difficult to think of many downsides to transparency mandates. Even if we do nothing to change how Facebook operates, we should at least find out what it’s doing.

Improve digital literacy

Renée DiResta, technical research manager at the Stanford Internet Observatory and a longtime scholar of the anti-vaccine movement’s digital presence, described one idea as “unsexy but important”: Educating the public to resist believing everything they see online.

This is not just a matter for schools; some of the most egregious amplifiers of online mendacity are older people.

What we need, then, is something like a society-wide effort to teach people how to process digital information. For instance, Mike Caulfield, an expert on digital literacy at the University of Washington, has developed a four-step fact-checking process called SIFT: stop; investigate the source; find better coverage; trace claims to their original context. Once Caulfield’s process becomes ingrained in his students, he has said, “we’re seeing students come to better judgments about sources and claims in 90 seconds than they used to in 20 minutes.”

Do nothing

In his new book, “Tech Panic: Why We Shouldn’t Fear Facebook and the Future,” Robby Soave, an editor at Reason magazine, argues that the media and lawmakers have become too worked up about the dangers posed by Facebook.

He doesn’t disagree that the company’s rise has had some terrible effects, but he worries that some proposals could exacerbate Facebook’s dominance — a point with which I agree.

The best remedy for Facebook, Soave told me in an email, is to “do nothing, and watch as Facebook gradually collapses on its own.”

Soave’s argument is not unreasonable. Once-indomitable tech companies have fallen before. Facebook still makes lots of money, but it has lost consumers’ trust, its employees are upset and leaking left and right, and because most of its popular products came to it through acquisitions — which regulators are likely to bar in the future — it seems unlikely to innovate its way out of its troubles.

I don’t agree with Soave that we should do absolutely nothing about Facebook. I would favor strong privacy and transparency rules.

But Soave will probably get what he wants. As long as there’s wide disagreement among politicians about how to address Facebook’s ills, doing nothing might be the likeliest outcome.

©2021 New York Times News Service
