We have a plan to regulate Facebook

The ‘drumbeat for accountability’ has become too loud to ignore

By Paul M. Barrett

Special to The Examiner

Facebook has gone “Meta.” But whatever founder and chief executive Mark Zuckerberg calls the world’s largest social media company, it continues to amplify misinformation, hatred and divisiveness. U.S. lawmakers and regulators need to act urgently to reduce the spread of these and other forms of harmful online content.

Courtesy of whistleblower Frances Haugen, we now have much more evidence of the negative consequences of Zuckerberg’s products, particularly around the spread of hateful speech and the intensification of political polarization. We also know that the company’s own researchers have identified these consequences, but executives often have failed to act and publicly minimized the dangers. According to Sen. Richard Blumenthal (D-CT), Haugen’s revelations have contributed to “a building drumbeat for accountability.”

Facebook knows that it is intensifying political sectarianism. “Our algorithms exploit the human brain’s attraction to divisiveness,” a slide from a 2018 internal presentation stated. Without changes, the presentation added, the company’s automated systems would steer users to “more and more divisive content in an effort to gain user attention & increase time on the platform.”

Zuckerberg and his top lieutenants have challenged the recent revelations as exaggerated, taken out of context and designed to damage the company. But the Federal Trade Commission takes the Haugen documents seriously. The FTC has launched an investigation into whether Facebook’s failure to disclose what it knew about its harmful effects constitutes a violation of U.S. consumer protection law, which prohibits “unfair or deceptive” commercial practices. The FTC’s welcome response points toward a broader regulatory agenda for a social media industry that to date has escaped any sustained government oversight. The time has arrived to devise a plan for regulation.

First, Congress should increase funding for the chronically cash-strapped FTC, so that the agency can hire more technologists and experienced investigators capable of understanding how the algorithms work — and periodically change — at Facebook, YouTube, Twitter and other influential platforms. President Biden’s pending social safety net bill would allocate $500 million for the FTC to expand its protection of online privacy and data security; assuming that the provision survives, some of this money should support this type of oversight.

Second, lawmakers should enhance the FTC’s consumer protection authority, as outlined in recommendations published earlier this year by a working group led by New York University’s Stern Center for Business and Human Rights and the Mossavar-Rahmani Center for Business and Government at Harvard Kennedy School. Specifically, Congress should require large social media companies to collaborate with the FTC in developing voluntary industry-wide standards of conduct, which the agency would have the power to enforce. Lawmakers would need to modify the industry’s existing liability shield under Section 230 of the Communications Decency Act to ensure the viability of FTC enforcement actions.

The industry standards should oblige the companies to reveal how their currently secret automated systems rank, recommend and remove content. Greater transparency would allow Congress, regulators, academic researchers and the public to better assess social media’s positive and negative effects on society. Other industries, including chemicals, food and energy, make mandatory safety disclosures to the government, and social media should be no different.

Beyond transparency, the industry standards should set limits on the prevalence of various forms of harmful content, meaning how often users actually encounter hate speech, harassment and the like — a measurement that Facebook itself has endorsed. Failure to reduce the prevalence of pernicious posts would lead to publicly announced penalties. The standards should also, for the first time, establish minimum protections of user privacy, an area where the FTC has already been active — for example, when it imposed a one-off $5 billion penalty against Facebook in 2019 in the wake of the Cambridge Analytica scandal.

Thinking even more ambitiously, Congress should consider forming a fully independent Digital Platform Agency. Doing so would underscore the importance of overseeing powerful and influential social media companies, much as creation of the Federal Communications Commission and Securities and Exchange Commission in the 1930s signaled the urgent need for oversight of broadcast media and equity markets.

Lawmakers already have introduced a number of bills containing elements of this proposal. To be sure, significant complications remain — chiefly, concerns about the government making content decisions, which the First Amendment expressly bans. But judicious legislative drafting should make it possible to avoid official intrusion on free speech. The drumbeat for accountability in the social media industry has become too loud for Congress to ignore.

Paul M. Barrett is deputy director of the NYU Stern Center for Business and Human Rights and lead author of a recently published white paper entitled, Fueling the Fire: How Social Media Intensifies Political Polarization in the U.S. — And What Can be Done About It.
