For years, Facebook employees have identified serious harms and proposed potential fixes. CEO Mark Zuckerberg, pictured in 2019, and COO Sheryl Sandberg have rejected the remedies, causing whistleblowers to multiply. (Eric Thayer/New York Times)

Facebook’s problems at the top: Social media giant is not listening to whistleblowers

Whistleblowers multiply, but Zuckerberg and Sandberg don’t heed their warnings

By Paul M. Barrett

Special to The Examiner

The whistleblower whose leaks of internal Facebook documents have made for a riveting — and ongoing — series of exposés by The Wall Street Journal is far from the first former employee to shed unflattering light on the inner workings of the world’s largest social media company.

Facebook has produced a veritable parade of ex-workers sounding the alarm about what they portray as a deeply flawed corporate culture. The flaws matter because they make Facebook a 3-billion-user platform not only for family vacation tales and cat photos, but also for disinformation, divisiveness and hatred.

The still-anonymous whistleblower behind the Journal revelations has proffered hard evidence of a troubling pattern: Employees identify a serious harm caused — often inadvertently — by Facebook’s policies or automated systems. In-house data scientists and engineers propose potential fixes.

But then, top management, sometimes with the involvement of CEO Mark Zuckerberg, rejects the remedies, presumably because they threaten the company’s top priorities: increasing both its user count and the amount of time users spend on the site, liking, sharing and commenting. Not coincidentally, the advertisers that provide nearly all of Facebook’s revenue care a great deal about user volume and engagement.

The whistleblower’s document dump has illustrated this pattern in a variety of contexts: an algorithm revision meant to foster interaction among friends and family that instead heightened hostility and anger; internal research showing Facebook’s awareness of the harm that its Instagram photo-sharing site does to image-conscious teenage girls; and employee alerts to their bosses that, around the world, Facebook is exploited by criminal groups ranging from drug cartels to human traffickers.

In a corporate blog post, Nick Clegg, Facebook’s vice president of global affairs, fired back that the Journal articles “have contained deliberate mischaracterizations of what we are trying to do, and conferred egregiously false motives to Facebook’s leadership and employees.”

But the Journal’s source is hardly alone in bringing forward disturbing insights about Facebook. Last year, fired data scientist Sophie Zhang dropped a 6,600-word memo on her way out the door, in which she described how executives ignored or were slow to respond to evidence that governments and political parties from Azerbaijan to Honduras use fake Facebook accounts and other forms of deception to manipulate their citizenries.

Ashok Chandwaney, an engineer who worked at Facebook for five years, explained his September 2020 resignation in a letter posted on an internal message board. “I’m quitting,” he wrote, “because I can no longer stomach contributing to an organization that is profiting off hate in the U.S. and globally.” Chandwaney said the company did too little to counter racism and disinformation. “As a result of the company’s obsession with its growth, so many things go wrong,” Adin Rosenberg, another engineer who left last year, said in a farewell post.

Brian Boland, a former vice president for partnerships strategy, quit in November 2020 because of his growing frustration over the company’s resistance to grappling with phenomena like the viral distribution of misinformation about vaccination. Senior management, Boland told CNN, “seems to be more about ‘let’s avoid the story’ or ‘let’s control the narrative,’ rather than ‘let’s do the hard thing.’”

So, what is it about Facebook’s corporate culture that produces this dissent? Three elements stand out:

First, Zuckerberg has fostered a myth that Facebook’s “whole mission” is “bringing the world closer together.” To be fair, the company does help a multitude of people in many countries communicate, organize and educate. But it also intensifies efforts to oppress religious and ethnic minorities and incite violence, as has happened with Facebook and its WhatsApp messaging service in India, Sri Lanka, Ethiopia and other places. The company’s unwillingness to meaningfully acknowledge and address this dissonance is alienating a growing number of employees who otherwise appreciate the intellectual challenges and generous compensation that come with a Facebook ID badge.

What’s more, there’s a strain of thinking among some in the senior ranks that the loftiness of Facebook’s supposed mission justifies a degree of collateral damage. “So we connect more people,” Andrew “Boz” Bosworth, a member of Zuckerberg’s inner circle, wrote in a notorious 2016 memo. “That can be bad if they make it negative. Maybe it costs a life by exposing someone to bullies. Maybe someone dies in a terrorist attack coordinated on our tools.” He added: “The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is de facto good.”

On Sept. 22 of this year, Zuckerberg named Bosworth the company’s chief technology officer, with authority over the much-ballyhooed “metaverse” project, which aims to combine physical, augmented and virtual reality. “I’m excited about the future of this work under Boz’s leadership,” Zuckerberg said in a corporate blog post.

A second unsettling aspect of Facebook’s management culture is an apparent allergy to candor. Days after the Jan. 6 insurrection at the U.S. Capitol, Zuckerberg’s No. 2, Chief Operating Officer Sheryl Sandberg, was asked by an interviewer whether Facebook had moved swiftly enough to cut off Trump supporters who used the platform to incite and organize the violence. Sandberg deflected: “I think these events were largely organized on platforms that don’t have our abilities to stop hate, don’t have our standards and don’t have our transparency,” she said.

Soon thereafter, a raft of criminal indictments alleged that members of the right-wing Oath Keepers militia and other rioters used Facebook and Facebook Messenger to discuss logistics beforehand and coordinate the invasion of the Capitol as it happened. Sandberg, it seems, was badly misinformed about her own company’s role, and she never subsequently acknowledged the misleading nature of her comments.

In a similar vein, Facebook has insisted for years that it treats all of its users equally, removing content and, occasionally, canceling accounts when its “community standards” are violated. But that’s not true. As documents provided by the Journal’s anonymous whistleblower demonstrate, the company has maintained a program called “XCheck,” which allowed millions of politicians, celebrities and other VIPs to avoid sanctions and, in some cases, spew harassment and incitement of violence. “We are not actually doing what we say we do publicly,” an internal review of the situation concluded in 2019.

A third problem is that Facebook’s leaders appear to operate on the assumption that the company can get away with almost anything, as long as it eventually apologizes and promises to do better. In March 2018, United Nations investigators declared that by spreading religious hatred over a period of years, Facebook played a “determining role” in the ethnic cleansing of Myanmar’s minority Rohingya Muslims by the country’s military. As many as 10,000 Rohingya were killed, and more than 800,000 fled to neighboring Bangladesh.

Facebook executives eventually acknowledged some responsibility for the devastation and took remedial steps like hiring Burmese-speaking content moderators. But by then, of course, it was too late for the Rohingya. In search of user growth, Facebook has expanded into volatile countries around the world before it had a sure grasp on local languages and cultures, let alone on all the ways its services could be exploited for malign purposes.

It’s not difficult to imagine how a conscientious employee might find this disillusioning enough to quit — and then take the further step of speaking out. The question now is whether, after years of fruitless pontificating, lawmakers in Washington, D.C., are finally listening to the Silicon Valley dissenters and are prepared to force Facebook and other social media companies to behave more responsibly.

Paul M. Barrett is deputy director of the Center for Business and Human Rights at New York University’s Stern School of Business. He is the lead author of a recent report about how Facebook and other social media platforms intensify political polarization in the U.S.
