In Jeff Orlowski’s documentary “The Social Dilemma,” a group of former employees of Facebook, Twitter, Instagram and other companies describe the precise programming that goes into the recommendations offered on social media platforms.
The documentary sounds the alarm on how recommendation algorithms exploit the human need to connect with people who are like us: who look like us, talk like us and believe the same things we do. And so, if we follow Facebook’s suggestions, or Amazon’s “Books You May Like,” we become victims of emotional manipulation.
These artificial intelligence-boosted recommendation programs simplify what many of us consider a tedious chore: sifting through data to make decisions. So social media companies have taken over, steering our choices in ways that prolong our time on their platforms.
Farhad Manjoo, an opinion columnist at the New York Times, lays it out in a tweet: “News flash everything on social media is an editorial decision by the company. Every single thing, the UI, the posts you see, the posts you don’t, the metrics, the speed with which everything updates, the colors, it’s all human decisions all the way down. It’s all subjective.”
What this amounts to is an environment ripe for creating dependencies, with suggestion as the tool. And, like a still and stagnant pond, that environment is where both misinformation and disinformation breed and flourish.
Like the social media companies themselves, most successful purveyors of disinformation and misinformation are adept at manipulating emotion through suggestion, often harmful suggestion.
And so in 2016, Hillary Clinton became Public Enemy No. 1 at Trump rallies; she was portrayed not as a professional rival or a competitor, but far worse. And what that “worse” entailed was a suggestion left for each individual to resolve.
Partisanship in America has created rigid boundaries, making it possible for conspiracy theorists and misinformation and disinformation agents to use social media as a vehicle to spread their message to those more likely to be affected, amply enabled by social media’s ads, push notifications and friend suggestions.
Remember the Pizzagate incident of the last presidential election? In 2016, conservative journalists and members of the alt-right suggested on 4chan, 8chan and Twitter that Hillary Clinton and the Democrats were running a human trafficking and child sex ring in the basement of a pizza shop in Washington, D.C. A man from North Carolina arrived at the Comet Ping Pong pizza shop armed and ready to liberate the trapped children. Only there were no children, and there wasn’t even a basement at the pizza shop.
A few days ago, the Federal Bureau of Investigation foiled a plot by more than a dozen self-identified militia members who were preparing to kidnap Michigan Gov. Gretchen Whitmer over her COVID-19 statewide lockdown policies. Whether President Trump’s tweet “LIBERATE MICHIGAN” had anything to do with influencing these men is still an open question, though the men did attempt to do just that: liberate Michigan from lockdown. (Admission: I’m parlaying suggestion here.)
At a media briefing on “The Contagion of Hate—The Other Virus in America,” Sandy Close, director of Ethnic Media Services, in her opening remarks, said that “it feels like there’s a general permission to hate and it comes from the top.”
“Words matter,” emphasized John Yang, executive director for Asian Americans Advancing Justice, referring to the many harmful presidential tweets that have targeted individuals and suggested conspiracy theories. Words can be incredibly harmful, “causing issues, causing fear, causing physical harm.”
It makes me believe that hate is an elemental emotion: readily understandable, lying dormant, waiting to be tapped, easily accessible. Hate stops reason right in its tracks. That’s why hate has been wielded so effectively by despots and dictators throughout the world’s checkered past, and why it spreads so fast.
Yang mentioned a recent study by the Anti-Defamation League, which analyzed more than 2.7 million tweets between Oct. 2 and Oct. 5. The results of the study are disturbing, though not entirely surprising. After the first presidential debate, when President Trump said that the pandemic was “China’s fault” and used the words “China plague,” there was a spike in anti-Asian tweets. And in the hours following the president’s COVID-19 diagnosis, there was an 85 percent increase in hate tweets and conspiracy theories targeting and involving Asian Americans.
In “The Social Dilemma,” Shoshana Zuboff, a Harvard University professor, says ominously that “social media is a marketplace that trades exclusively in human futures.” By that token, the future looks bleak, because we’re not just talking about social media.
Business in America follows the same manipulative roadmap. It’s the news articles we read, the television shows we watch, the books that are suggested next, the Netflix shows that are recommended, the searches we conduct on Google. Predictive technology has infiltrated every aspect of our lives, and we have no choice but to allow it, since it’s made life simpler for us.
I’m sure Zuboff is right, though perhaps she casts too wide a net. But there’s a reason the word “dilemma” is part of Orlowski’s documentary title. And like any dilemma, social media does indeed have benefits. Even the recommendation algorithms are useful on occasion. It’s fine to watch the next show recommended on Netflix, or to read a news article shared by a friend on Facebook. Let’s do it responsibly, though, with an abundance of caution and critical thinking. Let’s do it by rejecting messages steeped in hate, rage or scorn, even when a trusted person is spouting them.
Hate, like the coronavirus, is a contagion. It grows quickly, virulently. And the only effective vaccine we have is empathy. #VoteForEmpathy.
Jaya Padmanabhan can be reached at firstname.lastname@example.org. Twitter: @jayapadmanabhan. She is a guest columnist and her point of view is not necessarily that of The Examiner.