
Your social media feed is anything but random

#DisinfoWeek. It’s a thing. It’s happening now.

And here’s a phrase I just heard for the first time: “Speaking truth to social media.”

Think tanks are now considering the impact of fake news on democracies and what can be done about it. In fact, this week, they’re getting experts in social media, communications and technology to put their heads together and try to come up with ideas at #DisinfoWeek in a series of events, described by the organizers as “a week-long set of strategic dialogues on how to collectively address the global challenge of disinformation.”

The National Democratic Institute, along with other organizations including the Atlantic Council, the Oxford Internet Institute and Stanford, is hosting panels this week in Washington, D.C., London and on the Stanford campus in Palo Alto. NDI describes itself as a “nonpartisan, nongovernmental organization that responds to the aspirations of people around the world to live in democratic societies with open and multiparty political systems that recognize and promote basic human rights.” The Atlantic Council includes in its mission that it “provides an essential forum for navigating the dramatic economic and political changes defining the twenty-first century by informing and galvanizing its uniquely influential network of global leaders.” These are very “Big” ideas.

And these Big-thinking think tanks are trying to tackle a big problem we are just now getting our spinning heads around.

On Tuesday, NDI President Kenneth Wollack reflected in his opening remarks on the sentiments of Arab Spring online organizer Wael Ghonim, highlighting how early optimism about the internet’s ability to liberate people has morphed into fear of how easily skilled manipulation can divide them, and dread of what’s to come.

In 2011, the spark that inspired Ghonim to help launch a movement was a photo of a young man who had been tortured and killed by the Egyptian police; he realized that he could be next. Social media had helped accelerate Tunisia’s ousting of its president just weeks earlier; its potential to kickstart a movement had already been witnessed.

Ghonim created an anonymous Facebook page that gained more than 100,000 followers within three days, powering the movement that ultimately drove President Hosni Mubarak to step down after weeks of sustained unrest.

After facing relentless attacks and threats and witnessing firsthand how easily a righteous cause can descend into chaos and hatred, Ghonim left social media entirely and went silent for more than two years. During this time, he wanted to understand how social media became the fuel propelling the polarization that is proving to be destabilizing and what could be done about it. In 2015, he gave a TED Talk reflecting on what he had observed.

Early in the days of the movement, Ghonim said, “If you want to liberate a society, all you need is the internet.” Now, he says something different, acknowledging that things are more complex and that “… we need to also think about how to design social media experiences that promote civility and reward thoughtfulness.”

One thing Ghonim has pointed out is that incendiary posts and opinions are more likely to be promoted, in part because platform algorithms are designed to maximize user engagement. So instead of human editors and consumers determining content, algorithms drive what posts appear in your feed, based on what you’re expected to engage with. And operators of “bot” networks of fake accounts exploit these programs to amplify particular messages. It is entirely possible that some people you know are posting thought-provoking, dialogue-promoting posts and tweets that you will never see.
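The dynamic Ghonim describes can be illustrated with a toy sketch. Everything below is hypothetical, invented for illustration; real platform ranking systems are proprietary and vastly more complex. The point is only that when posts are ordered by predicted engagement rather than by time or thoughtfulness, content that provokes strong reactions tends to rise:

```python
# Toy model of engagement-based feed ranking (hypothetical scoring).

def predicted_engagement(post):
    """Invented score: weight past reactions, with a boost for
    posts flagged as emotionally charged, since outrage tends
    to drive more clicks and shares."""
    score = post["likes"] + 2 * post["shares"]
    if post["incendiary"]:
        score *= 1.5
    return score

def rank_feed(posts):
    """Order posts by predicted engagement, highest first --
    not by chronology, accuracy, or civility."""
    return sorted(posts, key=predicted_engagement, reverse=True)

posts = [
    {"id": "measured-analysis", "likes": 50, "shares": 5, "incendiary": False},
    {"id": "outrage-bait", "likes": 40, "shares": 10, "incendiary": True},
]
print([p["id"] for p in rank_feed(posts)])
# → ['outrage-bait', 'measured-analysis']
```

Even though the measured post has more likes, the incendiary one wins the top slot, which is the feedback loop Ghonim warns about.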

I seriously doubt most people understand that what appears in their social media feeds and online isn’t random.

In addition to training people to spot and discount fake news, media literacy should include helping users understand that what gets served up on social media platforms is designed to feed the bottom line of the companies that provide them. If those platforms are the public square, they are also the gatekeepers deciding who gets to speak.

Many Americans are just waking up to the fact that social media is a major source of propaganda; every day, more information trickles out about how much manipulation went on leading up to the 2016 election, and it’s not slowing down.

Maybe #DisinfoWeek will shed some more light. Social media platforms are generally loath to do anything about the pollution trafficked on their highways, so maybe users will have to crowdsource a solution, too. Nobody enjoys feeling like they’ve been played.

Maureen Erwin is a Bay Area political consultant. Most recently she led Sonoma County’s Measure M, which will create the largest GMO-free growing zone in the U.S.
