Algorithms of Fear
Each week, it seems, brings some new incendiary rhetoric in our presidential election cycle or some new global political disruption emerging from the overlapping social, economic, technological, and environmental systems that govern our lives. Each disruption seems to shock us, and the most severe become political milestones, invoked across our digital networks in a kind of shorthand: Zika, Trump's wall, Ferguson, Flint. And now, of course, there's Orlando.
Yet despite all of the displays of solidarity and outrage that digitally trail each new disruption or debate, civic technologists are becoming convinced they need to do much more, much faster, to help make the Internet a greater force for civility, resiliency, and broader civic engagement.
One big reason for that, say many civic technologists, is Donald Trump and his high-boil candidacy for the White House. Another is that we now live in a world run by algorithms, computer programs that make decisions or solve problems for us: they set our credit scores, determine stock prices, and even pick the movies we watch. And especially in this Trump-driven presidential election cycle, civic technologists say, media companies are using content algorithms to steer readers toward highly emotional material that accentuates our differences and stokes our fears, in part to better hold our attention: people tend to share more content when it makes them afraid or angry.
The trouble is, fear is the most powerful enemy of reason in a democratic society, says Micah Sifry, co-founder of the Personal Democracy Forum (PDF), an annual conference that follows how the Internet is influencing politics, governance, and advocacy. "We should be helping those online learn how to navigate their differences other than driving them into daily or hourly flame wars," he said at the annual PDF gathering in New York over the weekend. Sifry believes it's time to create new algorithms and data tools to help find new ways "to ease polarization when there are differences" online.
"The issue of how technology is now affecting our awareness and our responses to triggers in the news, or our personal triggers, and what that is doing to our politics, is a really critical issue right now," Sifry said in an interview. "Digital media that we were all celebrating a few years ago as being better, more social, more personal, and more decentralized than the old broadcast media are now being re-centralized through channels designed by giant companies like Facebook and Google." Bots that have the effect of accentuating our differences online might be good for the media business, Sifry says, "but they're not so great for democracy."
Wael Ghonim, the Egyptian activist and former Google executive who helped to catalyze the 2011 Tahrir Square pro-democracy demonstrations in Egypt, says US social media companies are using what he calls "mobocratic algorithms" that leave us with a "missing middle" between the extremes of "likes" and "comments," which Ghonim says tend to devolve quickly into flame wars on social and digital media platforms. In an interview, Ghonim said, "We who use the Internet now 'like' or we flame—but there's [very little] now happening [algorithmically] to drive people into the more consensus-based, productive discussion we need to have, to help us make civic progress. Productive discussions aren't getting the [media] distribution they deserve. We're not driving people to content that could help us, as a society ... come together without a flame war."