Donald Trump won the U.S. presidential election last night. This is terrible news for the country and I am horrified by his victory. In particular, I’m having trouble processing his obviously widespread support given the many negative attributes he has displayed throughout his life and during the election. While we’re looking around for whom to blame for how things turned out (and there will be plenty of finger-pointing), I believe Facebook, Twitter, Google, etc. and the entire culture of information “personalization” should be counted among the blameworthy. True, there are a number of complex sociological factors in play in any election, and the combination of Trump’s celebrity appeal and populist messaging seems to have had a powerful effect on a lot of people. But here’s the thing: many of us were unprepared for this result. We looked on with wonder as Trump won the primaries and went on to become a popular candidate in the general election, and then were shocked and surprised by the outcome itself. Did you, like me, experience ongoing disbelief over Trump’s consistently competitive poll numbers even after allegations of sexual assault and an array of other deeply negative revelations about him? If so, it might be because you and I live in a media bubble built out of algorithmic profiling, an echo chamber designed to soothe us with an overwhelming number of messages that we agree with, or are pretty dang close.
When you view your Facebook “newsfeed,” you’re not viewing every post of every person you are connected to on the network. Facebook’s learning algorithms access thousands of data points about your past behavior on Facebook and your interactions with other websites, merchants, and mobile services to identify your tastes and preferences. The resulting newsfeed you and I see contains only the posts that Facebook believes are the most “relevant” to each of us. On Facebook, we’re connected mainly with people we have identified as “friends,” and what we hear from them (and they from us) is winnowed down into a preference-focused feed that may not even include the contrary views of people we know. When we perform searches on Google, another collection of personalization algorithms massages our search results to conform to what the system believes each of us wants to see. Twitter users can choose whose posts to follow, enabling users to curate their information sources into narrowly defined subjects and communities. In my Twitter feed, I only follow academics and research institutions working in my field, plus a few journalists and news sources whose reporting appeals to me.
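The winnowing mechanism described above can be illustrated with a toy sketch. This is not Facebook’s actual algorithm (which is proprietary and vastly more complex); it is a minimal, hypothetical model of relevance-ranked filtering in which every post is scored against a user’s inferred preferences and only the top scorers survive. All function names, topic labels, and weights here are invented for illustration.

```python
# Toy model of a "relevance"-ranked feed (hypothetical; not any real
# platform's algorithm). Posts are scored against a user's inferred
# preferences, and only the top-scoring few are ever shown.

def relevance(post_topics, user_preferences):
    """Score a post by summing the user's preference weight for each topic."""
    return sum(user_preferences.get(topic, 0.0) for topic in post_topics)

def build_feed(posts, user_preferences, feed_size=2):
    """Rank candidate posts by relevance and keep only the top feed_size."""
    ranked = sorted(
        posts,
        key=lambda p: relevance(p["topics"], user_preferences),
        reverse=True,
    )
    return ranked[:feed_size]

# Preferences inferred from past clicks: this user engages with posts
# that reinforce their existing views, so those topics carry high weight.
prefs = {
    "candidate_a_positive": 0.9,
    "candidate_b_negative": 0.8,
    "candidate_b_positive": 0.05,  # contrary view: barely weighted
}

posts = [
    {"id": 1, "topics": ["candidate_a_positive"]},
    {"id": 2, "topics": ["candidate_b_positive"]},  # the contrary view
    {"id": 3, "topics": ["candidate_b_negative"]},
]

feed = build_feed(posts, prefs)
# The contrary post (id 2) scores lowest and never reaches the feed.
```

The filter-bubble dynamic falls out of the ranking step: the contrary post isn’t censored, it simply never scores high enough to be shown, and each click on the posts that do appear pushes the preference weights further in the same direction.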
Getting news this way is completely different from traditional journalism, where the goal, ideally, is to provide readers and viewers with a diverse range of ideas and multiple viewpoints. On commercial information services, the information we receive is narrowly restricted and designed to please each of us individually. (Much of this will not be news to anyone who has read Eli Pariser’s “The Filter Bubble.”) The goal of customizing our various search results, feeds, and follows is to keep us online, staying engaged with whichever service we’re using, clicking links, viewing ads, buying things. The more time we remain engaged this way, the more information about our preferences and inclinations can be harvested and translated into advertising dollars. The result of all this customization is that each of us experiences information flows very different from those of people who disagree with us – flows designed to keep each consumer engaged and to limit any feelings of discomfort. If you think Hillary Clinton is dishonest, it’s likely reflected in your online media choices and personalizations, and you’re unlikely to see posts or articles that champion her as a person of integrity.
Democratic deliberation requires the airing of a plurality of ideas and room for meaningful debate on the merits. It is still true that people are more likely to find common ground and back down from extreme positions if given the chance to truly understand each other. It is also true that customized information sources are as likely as not to include easily disputed rumors and distortions – ones that would be exposed as such if more viewpoints were available for consideration. But that exposure is not happening. Unfortunately for democratic deliberation, the discomforting effects of stories and worldviews that don’t conform to our biases are bad for the online business model. If the goal is to keep people where they are, engaged and consuming what you’re offering, it doesn’t make business sense to question or challenge them and their version of reality.
Our media elites used to do a decent job of providing us with a plurality of views. Traditional journalism is far from perfect; media biases and filters are not new. But there were (and still are) journalistic institutions dedicated to reporting more fact than rumor, and to presenting multiple viewpoints on contentious questions. When that system was more functional than it is now, while I might not agree with the opposing viewpoint, I could at least come to understand and engage with it. Similarly, people on the other side of an issue might come to understand a piece of my truth. But traditional journalism is in decline. Fewer of us are relying on well-established media sources that can legitimately claim to be objective or balanced. One outgrowth of this is that some of the remaining media institutions have become clownish and shallow, more interested in salacious gossip and in pleasing political leaders in return for “access” than in soberly analyzing their views and statements.
As my old friend David Newhoff points out in his blog, viewing the world through the filter of commercial information platforms, including social media, makes it “very hard to distinguish between being vigilantly informed and hysterically manipulated.” As more of us come to get most of our political news from these platforms, whose shared mission is to harvest and monetize information from us rather than to inform us, we will continue to fail at gaining a thorough understanding of what comes blasting out of the fire hose. Still more problematic, we’ll also continue to fail to truly understand what the other side actually thinks, resorting instead to caricatures and hyperbole. We are going to see this filtering effect play out again and again, leaving us ever-weaker advocates for our causes and candidates. This is not making us smarter. It is making us naive and vulnerable.
While we’re busy pontificating (myself included) on social media about our views and sharing our carefully curated information tidbits with our online followers and friends, remember that this narrowly focused information sharing is a central problem for political discourse. Despite the potential for sharing our views with more people than most of us could have hoped to before these platforms existed, the intentional limiting of our feeds and searches by platform operators means that what we say, do, and seek in the information space is not likely to escape the comfort of our individual echo chambers. We’re just yelling at ourselves while generating revenue for others and carving out ever-tinier slices of an increasingly subjective reality.