What if, contrary to popular belief, experiences in life actually narrowed a person’s perspective rather than broadening it? And what if, in this digital era, the range of information available to us were shrinking rather than expanding – thereby aggravating the disturbing trends toward political, social and economic polarization worldwide?
Indeed, today we may well be witnessing a new wave in history, characterized by snowballing momentum toward greater, deeper divisions. Lack of consensus not only leads to US government shutdowns; conversations around issues such as gun control and immigration also reach bitter, protracted stalemates.
The media, with its notorious penchant for left or right and its license to interpret events, has often been cited as a contributor to this polarization process. Naturally, audiences gravitate toward news outlets presenting views closest to their own. It is a fact of life, and there is little if anything we can do to control this.
Less known is the extent to which this plays out in the social media-sphere. It is much harder to make categorical choices about the content we will stumble upon on Twitter, YouTube, or Facebook. Yet whether we are aware of it or not, social media is exacerbating the chasms pitting left against right, pro-choice against pro-life and NRA supporters against those who campaign for more gun control.
In a world where millions upon millions interact, share views, and learn about the world through these platforms, we assume that the variety of content we receive is essentially unlimited. On the contrary: the way we use social networks and the way these networks present us, in turn, with content we’re likely to consume produces what author Eli Pariser calls “filter bubbles.”
The concept applies, to varying degrees, to all online social networks. These networks quickly learn our preferences when we engage with our friends and connections — who are likely to share our views. When we “like” a post, the network we are using will make sure that we receive similar content in the future, essentially creating an information bubble around us. As the Wall Street Journal noted, scholars worry that this can create “echo chambers” where users see posts only from like-minded friends and media sources.
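A minimal sketch can make this dynamic concrete. The data model and scoring rule below are our own simplification, not any platform’s actual algorithm: the feed simply ranks posts by how well they match topics the user has already “liked,” which is enough to push dissenting content out of view.

```python
from collections import Counter

def rank_feed(posts, liked_topics):
    """Rank candidate posts by overlap with topics the user has 'liked'.

    A deliberately simplified model of engagement-based filtering:
    the more a post matches past likes, the higher it ranks, so
    content from the other side gradually sinks out of the feed.
    """
    prefs = Counter(liked_topics)  # missing topics count as zero
    return sorted(posts,
                  key=lambda p: sum(prefs[t] for t in p["topics"]),
                  reverse=True)

# A user who has only liked pro-Brexit posts...
likes = ["brexit", "brexit", "sovereignty"]
posts = [
    {"id": 1, "topics": ["brexit", "sovereignty"]},
    {"id": 2, "topics": ["remain", "economy"]},
    {"id": 3, "topics": ["brexit"]},
]
feed = rank_feed(posts, likes)
# ...sees Remain content pushed to the bottom of the feed.
print([p["id"] for p in feed])  # [1, 3, 2]
```

Note that nothing in this toy ranker is malicious: it is simply optimizing for engagement, and the bubble is a side effect.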
To be affected by social media filters and bubbles, users do not even have to engage with the content. Matt Honan, a writer at Wired, discovered that by liking everything he came across on Facebook for two days, he not only saw his feed change dramatically but also found that his behavior influenced his connections’ feeds. Thank you, intertwined algorithms.
YouTube is no different. We recently decided to watch a couple of videos promoting Brexit. After a short time, we noticed there were no more Remain-themed videos among the recommendations listed on the page where we landed. The more videos we watched from the recommended list, the more extreme the filtering became: even the home page soon filled with Brexit-leaning content, while Remain content dwindled to practically zero. Crucially, we did not even have to “like” anything to get to this point.
Unfortunately, the problem goes even deeper. When we do experience views different from our own in our social media feeds, these tend to be extreme and unrepresentative. For example, the Remain-leaning content shared extensively by Brexit supporters (or vice versa) during this campaign tended to be hard-core and excessive, with over-the-top messages expressed in exaggerated ways. This further alienates people who do not share the content’s views, deepening the divides. Then, in a vicious cycle, social networks’ algorithms detect the resulting polarization and feed into it, leaving little chance for extreme content to be challenged. Positions harden, ideologies calcify.
A Problem of Learning From Experience
This is dangerous, mainly due to the process by which our brains take in and interpret experience. We are hardwired to learn from experience, seamlessly and automatically. Our senses collect the information offered by the environment around us and feed our intuition, which then fuels our judgments, decisions, and behavior. Unfortunately, we typically do not take the time to identify possible biases in what we observe and to think critically about our experience. Nobel laureate Daniel Kahneman calls this syndrome WYSIATI (What You See Is All There Is). Hence, when we are exposed to censored or filtered information, our perception and intuition are profoundly affected. We are led astray by our own experience.
If constant exposure to views that are either in line with our own, or extremely out of sync with them, is what shapes our intuition, and if the opportunity to receive content that balances out our personal predilections in a more moderate and credible fashion is constantly shrinking, how can we hope to reverse direction and move closer to a place of mutual understanding or even consensus? Especially given that most of us do not actively seek to learn about or understand views that oppose our own.
The Quest For Solutions
If online social networks are, unexpectedly, one of the drivers of polarization, might they also have the potential to offer us solutions? While the problem of filter bubbles and the resulting polarization has been widely discussed since the early 2000s, to date only a few solutions have been explored.
One attempt is by Facebook, which recently announced that it would build algorithms to reduce clickbait: news stories built primarily to attract attention. From the point of view of experience, this move could reduce the effect of extreme, unrepresentative, and controversial content on users’ intuitions.
In another attempt, computer scientists Eduardo Graells-Garrido, Mounia Lalmas, and Daniel Quercia have envisioned a reconciliatory recommendation system that helps people identify various common interests they share with those who hold opposite views about sensitive and polarized topics.
A recent study by information scientists Sean Munson, Stephanie Lee, and Paul Resnick analyzed the effectiveness of a browser extension they named “Balancer,” which provides users with feedback about the political lean of their past reading behavior. They report that their tool has indeed encouraged some users to widen their perspective by reading more about opposing views.
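In the spirit of Balancer-style feedback, the core of such a tool can be sketched in a few lines. The scoring scale below is our own hypothetical stand-in, not the study’s actual method: each article in a reading history is scored from -1 (left) through 0 (center) to +1 (right), and the tool reports the average back to the user.

```python
def reading_lean(history):
    """Average the political lean of a reading history.

    Hypothetical scale: each item is scored -1 (left-leaning),
    0 (center), or +1 (right-leaning). The average tells the
    user which way their overall reading tilts.
    """
    if not history:
        return 0.0  # no reading history: no measurable lean
    return sum(history) / len(history)

# Four left-leaning reads, one center, one right...
print(reading_lean([-1, -1, 0, -1, 1]))  # -0.4: left-leaning overall
```

The point of such feedback is not to scold but to make the skew of one’s own experience visible, so the reader can choose to correct it.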
Finally, there’s the online news aggregation tool NewsCube, which allows readers to combine multiple sides of any given issue through a single, easy-to-consume interface.
These efforts are encouraging, but if you are hearing about them for the first time here, we clearly need more of them.
The Yin Yang Button: An Algorithm For Common Ground
We propose another idea, one that aims to unbias experience. Consider a new social media button, similar to the “like” button, designed as a yin yang symbol. The ancient Chinese philosophy of yin and yang is renowned for describing how opposite or contrary forces are actually complementary, interconnected and interdependent. Likewise, the yin yang button could provide balance and completion to one’s experience. Clicking it on any Facebook, YouTube or Twitter post would signal that you find the post’s content provides a relatively balanced representation of multiple views and seeks primarily to find common ground among them. For instance, Vox’s recent article, “You can grieve senseless violence against police and from police. Really,” which argues that one can simultaneously oppose violence both from and towards the police, would presumably receive a high number of “yin yang” clicks.
Then social media’s algorithms would go to work to ensure that content receiving a high number of “yin yang” clicks would be recommended to everybody following related news stories, regardless of what they saw or “liked” previously on the issue. Of course, there may be users whose own extreme views skew their interpretation of what makes for “balanced” content or what constitutes “common ground,” so a certain critical mass of “yin yang” clicks would be needed for this process to kick in. It might also be helpful if users could sort their searches based on content’s yin yang ratings.
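To make the critical-mass idea concrete, here is one hypothetical way the rule could be expressed. The threshold values are illustrative assumptions, not a specification: a post earns cross-bubble distribution only once it has both an absolute critical mass of yin yang clicks and a minimum click rate, so that a handful of users with skewed notions of “balance” cannot trigger it alone.

```python
def qualifies_for_common_ground(yinyang_clicks, impressions,
                                min_clicks=100, min_rate=0.05):
    """Decide whether a post has earned cross-bubble recommendation.

    Illustrative thresholds: require both an absolute critical mass
    of yin yang clicks AND a minimum click-through rate, so a small
    group of users cannot label extreme content as 'balanced'.
    """
    if impressions == 0:
        return False
    rate = yinyang_clicks / impressions
    return yinyang_clicks >= min_clicks and rate >= min_rate

# 150 clicks out of 2,000 views clears both bars...
print(qualifies_for_common_ground(150, 2000))   # True
# ...but 8 clicks, or 150 clicks diluted across 10,000 views, does not.
print(qualifies_for_common_ground(8, 50))       # False
print(qualifies_for_common_ground(150, 10000))  # False
```

Real platforms would of course need defenses against coordinated clicking, but the two-threshold shape captures the “critical mass” safeguard described above.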
The result would be to burst the social media filter bubble by rounding out our experience around news and current issues, introducing more variety into the information flow that feeds our intuition, judgment, decisions, and behavior. We would thus become better equipped and better able to compromise and seek common ground.
By mitigating the influx of extreme content we receive and allowing us to identify and discard it, this mechanism would also help avoid misperceptions about views and ideologies we do not share.
Media theorist Marshall McLuhan is famously credited with the line, “We shape our tools and thereafter our tools shape us.” Hence over time, perhaps, the yin yang button could itself become a driver of more empathetic, common ground-seeking content as users, motivated by yin yang clicks in the same way so many of us are motivated to collect “likes” or “thumbs up,” seek to create posts more likely to receive them.
Perhaps, too, those users who consistently produce yin yang-friendly content might emerge as a new generation of thought leaders, akin to LinkedIn influencers or Most Viewed Writers on Quora, whose voices in the conversation might ultimately help bridge the profoundly troubling divides we are facing.
By Emre Soyer and Robin M. Hogarth
(Edited by Cherese Jackson)
Emre Soyer is a behavioral scientist and assistant professor of judgment and decision-making in the Business Faculty at Turkey’s Ozyegin University. His articles have been published in Harvard Business Review and MIT Sloan Management Review, and he has given TEDxOZU and TEDxIstanbul talks. In addition to teaching and researching, Soyer consults for organizations about behavioral strategies, decision-making, creativity, entrepreneurship, and experimental methods and is working with Robin M. Hogarth on a related book.
Robin M. Hogarth is a cognitive psychologist and emeritus professor in the Department of Economics and Business at Barcelona’s Pompeu Fabra University. He is a Fellow of the Association for Psychological Science and one of the founders of judgment and decision-making literature, with more than a hundred influential research articles and six books on the topic to his name. From 1979 to 2001, he served on the faculty of the Booth School of Business at the University of Chicago, where his positions included Deputy Dean, Director of the Center for Decision Research and Wallace W. Booth Professor of Behavioral Science.
University of Washington: Encouraging Reading of Diverse Political Viewpoints with a Browser Widget
Vox Media: You can grieve senseless violence against police and from police. Really
Facebook Newsroom: News Feed FYI: Further Reducing Clickbait in Feed
Wall Street Journal: Blue Feed, Red Feed
Wired: I Liked Everything I Saw on Facebook for Two Days. Here’s What it Did to Me.
Top Image Courtesy of Yoel Ben-Avraham – Flickr License
First Inline Image Courtesy of Surian Soosay – Flickr License
Second Inline Image Courtesy of DonkeyHotey – Flickr License
Featured Image Courtesy of Esther Vargas – Flickr License
Author Images Used With Permission