Five Keys to Penetrating the Facebook Echo Chamber
In a polarized world where Internet flame wars break out in an instant and reports of “fake news” permeate the media, people retreat to places where they feel safe and respected, creating echo chambers in which only similar views, some quite extreme, prevail. How can we burst these bubbles and bring people closer together?
By Dennis Huang
The shooting at Marjory Stoneman Douglas High School in Parkland, Florida is the latest evidence of serious communication barriers in today’s society.
Following the tragedy, which left 17 dead and 14 wounded, student survivors have come out and demanded that political figures get serious about the prevalence of guns in the United States, culminating most recently in a nationwide “March for Our Lives” campaign (hashtag: #NeverAgain). Emma González, one of the students spearheading the movement, has accumulated over one million Twitter followers since the incident.
Meanwhile, Internet trolls have spread a constant stream of conspiracy theories about the shooting and the students, making absurd claims such as ‘They are not real high school students, but extras,’ and ‘No shooting actually took place; it’s all just political lies from the Radical Left.’ One such video posted to YouTube, accusing student David Hogg of being a “crisis actor,” even made it to the top of the site’s hottest videos ranking before it was flagged by users and ultimately removed.
According to various Internet monitoring outfits, it is not only extreme right-wing groups: numerous Twitter bot accounts connected to the Russian government have joined the fray to add fuel to the conspiracy-theory fire, even playing both sides with inflammatory statements from both the pro-gun and gun-control camps in an effort to widen the rhetorical divide.
In recent years, such issues as “fake news” and “echo chambers” have dogged the media, even spreading as far as Taiwan’s political stage. Today, the debate and controversy over gun legislation in the wake of the latest shooting tragedy once again exposes the hole in the ozone layer of information in cyberspace.
Such far-fetched conspiracy theories as “all the shootings are staged” spread like wildfire among certain groups of people, amplifying the echo chamber effect through online sharing.
Still, holes created by technology and human nature can only be repaired through technology and human nature. Responsibility cannot be placed entirely on social media platforms; rather, individual Internet users, technological innovators, and the professional media must act together to turn the tide on these issues.
The following is a breakdown of five critical contributing factors to echo chambers, and five essential prescriptions for action:
First: Identify the Issues
The origins, forms and motivations behind these chaotic messages run too broad a gamut to be simply dismissed as “fake news,” an oversimplified and misleading term that some politicians even latch onto as an excuse to evade responsibility. For this reason, political communications researchers advocate using the term “disinformation,” which covers everything from fabricated information to innuendo and slander, extreme views, and political propaganda.
In discussing disinformation, it is necessary to distinguish among media misinformation, fabricated information disguised as news, extreme views or hate speech, and slanderous propaganda from politicians or hostile foreign governments in order to grasp the nature of the problem clearly.
Next: Reclaim User Proactivity on Social Media
Surveys indicate that Facebook is the main information platform for 61 percent of American millennials. However, the algorithms employed by Facebook, Twitter and other social media platforms are key contributing factors to the resultant “information bubble” and “echo chamber” phenomena.
Multiple counter-experiments are underway to see whether technology can be used to get around the bubbles produced by social media platform algorithms; examples in Taiwan include News Helper and Cofacts. Ethan Zuckerman, co-founder of the global blogging platform Global Voices, currently serves as director of the MIT Center for Civic Media and principal media scientist at the MIT Media Lab. Zuckerman created an aggregated social media platform called Gobo that lets users link their Facebook and Twitter accounts and select which news media and information categories they wish to follow, potentially bypassing the platforms’ algorithms to take command of their personal information flow.
The coolest part is that users can change their information priority settings at any time. For instance, using a simple settings panel, they can see more articles with a female perspective today, more posts from viewpoints closest to their own tomorrow, and a fill of unfamiliar perspectives the day after. The platform even features an “Internet dummy filter” that performs semantic analysis, letting users adjust how far along the spectrum of expression they wish to look depending on how bold they are feeling on any given day.
The technological underpinnings of Gobo involve a self-learning bot mechanism that analyzes every post appearing in subscribers’ Facebook or Twitter feeds. Each post carries a link explaining “why you are seeing this post,” giving social media users maximum awareness and control.
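Gobo's actual implementation is not described here, but the core idea can be sketched in a few lines: a feed is filtered by user-adjustable settings, and every decision carries a human-readable reason, the equivalent of the "why you are seeing this post" link. All names, fields, and scores below are hypothetical illustrations, not Gobo's real code.

```python
# Hypothetical sketch of a Gobo-style user-controlled feed filter.
# The "rudeness" score stands in for the platform's semantic analysis;
# the reason string models the "why you are seeing this post" link.
from dataclasses import dataclass


@dataclass
class Post:
    text: str
    source: str      # e.g. "facebook" or "twitter"
    rudeness: float  # 0.0 (civil) to 1.0 (abusive), from semantic analysis


@dataclass
class FeedSettings:
    max_rudeness: float = 1.0  # user-adjustable slider; lower = stricter filter


def filter_feed(posts, settings):
    """Return (post, reason) pairs for every post that passes the user's filters."""
    shown = []
    for p in posts:
        if p.rudeness > settings.max_rudeness:
            continue  # hidden; the user can relax the slider to see it
        reason = (f"shown because rudeness {p.rudeness:.1f} "
                  f"<= your limit {settings.max_rudeness:.1f}")
        shown.append((p, reason))
    return shown


posts = [
    Post("Thoughtful op-ed on gun legislation", "twitter", 0.1),
    Post("All-caps conspiracy rant", "facebook", 0.9),
]
for post, reason in filter_feed(posts, FeedSettings(max_rudeness=0.5)):
    print(post.text, "->", reason)
```

Because the settings object is just data, the user can change it at any moment and re-filter the same feed, which is what makes the control feel interactive rather than baked into an opaque ranking algorithm.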
Third: Professional Media Can Strive for Balance
In a digital age defined by the domination of technology platforms, conventional mainstream media is often seen as archaic and backwards.
Yet, as a host of issues emerges on social media sites, professional media can take on greater responsibility and try to counterbalance the Internet’s skew.
For instance, The Guardian, which tends to favor a liberal perspective, runs a weekly feature every Monday called Burst Your Bubble, in which the editors select articles worth reading that are written from a conservative viewpoint. In its Right and Left feature, curated by its digital editor, The New York Times compiles a cheat sheet of right- and left-leaning opinions on a given issue. Recently, both features happened to select pro-gun articles to give readers who favor stricter gun control insight into how those opposed to gun control think.
The Washington Post website takes this one step further. Using AI technology in place of editors, a Counterpoint column is automatically generated offering a perspective different from that of the column or commentary the reader is currently reading. The idea is to remind readers of “other ways of thinking” and keep them from being exposed only to information they already favor.
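The Washington Post's Counterpoint system is proprietary, but the underlying selection logic can be illustrated simply: tag each piece with a stance, and automatically surface a piece with the opposite stance alongside whatever is being read. The article titles and stance labels below are hypothetical.

```python
# Hypothetical sketch of an automatic "Counterpoint" selector.
# Each commentary carries a stance tag; the selector returns the first
# piece in the pool whose stance opposes the one currently being read.

articles = [
    {"title": "Why stricter gun laws work", "stance": "pro-control"},
    {"title": "The case for gun rights", "stance": "pro-gun"},
    {"title": "A centrist path on firearms", "stance": "pro-control"},
]

OPPOSITE = {"pro-control": "pro-gun", "pro-gun": "pro-control"}


def counterpoint(current, pool):
    """Pick an article whose stance opposes the one being read, if any exists."""
    wanted = OPPOSITE.get(current["stance"])
    for article in pool:
        if article is not current and article["stance"] == wanted:
            return article
    return None  # no opposing view available in the pool


reading = articles[0]
other = counterpoint(reading, articles)
print("Counterpoint:", other["title"])
```

A production system would of course infer the stance tags with a classifier rather than hand-label them, but the editorial idea, pairing every opinion with its opposite, is exactly this simple.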
Fourth: Use Technology to Help Report on Complex Issues
The mainstream media’s efforts to present opposing viewpoints and break down opinion comfort zones are just a first step, serving the traditional spirit of impartiality and objectivity. A further obstacle, baked into the logic of social media platforms, is today’s diluted attention spans: in other words, how to save readers time so that they can absorb the most information in the shortest amount of time.
The Guardian recently unveiled a new smart articles function. Modeled after digital-native media outlets like Circa and Vox, it partitions news elements into sections and visual slices to make complex issues clear and easily understandable. Based on reader feedback, it also strengthens news elements that are lacking or arguments that are unclear. What’s more, the algorithm displays the latest news developments in line with each user’s prior browsing history.
The BBC’s website utilizes a different kind of AI technology: a chat bot module that allows journalists to attach a set of simple key questions to each news article, automatically linked to other articles. While reading an article, readers can click on any question, such as North Korean nukes or the Trump administration’s first-year policies, for the chat bot to answer, filling in the background behind the news at their discretion. The questions and answers then become a constantly updated database that can be reused for other related reports.
In the age of social media where “readers are also broadcasters,” the key fifth element is perhaps each and every Internet user.
Journalist Frédéric Filloux, a John S. Knight senior research fellow at Stanford University, recently conducted a detailed analysis of social media platforms such as Facebook and YouTube, finding that the sheer scale of these platforms, combined with their business models and technical issues related to their algorithms, bogs them down to the extent that they are ill-equipped to resolve such issues as disinformation and echo chambers. Filloux believes that these Big Tech giants must utilize the collective monitoring of outside forces, including the news media and online participants, if they are to progressively rectify these abnormalities.
Differing political views are normal in democracies, and Internet debates or arguments are not necessarily bad things. Yet lies and hate only breed division and prevent us from hearing others’ voices. Only when we stand on the foundation of the same facts can we really see differences for what they are, and slowly and laboriously come closer together, free from prejudice.
Translated from the Chinese article by David Toman
Opinion@CommonWealth is a sub-channel of the CommonWealth Magazine website. Founded in January 2013 with a focus on social, humanitarian and policy issues and opinions, Opinion@CommonWealth is dedicated to building a democratic, diverse platform where multiple opinions can be presented.
Currently, approximately 100 columnists and writers contribute to Opinion@CommonWealth, contemplating and exploring Taiwan's future together with Taiwanese society.