Exploring Polarization via Online Apps

What?

In 2016, the Wall Street Journal (WSJ) introduced a new program on its website, Red Feed Blue Feed (follow the link to check it out!), in an attempt to reveal to users the political and ideological bubbles that have emerged on Facebook. Drawing on a 2015 study of over 10 million Facebook users by Facebook researchers, WSJ gathered over 500 online news sources and classified each as typically being shared by users who self-identify on their Facebook profiles as ideologically “very liberal, liberal, neutral, conservative or very conservative.” With this categorization, WSJ’s program depicts two parallel Facebook news feeds of conservative and liberal sources sharing stories on any of 8 topics: President Trump, Abortion, ISIS, Healthcare, Guns, Executive Order, Budget, and Immigration.

On the WSJ page hosting this program, the WSJ invites you to “See Liberal Facebook and Conservative Facebook, Side by Side” but also cautions you against immediately assuming that either feed perfectly represents what those of opposite ideological leanings see. Ultimately, the WSJ states that this program was built for those who were curious about how issues were simultaneously depicted by left and right-leaning news outlets.

Around the same time as Red Feed Blue Feed’s introduction, numerous similar programs and apps were being introduced to the public, each attempting to alert its users to media bias, dubious reporting, or, at the very least, the breadth of political perspectives isolated on social media.

Over the next semester, our team can improve upon how users interact with and learn from programs/apps like Red Feed Blue Feed and provide that instruction on our site.

Why?

As with Red Feed Blue Feed, the creators of each program/app described below provide little follow-up on effectiveness: whether their app was informative to users, decreased polarization, or improved understanding and question asking – in short, whether their app worked. Further, despite the breadth of research on polarization and effective argumentation, these apps stand alone and don’t point their users to any literature on how to better communicate and connect with those who are ideologically opposed to them.

How?

As explained above, the WSJ does not attempt to change its users’ minds but instead to satisfy the curiosity of those who are interested in seeing a side-by-side comparison of ideological depictions of current events. While the program delivers this feature, it’s not clear how it affects users nor how it could be used to educate them about polarization. Related research has indicated that the Filter Bubble-like nature of our newsfeeds, which Red Feed Blue Feed exposes, is a major contributor to polarization. However, other research suggests that popping people’s bubbles by exposing them to new perspectives is not that helpful. Instead of reducing polarization, exposure to new perspectives often had minimal effect and sometimes even made polarization worse.

To build upon this program and learn more about its effect on users as well as improve upon its use as an education tool, the following are suggested:

Red Feed Blue Feed Use Effect on User Confidence/Polarization

Following a research question similar to Fernbach et al.’s, we suggest using the Red Feed Blue Feed program as a new variable affecting participants’ confidence and polarization with respect to any of the 8 current event topics covered by the program. Specifically, participants could be asked, either on MTurk or in person, to state their beliefs about one or two of these topics and rate their overall confidence (on a scale) in their beliefs. Then, participants would be asked to visit Red Feed Blue Feed for the relevant topic and read several articles, or be directed to the site to use as they please for a limited amount of time. After exposure to the program, they would be asked a similar series of questions as before, along with questions about their understanding of different perspectives on the topic and how, if at all, the program affected their beliefs.

We suspect that exposure to Red Feed Blue Feed will actually strengthen participants’ polarization, increasing their confidence in their own beliefs and leading to less understanding of others. However, it is possible that such exposure could make participants less confident in their own beliefs and more open to the perspectives of others.
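As a rough sketch of how the pre/post comparison might be analyzed once responses are collected, consider the following. Everything here is hypothetical – the participant IDs, ratings, and 1-7 confidence scale are invented for illustration, not drawn from an actual study:

```python
from statistics import mean

# Hypothetical pre- and post-exposure confidence ratings (1-7 scale),
# keyed by participant, collected before and after using Red Feed Blue Feed.
pre = {"p1": 5, "p2": 6, "p3": 4, "p4": 7}
post = {"p1": 6, "p2": 6, "p3": 5, "p4": 7}

def mean_confidence_shift(pre, post):
    """Average change in self-rated confidence across participants.

    A positive value would be consistent with exposure strengthening
    confidence (our suspicion above); a negative value would suggest
    exposure softened it.
    """
    return mean(post[p] - pre[p] for p in pre)

shift = mean_confidence_shift(pre, post)
```

In a real analysis this simple mean would be paired with a significance test and a control group that never sees the program, but the core quantity of interest is just this per-participant difference.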

Red Feed Blue Feed Education – Identifying Ideology and Bias

The WSJ relies on the Facebook study of over 10 million users’ self-identified ideological leanings and sharing habits to determine which news sources belong in the Red and Blue Feeds. One way to help participants learn what other traits within an article or news source contribute to it leaning ideologically one direction or another is to provide a brief explainer of the current research on how media sources may be biased and ask participants to identify and classify bias in articles on their own. Results can either be evaluated by our team or compared to an aggregate of the participants’ choices.

Red Feed Blue Feed Education – Writing Without Bias

Another way this program could be used to educate users is to follow up the idea detailed above with an invitation to users to submit their own attempt at an article or lede written without bias. After we’ve collected a sizeable number of entries, we can present them (anonymized) to users to vote on whether each article/lede should be categorized as Red, Blue, or neither. Ideally, the least polarizing entries should receive the most votes for “neither,” or at least roughly equal numbers of Red and Blue votes.
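A minimal sketch of how that voting could be tallied follows. The entries, vote counts, and the "balanced red/blue" margin are all hypothetical choices for illustration:

```python
from collections import Counter

# Hypothetical votes per anonymized entry: each vote is "red", "blue",
# or "neither".
votes = {
    "entry-1": ["neither", "neither", "red", "neither", "blue"],
    "entry-2": ["red", "red", "red", "blue", "neither"],
}

def classify_entry(entry_votes, balance_margin=1):
    """Label an entry by its plurality vote, treating near-equal
    red/blue counts as effectively unbiased."""
    counts = Counter(entry_votes)
    red, blue = counts["red"], counts["blue"]
    # Either an outright plurality for "neither" or roughly balanced
    # red/blue support suggests the entry reads as unbiased.
    if counts["neither"] >= max(red, blue) or abs(red - blue) <= balance_margin:
        return "neither"
    return "red" if red > blue else "blue"

labels = {entry: classify_entry(v) for entry, v in votes.items()}
```

The `balance_margin` threshold is the judgment call here: it decides how lopsided the Red/Blue split must be before an entry is counted as partisan rather than neutral.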

What follows are other online programs and apps that we can look at including on the website, along with relevant polarization research to improve the apps’ ability to lessen polarization and educate users.

Factitious

Factitious is an online news game invented by the American University Game Lab that tests users’ ability to discern real from fake news. To play, you swipe right or left after reading different news excerpts – right if you think the events described are accurate and left if you don’t. Feel stuck? Try getting a hint from the app at the bottom of each excerpt to see if knowing the source of the article helps you decide. Then, after each swipe, you’ll get feedback on whether you were correct, and you may also get some tips on which features of the article could have tipped you off for next time.

What we like about Factitious:

Through feedback after each swipe, the game teaches you to pay attention to tell-tale signs of a true or fake article, like its source (specifically, whether the source’s stated purpose is to provide satirical, comedic, or exaggerated stories), whether it includes names and references to statements that can be independently verified, and whether the article’s claims seem reasonable and match common sense.

What could be better:

Though playing the game can give you some useful tools to tell whether an article is real or fake, the game is limited compared to what it could be teaching.

For instance, the game cycles through the same set of articles (in different order) each time the game is played. So, until the creators update the game with new articles (which you’re able to contribute to), you won’t be able to come back to the game for more practice for a while.

Another limitation is that the game doesn’t present news the way you’ll come across it outside the game (like in Facebook or Twitter posts from friends and family). Also, just as many issues are not black and white, articles may not always be completely true or completely false – they might include both real and fake parts, or claims that, though correct, only provide part of the story. You also won’t learn much about how to identify political slant, which is often the more troubling component of some truthful news sources that use exaggerated language and leave out key details in order to push a prescribed agenda.

So…

The game’s tips can be useful in that, after applying them, most articles can be clearly identified as real or fake news, though we know news coverage “in the real world” is not so black and white. The game does provide a handy introduction for people who read the news, particularly children, to the complex public discourse on fake news through a fun, simple game experience. However, you shouldn’t expect to be super savvy after playing; the game’s expected effects on how people interact with news media may not be particularly robust, though it is a small step towards equipping readers with valuable insights.

Want to learn more? Here are some articles that you should check out!

How might Factitious help children? Some research suggests that kids are particularly bad at discerning real from fake news. Read a take from NPR.

Why is it important to discern between real and fake news? Though we commonly believe that media has a big impact on what readers believe and interpret about the world, researchers haven’t yet been able to show precisely how much. Here is an article where much of this research recently took off.

NRK Comment Test

Last February, the Norwegian news outlet NRK introduced a feature for commenting on its online content: a three-question multiple-choice test about select articles. Commenters must answer three questions about key details of the article before they’re allowed to comment. The intention behind this feature is to improve readers’ comprehension and, hopefully, to base the site’s online conversations on the facts presented in the article rather than readers’ gut reactions to titles. Another cited benefit is that the feature requires readers to take more time to think, allowing for 15-30 more seconds of reflection before comments are made.
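The gating logic behind a feature like this is simple enough to sketch. The questions, answer keys, and matching rule below are invented for illustration – NRK's actual implementation is not public in this detail:

```python
# Hypothetical quiz gate: a reader may comment only after answering
# every question about the article correctly.
QUIZ = [
    ("Which country is NRK based in?", "Norway"),
    ("How many questions must commenters answer?", "3"),
    ("What should comments be based on?", "the article"),
]

def may_comment(answers):
    """Return True only if every submitted answer matches the key
    (case-insensitively, ignoring surrounding whitespace)."""
    if len(answers) != len(QUIZ):
        return False
    return all(
        answer.strip().lower() == key.lower()
        for answer, (_question, key) in zip(answers, QUIZ)
    )
```

The interesting design questions are not in this check but around it: whether failed attempts lock the reader out or allow retries, and whether the added friction filters out thoughtless comments or simply everyone short on time.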

Since the feature’s creation, NRK has not indicated any explicit shift in the tone of comments, but it has shared the interesting findings that the quizzes were failed approximately 70% of the time and that successful quiz attempts far surpassed the number of comments posted. NRK also notes that, given the extended amount of time it takes some users to complete the test, a brief review of those who commented seemed to indicate that those with the most time on their hands tended to be the ones who left comments.

Some other apps we’re looking into:

Settle It! From PolitiFact

Settle It! was created by Bill Adair and introduced as an app in 2012. The app has two features. The first is a search feature for looking up the reliability or “Truth-O-Meter” ratings from the PolitiFact database of sources, pundits, and politicians. The second is a game that presents statements from pundits and politicians that PolitiFact has already evaluated; users rate each statement before seeing its true veracity. In this game, participants can rack up points and share their results.

Bad News

“In this game you take on the role of fake news-monger. Drop all pretense of ethics and choose the path that builds your persona as an unscrupulous media magnate. But keep an eye on your ‘followers’ and ‘credibility’ meters. Your task is to get as many followers as you can while slowly building up fake credibility as a news site. But watch out: you lose if you tell obvious lies or disappoint your supporters!” Here, the creators of this game believe that the best way to learn to recognize disinformation is to create it yourself. By taking on the role of a fake news-monger, you gain insight into the various tactics and methods used by ‘real’ disinformants to spread their message. This, in turn, builds up resistance against false or misleading information by being presented with a weakened version of a misleading argument before being exposed to the “real” information. You can see this as giving people a kind of “vaccine” against misleading information. If you can recognize it, you can resist it.

Kialo

Kialo is “a debate platform powered by reason. Kialo cuts through the noise typically associated with social and online media, making it easy to engage in focused discussion.” When you visit the site, you can view and contribute to ongoing debates on the pros and cons of topics that generally divide the nation, like gun control, abortion, religion, and the economy. For each debate topic, the site hosts a visual map of the argument that you can contribute to, or you can vote for the best arguments and rebuttals. Each pro or con then has its own breakdown of pros and cons that are also user-submitted and voted on.

“You Draw It”

You Draw It, while not an app, is an interactive news article created by The Upshot at the New York Times. When you first open the article, you are introduced to its topic with some questions: How likely is it that children who grow up in very poor families go to college? How about children who grow up in very rich families? The site then asks you to answer by actually drawing your guess for different income levels on an interactive chart. Once you’ve finished drawing, the article first compares your line to the reality for children and then proceeds to explain these findings.

Who?

While advocating for investment in research and policy reform, scientists need a means to communicate why their work deserves to be funded and how their discoveries benefit society. Policymakers and citizens, on the other hand, need to interpret these claims to inform their policy choices while advocating for what they think is ethical and worthwhile. As is too often the case, however, political and cultural polarization clouds this relationship and everyone loses. As a part of this Bass Connections project, How to Ask a Question, Esko, a Masters student in Bioethics and Science Policy, is joining others to better understand the origins of polarization and, with any luck, its cure.

Kyra Exterovich-Rubin is a third-year undergraduate student studying public policy and philosophy. She hails from Wisconsin, where she first became concerned with empathetic political dialogue. Her interests in empathy and polarization extend into conflict resolution, an issue she explored in the summer of 2017 through fieldwork in both the Balkans and Israel-Palestine.

J.J. Moncus is a junior in Trinity College pursuing a Bachelor of Science in Mathematics while minoring in Philosophy and Economics.

J.J. hopes to one day work in research on economic and political development. He is passionate about using scientific methods to understand what institutions and policies shape positive economic growth, healthy democracies, and fulfilling lives for citizens. Alongside his teammates, he is concerned that the inability to hold productive political discussions with opponents affects our electorate’s ability to make reasoned voting decisions; a less hostile, more constructive debate culture seems essential for democracy to function well. He hopes to help this Bass Connections team discover the causes of polarized debate and how to mediate tensions.

Specifically, he is serving on various subprojects: First, he is helping conduct the literature review and data management on how often identity markers (race, sex, geography, political affiliation) can serve as good indicators of one’s policy views on various issues. Second, he is coding, preparing, and analyzing data on how the phrasing of questions in a debate can affect the participants’ tendencies toward polarized or non-polarized (and thus constructive) discussion.