"News is a matter of trust. A person who subscribes to a newspaper does not build such trust on the basis of individual articles but by reading the publication over an extended period of time. This big-picture approach is hardly possible any longer online. Users of social media receive all kinds of individual messages from sites they liked at some point in the past and from the friends in their personal network. This puts them in contact with sources that match their interests and confirm their attitudes, even though they have virtually no knowledge of the sources themselves and cannot judge the extent to which the claims being made are true. This is dangerous: anyone who places their trust in online channels such as Facebook or Google, which use algorithms to personalise their messages, may well be confronted with dubious content. Yet because these messages confirm one another, they appear particularly credible. This is how filter bubbles are created.
Fact or fake?
Just how difficult it is to judge the quality of news was revealed in 2014 by a large-scale online experiment we conducted (results published only in German). We showed participants news articles and asked them to assess these against various quality criteria such as comprehensibility, balance and the presentation of different sides. The articles were either completely factual or entirely one-sided, and we went to considerable lengths to make the differences as clear as possible. Nonetheless, barely half of those surveyed were able to correctly judge the high or low quality of the articles. What we learned from this is that people trust the images of media brands they know, regardless of how those images were formed in their minds. To put it another way, the more positive a person's overall perception of a medium, the higher that person rates the quality of its news articles.
The situation online is far more complicated. There are so many different media brands on the Internet that it is virtually impossible for any one person to know and judge them all. Platforms, video portals, search engines and personalised news apps track the online behaviour of their users and, unnoticed by them, keep refining and developing their algorithms. The system shows us things we are interested in and conceals everything else. In extreme cases we may no longer see any news content or articles that question our own world view.
Filter bubbles differ from one individual to another
Facebook users tend to overestimate the weight of their own opinions to a greater degree than people normally do. They have the feeling that they are speaking on behalf of the majority; they become more active and more willing to speak out. How pronounced a person's filter bubble is depends on how intensively that person uses algorithmically personalised channels to obtain information. We found that Germans on average get 25 per cent of their news from algorithmically personalised sources. Conversely, this means they obtain the remaining three quarters of their information from non-personalised media such as television, radio, newspapers and online news portals.
It is moderately educated and somewhat older people who give us the greatest cause for concern. Quite a few of them are disillusioned with politics and believe that they can change things online. They particularly often seek out information in algorithmically personalised channels, where they are presented largely with content and opinions that confirm and reinforce their existing attitudes. As such, we are not (yet) worried about society as a whole becoming polarised. Rather, there is a smallish group of dissatisfied people who trust neither politicians nor journalistic media and who are displaying increasingly extreme frustration with politics.
Nobody wants to be manipulated
The problem is that many people have no idea how much false information circulates online, how widespread algorithmic personalisation is, or how it works. Based on the assumption that nobody wants to be manipulated by fake news, my hopes are pinned on full and extensive reporting of the issue: How do algorithms work? Which rules dictate which content is shown to me? Which actors pursue which interests? Independent journalism has the power to sensitise and educate, and this is precisely what our society so desperately needs."
Professor Wolfgang Schweiger
Professor Schweiger has held the chair in communication science at the University of Hohenheim (Stuttgart) since 2013, focusing particularly on interactive media and online communication. This discipline explores the effects of online communication on individuals, organisations, media and society. 2019 saw the publication of his latest book, entitled "Algorithmisch personalisierte Nachrichtenkanäle – Begriffe, Nutzung, Wirkung" (Schweiger, Weber, Prochazka, Brückner).
www.kowi.uni-hohenheim.de