Algorithms to combat hate speech online
This article was published in our October 2017 newsletter.
Communication on social media has acquired a bitter aftertaste in recent years: users increasingly insult other people or offend entire population groups. Lawmakers in Germany are now taking tougher action against illegal content such as sedition, harassment and defamation, and are calling on the operators of major communication platforms such as YouTube, Facebook and Twitter to follow suit. Since 1 October 2017, these operators have been responsible for ensuring that such instances of hate speech are deleted or blocked within 24 hours of a user complaint. Yet even hate-filled comments that are not illegal poison the climate in society.
"It is not a question of censorship"
Image: Hate-filled comments are disseminated increasingly often and quickly online. (© iStock)
Dr Uwe Bretschneider, a business information systems researcher at Martin Luther University Halle-Wittenberg, has been monitoring this trend for many years as an expert in cyberbullying. When he noticed roughly five years ago that not only personal abuse but also hate campaigns against groups such as refugees were on the rise, he decided to develop, as part of his doctoral studies, a program that helps filter out such hate-filled comments.
"The program analyses the comments and searches for words and word groups that are filed in a database", explains Bretschneider. What is special about the software is its ability to recognise the context in which an insult is used. How these individual words are linked to persons or groups is described in an algorithm, which also allows the software to detect the person or persons at whom the abuse is directed. "It is not a question of censorship", stresses Bretschneider, "but of understanding how opinions are expressed."
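The idea described here can be illustrated with a minimal sketch. Note that the word lists, names and matching rules below are purely illustrative assumptions; Bretschneider's actual database and algorithm are not published in this article. The sketch captures only the basic principle: flag a comment when an offensive term occurs in the context of an identifiable target group.

```python
import re

# Illustrative placeholder lexicons -- NOT Bretschneider's actual word lists.
OFFENSIVE_TERMS = {"idiots", "vermin", "scum"}
TARGET_GROUPS = {"refugees": "refugees", "immigrants": "immigrants"}

def detect_hate(comment: str):
    """Simple lexicon-plus-context check: flag a comment only when an
    offensive term co-occurs with a target group, and report the target."""
    tokens = re.findall(r"[a-zäöüß]+", comment.lower())
    offensive = [t for t in tokens if t in OFFENSIVE_TERMS]
    targets = [TARGET_GROUPS[t] for t in tokens if t in TARGET_GROUPS]
    if offensive and targets:
        return True, targets[0]  # hate speech detected, directed at this group
    return False, None           # no offensive term, or no identifiable target
```

A purely offensive word without a target, or a target mentioned without abuse, is not flagged, which is the "context" aspect the article emphasises.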
Controlling hate speech
If Bretschneider’s software detects hate speech, the comment in question can either be deleted automatically or submitted to a moderator to be checked. This could help smaller platforms in particular – such as newspaper discussion forums – to work more efficiently and filter out any comments that violate the rules. Nowadays, some editorial offices simply do not have enough staff to handle the deluge of hate-filled comments, with the result that they have had to restrict or even shut down some discussion forums.
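The two handling options mentioned above, automatic deletion or review by a moderator, can be sketched as a simple routing step. The confidence threshold and function names here are assumptions for illustration, not part of the software described in the article.

```python
from collections import deque

moderation_queue = deque()  # flagged comments awaiting human review

def handle_comment(comment: str, is_hate: bool, confidence: float,
                   auto_delete_threshold: float = 0.9):
    """Route a comment: publish if not flagged, delete automatically when
    the classifier is confident, otherwise queue it for a moderator.
    The threshold value is an illustrative assumption."""
    if not is_hate:
        return "published"
    if confidence >= auto_delete_threshold:
        return "deleted"
    moderation_queue.append(comment)
    return "queued"
```

Routing borderline cases to a human moderator rather than deleting them outright is what would let short-staffed editorial teams concentrate their effort on genuinely ambiguous comments.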
Uwe Bretschneider is not yet satisfied with the accuracy of his software. To test it, he trawled through Twitter and online forums himself in search of hate-filled comments. "The software missed 40 percent of the comments I discovered", reports the IT expert. He hopes to improve the tool, for example by collaborating with linguistic specialists. The hatred that people display on the Internet cannot be eliminated entirely by such technical solutions – but it can be better controlled.