uni.news
Unmasking Social Bots
Research project at Bielefeld University funded by the Volkswagen Stiftung
Social Bots are believed to have influenced public opinion in the 2016 U.S. presidential election. And before the European Parliamentary election, an EU Commissioner warned of disinformation campaigns being waged with Social Bots. Social Bots are computer programs designed specifically to communicate over social media. What impact are Social Bots having on societal discourse? And how might technical systems be used to combat these Bots? Researchers from Bielefeld University, the University of Applied Sciences Bielefeld, and the Australian National University are working on these questions as part of a research project funded by the Volkswagen Stiftung (Volkswagen Foundation).
“Social media can indeed facilitate political dialogue in the public sphere,” says Florian Muhle, a member of the Faculty of Sociology. “But the danger with social media is that people exchange ideas with other like-minded individuals, thereby reinforcing the opinions they already hold. Automated systems in social media can intensify the development of such ‘echo chambers’ that supply users with ideas which affirm their conception of the world.”
“The challenge is to distinguish Social Bots from other accounts. In order to accomplish this in a reliable manner, we work in an interdisciplinary team with members from the social sciences as well as technology and engineering,” says Dr. Ole Pütz, a sociologist who is part of Philipp Cimiano’s research group and who also serves as Coordinator of the U3B project. “This enables us to combine qualitative methods from the social sciences with technical approaches from machine learning. In addition to this, we also use psychological experiments to study the impact of Social Bots,” Pütz explains. In these experiments, study participants are shown examples of posts published on Twitter and are asked to judge whether each post was written by a human or a Bot. They also have to evaluate how convincing or how emotional they find the individual posts to be.
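To give a rough idea of the kind of machine-learning approach mentioned here, the following Python sketch trains a minimal text classifier that separates bot-like from human-written posts. It is an illustrative assumption only: the example posts, labels, features, and model are hypothetical and are not taken from the U3B project.

    # Illustrative sketch only; not the method used by the U3B project.
    # Assumes a small set of posts labelled "bot" or "human" by annotators.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Hypothetical training data: post texts with human-assigned labels.
    posts = [
        "Breaking!!! Share NOW before it gets deleted!!!",
        "Had a great chat with my neighbour about the election today.",
        "RT this if you agree! #truth #wakeup",
        "Reading the party manifestos this weekend - any recommendations?",
    ]
    labels = ["bot", "human", "bot", "human"]

    # TF-IDF features plus logistic regression: a simple, common baseline
    # for distinguishing automated from human-written posts.
    classifier = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
    classifier.fit(posts, labels)

    # Predict a label and a confidence for a new, unseen post.
    new_post = "Unbelievable scandal!!! RT immediately!!!"
    print(classifier.predict([new_post])[0])
    print(classifier.predict_proba([new_post]).max())

In practice, such classifiers would be trained on far larger annotated datasets and combined with the qualitative analyses described above; this sketch only shows the basic shape of a text-based detection pipeline.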
“People react to Social Bots in different ways: some take them seriously, while others see right through them. These tests are meant to help us distinguish different types of users,” explains Dr. Florian Muhle. “This can help to create customized information and assistance for dealing with Social Bots based on the needs of different types of users.”
By better understanding Social Bots and how they operate, the team seeks to develop technical systems that detect Social Bot activity and help to build bridges between fragmented communities on the Internet. Such systems could be used during election cycles, for instance, to warn users before they re-post news stories published by Bots.
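As a purely hypothetical illustration of such a warning, the sketch below assumes that a bot-probability score for the original account is already available (for example from a classifier like the one sketched above) and uses an arbitrary threshold; neither the function nor the threshold value comes from the project.

    # Hypothetical sketch of a pre-repost warning; threshold is an assumption.
    BOT_SCORE_THRESHOLD = 0.8  # arbitrary cut-off, not a value from the project

    def warn_before_repost(post_text: str, bot_score: float) -> str:
        """Return a message shown to the user before they re-post a story."""
        if bot_score >= BOT_SCORE_THRESHOLD:
            return ("Warning: this post was likely published by an automated account "
                    f"(bot score {bot_score:.2f}). Do you still want to share it?")
        return "No indication of automated origin was found for this post."

    print(warn_before_repost("Breaking news!!!", 0.93))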
The project is called “Unbiased Bots that Build Bridges (U3B): Technical Systems That Support Deliberation and Diversity as a Chance for Political Discourse.” The Volkswagen Stiftung (Volkswagen Foundation) is funding the project through March 2020 as part of its initiative “Artificial Intelligence and the Society of the Future.”
Further Information:
Project website: http://portal.volkswagenstiftung.de/search/projectDetails.do?ref=95920
“Feeding Systems with Arguments” (press release from 20 April 2018): https://www.cit-ec.de/en/news/feeding-systems-arguments