
Biased Bots

Research project at The University of Melbourne.

General information

In today's media landscape, emotionally charged topics, including issues related to politics, religion, or gender, can lead to the formation of filter bubbles and echo chambers, and subsequently to strong polarization. An effective way to help people break out of their bubble is to engage with other perspectives. When caught in a filter bubble, however, it can be hard to find a conversation partner with opposing views in one's social surroundings to discuss a particular topic with. Moreover, such conversations can be socially awkward and are therefore often avoided. In this project, we aim to train chat-bots to become highly opinionated on specific topics (e.g., for or against gun laws) and to provide people with the opportunity to discuss political views with a highly biased opponent.

Learn more about the project in the following publications:

  • Tilman Dingler, Ashris Choudhury, and Vassilis Kostakos. 2018. Biased Bots: Conversational Agents to Overcome Polarization. In Proceedings of the 2018 ACM International Joint Conference and 2018 International Symposium on Pervasive and Ubiquitous Computing and Wearable Computers (UbiComp ’18). Association for Computing Machinery, New York, NY, USA, 1664–1668. DOI:https://doi.org/10.1145/3267305.3274189

Author

Used libraries and utilities

License

Biased Bots is published under the MIT License and the GPL v3.