Headed by University of Liverpool researchers Elinor Carmi and Elena Musi, ‘Developing fake news immunity’ was a session dedicated to exactly what its title promises. As part of their Economic and Social Research Council-funded project, ‘Being Alone Together: Developing Fake News Immunity’, they developed an online chatbot to teach users how to recognise rhetorical strategies that are used to spread misinformation.
The session began with some brief contextual information about misinformation. Using a source from Reuters, it was stated that 59% of misinformation (in the article’s sample of 225 pieces of misinformation) features reconfigured information, ‘where existing and often true information is spun, twisted, recontextualised, or reworked’.
A distinction was also made between misinformation and disinformation: misinformation describes information that may be partially true or false but is misleading overall and can lead to misinterpretation, regardless of the intent behind sharing it. Disinformation, by contrast, describes information that is deliberately misleading or incorrect and is spread with the express intention of deceiving others.
The speakers then described the methodology behind the creation of their Fake News Immunity Chatbot. The team collated 1,135 news pieces on Covid-19 and analysed them for different rhetorical strategies. These were then uploaded to the chatbot, which serves as an educational tool to help users understand argumentative fallacies and provides an introduction to fallacy theory. The aim is to ensure that users will be better able to understand and identify any mis/disinformation they may come across in the future.
Participants in the session were then placed into breakout rooms to use the chatbot and discuss their experience with it in their own groups. The chatbot itself was interesting to use and navigate through, and featured three philosophers who ‘systematically worked on searching [for] the truth through questions and answers’ – Socrates, Aristotle and Gorgias.
The chatbot is quiz-based: users are given a link to a news piece to read through, then answer a series of questions relating to fallacies within the article. Icons of the philosophers mentioned above appear to the left of the interface and, when clicked, provide assistance in answering the current question. When an answer is incorrect, the chatbot explains the relevant fallacy or the reasoning underpinning the correct answer.
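To illustrate the kind of interaction described above, here is a minimal sketch of one quiz turn in Python. This is purely hypothetical: the class names, fields, and wording are my own assumptions for illustration, not the project's actual implementation.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Question:
    """One quiz turn about a fallacy in a linked news piece (hypothetical model)."""
    prompt: str             # the question shown to the user
    options: List[str]      # multiple-choice answers
    correct: int            # index of the right option
    hint: str               # a "philosopher" hint shown on request
    explanation: str        # shown when the user's answer is wrong

def respond(question: Question, choice: int) -> str:
    """Return the chatbot's reply to a user's multiple-choice answer."""
    if choice == question.correct:
        return "Correct! " + question.options[choice] + " is the fallacy at work here."
    # On a wrong answer, explain the reasoning behind the correct option,
    # mirroring the behaviour described in the session.
    return "Not quite. " + question.explanation

# Example turn (invented content for demonstration only).
q = Question(
    prompt="Which fallacy does the headline commit?",
    options=["Hasty generalisation", "False cause", "Strawman"],
    correct=1,
    hint="Gorgias asks: does one event following another prove it caused it?",
    explanation=("'False cause' assumes that because one event follows "
                 "another, the first must have caused the second."),
)

print(respond(q, 1))  # a right answer gets a confirmation
print(respond(q, 0))  # a wrong answer gets the fallacy explained
```

The design choice worth noting is that wrong answers trigger an explanation rather than a simple "incorrect", which is what makes the quiz format teach fallacy theory rather than just test it.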
After a short period during which attendees could interact with the chatbot individually and discuss their experiences in their groups, the session leaders asked the participants as a whole for feedback.
The feedback given was constructive, with many participants noting the benefit of using a chatbot like this as an educational tool to assist with the learning of critical thinking and evaluation skills.
However, it was pointed out that the amount of information presented at times could be overwhelming, with one user mentioning that a slower introduction to the main section of the chatbot would be helpful. Another attendee suggested an improvement to the user interface of the chatbot by recommending that the news pieces used for the tool should appear within the interface itself, rather than opening in a new tab.
In response to one participant’s question about how to broaden the project’s scope of appeal, the session leaders stated that they wanted to encourage users to share the chatbot among their own networks, utilising a word-of-mouth method. Elena and Elinor also noted that adults were a key demographic to be targeted as they are an often missed group when attempting to teach and spread critical thinking skills.
Concluding the workshop, the session leaders disclosed that they were collaborating with local councils in the UK to connect with community gatekeepers such as libraries and other local institutions, in order to introduce the chatbot as an educational tool for communities. The pair also expressed their desire to bring the chatbot into conventional educational settings such as schools.
‘Developing fake news immunity’ was definitely an interesting workshop and continued the valuable discussions that have been taking place during MozFest 2021!
For more details about the ‘Being Alone Together: Developing Fake News Immunity’ project, click here.
To access the Fake News Immunity Chatbot, click here.