Newsroom

Online Misinformation Inoculation Trialled

LONDON / AGILITYPR.NEWS / January 18, 2024 / New behavioural research from Thinks Insight & Strategy makes clear the threat that disinformation poses to the UK General Election, and what can be done about it.

 

Thinks’ Applied Behavioural Science Team ran an online Randomised Controlled Trial (RCT) with 1,650 regular users of social media to identify the threat to the UK from election-related disinformation (content designed to undermine trust in the UK’s electoral systems and processes, or its political leaders and parties), and to test potential ways to combat that threat using a fully interactive simulation of a social media site. We used focus groups and nationally representative polling to explore the results of the RCT further. 

 

Key findings: 

1) Election-related disinformation poses a threat to the UK’s forthcoming general election 

  • Experiment participants were as likely to amplify electoral disinformation as other types of false information.  
  • This readiness to engage with, and therefore amplify and spread, disinformation questioning the integrity of our electoral system and politics is supported by nationally representative survey findings: large minorities are open to the idea that UK election results should be questioned, and there is a meaningful level of belief that our elections may be rigged. 
  • Asked to say which statement came closest to their view:  
    • 38% of UK adults selected “it is acceptable to question the validity of the results of elections in the UK”, while 54% selected “it is important that we respect the results of elections in the UK”. 
    • 30% selected “elections in the UK are often manipulated and/or rigged”, while 60% selected “elections in the UK are free and fair”. 

 

2) Deepfake content of politicians appears particularly powerful 

  • An audio deepfake purporting to capture Keir Starmer berating his colleagues was the most amplified of all the election-related disinformation tested, and the second most amplified of all the disinformation tested. 
  • Over 40% of participants in the control arm of the experiment amplified the deepfake by reacting to it in the social media simulation. 

 

3) Playing a 45-second ‘inoculation game’ meaningfully and significantly reduces amplification of disinformation 

  • A third of participants in the experiment were asked to complete a short six-question quiz game designed to help them identify the signs of disinformation.  
  • Compared to the control arm of the experiment, inoculation reduced the odds that a participant would react to disinformation by 42% (a worked example of what this means in practice follows this list). 
  • Inoculation appears powerful against deepfakes, with the rate of reaction to the Keir Starmer deepfake falling by over 10 percentage points in the inoculation arm compared to the control. 
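
To illustrate what a 42% reduction in odds means in practice, here is a short worked example. The 42% figure is the study’s reported effect; the 42% control-arm reaction rate used as input is an assumption chosen to mirror the “over 40%” deepfake figure above, and the arithmetic is illustrative rather than a reproduction of the study’s analysis.

```python
# Illustrative arithmetic only. The 42% odds reduction is the reported
# effect; the 42% control-arm reaction rate is an assumed input chosen
# to mirror the "over 40%" deepfake figure above.

def apply_odds_reduction(control_rate: float, odds_reduction: float) -> float:
    """Convert a control-arm reaction rate into a treated-arm rate,
    given a proportional reduction in the odds of reacting."""
    control_odds = control_rate / (1 - control_rate)    # odds = p / (1 - p)
    treated_odds = control_odds * (1 - odds_reduction)  # cut the odds by 42%
    return treated_odds / (1 + treated_odds)            # back to a rate

control_rate = 0.42  # assumed control-arm reaction rate
treated_rate = apply_odds_reduction(control_rate, 0.42)
print(f"control: {control_rate:.1%}, inoculated: {treated_rate:.1%}, "
      f"drop: {(control_rate - treated_rate) * 100:.1f} percentage points")
# -> control: 42.0%, inoculated: 29.6%, drop: 12.4 percentage points
```

On these assumed numbers, a 42% cut in the odds of reacting corresponds to a drop of roughly 12 percentage points, consistent with the “over 10 percentage points” reported for the deepfake.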

 

4) An optimised inoculation game could have real positive impact ahead of the 2024 general election. But time is running out. 

  • 34% of the UK public are open to playing a short inoculation game to improve their ability to spot election disinformation. 
  • The game tested here was based on versions previously tested by academics and our behavioural team against disinformation in general. It is possible that the impact of an inoculation game ahead of the election could be greatly enhanced if it were further developed to focus solely on election-related disinformation, or even solely on deepfake content. 
  • In the focus groups, participants discussed how the game could have greater impact if it were more competitive and shareable. 

 

Max Mawby, Founder of the Applied Behavioural Science Team at Thinks Insight & Strategy, said: "This research suggests that election-related disinformation is likely to be spread by the UK public at a rate similar to other disinformation. It also demonstrates that the kind of simple inoculation that we know (from previous studies including my own) works on general disinformation also works on election disinformation specifically. It'd be great to see something like this developed and distributed ahead of the General Election." 

 

Ben Shimshon, Co-Founder & CEO of Thinks Insight & Strategy, said: "Max and the team have delivered a really important result here, making clear that there is a genuine threat to the UK from election disinformation, but also showing clearly that even the most pernicious content, such as deepfakes, can be combatted through simple interventions." 

 

Notes to editors: 

 

About this research and project design

 

Design of two interventions to combat the spread of disinformation online 

Drawing on our Applied Behavioural Science Team’s experience, we designed two interventions – the approach most commonly used in the tech industry, and the approach most commonly proposed by academics: 

  • Subject flagging – some posts on a social media site can be flagged with warnings based on the subject or topic of the content (e.g. ‘Artificial Intelligence’) to warn participants that the content could contain disinformation. This is similar in style to efforts by many of the social media platforms themselves. 
  • Inoculation – a short six-question interactive quiz with feedback, designed to train the participant to spot disinformation and understand manipulation techniques (e.g. conspiracy, impersonation, trolling, emotionally triggering language). This is inspired by academic research, most notably by Gordon Pennycook, David Rand and Sander van der Linden (a sketch of how such a quiz item might be structured follows this list). 
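
To make the quiz format concrete, the sketch below shows one way a single inoculation item could be represented. The field names and the sample item are hypothetical illustrations, not the study’s actual materials.

```python
from dataclasses import dataclass

@dataclass
class QuizItem:
    """One inoculation quiz question: the participant judges a post,
    then receives feedback naming the manipulation technique used."""
    post_text: str           # the fabricated social media post shown
    is_disinformation: bool  # the correct answer
    technique: str           # e.g. "impersonation", "emotionally triggering language"
    feedback: str            # explanation shown after the participant answers

# A hypothetical item, not taken from the study's materials.
example = QuizItem(
    post_text="BREAKING: insider 'proof' that postal ballots were binned",
    is_disinformation=True,
    technique="emotionally triggering language",
    feedback="Urgent, emotive claims from unnamed insiders are a common "
             "manipulation technique; look for a verifiable source.",
)
```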

 

We tested these interventions in an experiment with 1,650 participants, conducted between 16 and 30 November 2023. 

  • The RCT deployed a fully interactive simulation of a social media site – participants could react (for example liking or loving a post), comment, share and report. We were delighted to work with the experimentation platform Gorilla to build and deploy this simulation. 
  • Participants were exposed to 15 genuine posts and 15 disinformation posts. Our primary measure was how likely respondents were to amplify the content by clicking one of the ‘reactions’ (like, thumbs up, frown, etc.).  
  • The experiment focused on the effectiveness of the current leading type of choice-architecture intervention – flagging or tagging posts (as practised by Meta, X and TikTok) – and of inoculation, or pre-bunking, at reducing the interactions that spread mis/disinformation about the democratic process online, compared to the absence of any intervention (a sketch of how an effect like the reported 42% odds reduction can be derived from such data follows this list). 
  • We tested a variety of disinformation, including content covering the electoral process, national statistics and brands, as well as varying tones (seriousness, humour, anger) and political leanings (left to right). Mis/disinformation posts were inspired by real-life stories, with original sources anonymised/adapted.   
  • Participants were randomly assigned to one of three groups: 
    • Control – saw the 30 posts with no additional information. 
    • Subject flagging – some of the posts were flagged with warnings based on the subject or topic of the content (e.g. ‘Artificial Intelligence’) to warn participants that the content could contain disinformation. 
    • Inoculation – participants completed the short six-question interactive quiz with feedback described above. 
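
The release reports the inoculation effect as a reduction in odds. As a sketch of how such a figure can be derived, the example below computes a simple two-arm odds ratio; all counts are invented for illustration, and the study’s actual statistical model is not described in this release.

```python
# Hypothetical analysis sketch: an odds ratio for "reacted to a
# disinformation post" between the control and inoculation arms.
# The counts are invented for illustration only.

reacted_control, n_control = 231, 550   # assumed: control arm
reacted_inoc, n_inoc = 163, 550         # assumed: inoculation arm

odds_control = reacted_control / (n_control - reacted_control)
odds_inoc = reacted_inoc / (n_inoc - reacted_inoc)

odds_ratio = odds_inoc / odds_control
print(f"odds ratio: {odds_ratio:.2f} "
      f"(odds reduced by {1 - odds_ratio:.0%})")
# -> odds ratio: 0.58 (odds reduced by 42%)
```

An odds ratio of 0.58 between arms corresponds to the reported 42% reduction in the odds of reacting.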


We then further explored the context and results of our RCT in:  

  • 2 x focus groups with 6 participants in each:  
    • Conducted online, lasting 75 minutes.  
    • Recruited from two locations (London and Stoke-on-Trent).  
    • 1 x group of election trusters (agree that elections are free and fair in the UK, election results should be trusted and [another statement]).  
    • 1 x group of election sceptics (agree that elections are often manipulated or rigged, it’s acceptable to question election results in the UK and [another statement]). 
  • 2 x nationally representative surveys, each with a sample of at least 2,000 respondents:  
    • Survey 1 – fieldwork 13-14 December 2023. 
    • Survey 2 – fieldwork 13-14 January 2024.  

 

END

About Us

Thinks is a global insight and strategy consultancy. Our pioneering research skills and outstanding thinking help our clients make better decisions, drive positive behaviour, communicate more persuasively, and engage more effectively. Thinks’ Applied Behavioural Science team combine scientific rigour with practical and creative thinking to identify, explain and influence the behaviours that matter for your organisation. Thinks is majority owned by an Employee Ownership Trust, an MRS Company Partner, ESOMAR accredited, and a certified B-Corp. 

Contacts

Caitlin Murphy

Communications Manager

hello@thinksinsight.com

West Wing, Somerset House, London, WC2R 1LA

Phone: +44 (0)20 7845 5880

https://thinksinsight.com