Category: Misinformation

Potential Source of Harm: Political - Election Interference

Updated April 16, 2024


Nature of Harm

Election interference involves the dissemination of AI-generated content (e.g. text, images, video or audio) that is intended to alter the behavior of voters in democratic elections. It can involve fabricated or misleading content, as well as content whose source is misidentified.


Communication of false information in elections is a problem that has existed for centuries, and in the modern era false information has been communicated at scale without the use of AI. However, the availability of generative AI tools (including LLM-based text generation) significantly increases the ability of malicious actors to produce misleading content at scale, including persuasive fabricated images, videos and audio recordings of candidates.


Techniques similar to those used for election interference can also be used to alter public opinion for various other political purposes. This is identified separately in our harms register, and we plan to add a separate page on it.


For a bit of educational fun, you can take this Misinformation Susceptibility Test from the University of Cambridge.


Regulatory and Governance Solutions

Election interference is a very difficult problem to address, because most democratic countries place fairly limited restrictions on access to mass media, including by malicious actors. Regulatory and governance approaches to date have generally taken a few main forms:

  • obligations on Internet platforms to restrict access to potentially misleading content, such as those under the EU Digital Services Act and UK Online Safety Act 2023
  • obligations on candidates and political advertisers to avoid or identify AI-generated content
  • general obligations to identify AI-generated content, such as under Article 52(3) of the EU AI Act (a minimal labeling sketch follows this list)
  • general prohibitions on AI-generated content and "deepfakes" (AI-generated content not identified as such and typically intended to mislead), such as:
    • US Federal Communications Commission ruling prohibiting use of AI-generated voices in unsolicited "robocalls" (February 2024)
    • US Federal Trade Commission rule prohibiting impersonation of governments and businesses, and proposed rule extending the prohibition to impersonation of individuals (February 2024)
    • those emerging in various US states.
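
To make the identification obligations above more concrete, the sketch below shows one very simple way a generation tool could attach a machine-readable "AI-generated" label to an image's metadata. This is a minimal illustration only: the metadata keys and file names are our own assumptions, and real disclosure schemes (such as C2PA content credentials) rely on signed, tamper-evident manifests rather than plain metadata, which can be stripped trivially.

```python
# Illustrative sketch only: attaching a plain-text "AI-generated" disclosure
# label to a PNG's metadata with Pillow. The label keys are hypothetical;
# plain metadata like this is easy to strip, which is one reason stronger
# provenance standards exist.
from PIL import Image, PngImagePlugin

def save_with_ai_label(image: Image.Image, path: str, generator: str) -> None:
    """Save `image` as a PNG carrying simple AI-disclosure metadata."""
    meta = PngImagePlugin.PngInfo()
    meta.add_text("ai-generated", "true")     # hypothetical disclosure key
    meta.add_text("ai-generator", generator)  # hypothetical disclosure key
    image.save(path, pnginfo=meta)

def read_labels(path: str) -> dict:
    """Return the PNG text chunks, which would include any disclosure labels."""
    with Image.open(path) as img:
        return dict(img.text)

if __name__ == "__main__":
    img = Image.new("RGB", (64, 64), (128, 128, 128))  # stand-in for generated output
    save_with_ai_label(img, "generated.png", generator="example-model-v1")
    print(read_labels("generated.png"))
```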


However, these measures are unlikely, on their own, to significantly deter malicious actors seeking to influence elections.


As with other types of AI misinformation threat, it is likely to be crucial that populations are well-educated about the risks of misinformation, and therefore less likely to trust inaccurate content. For example, Bellingcat has produced useful guidance on how to avoid being deceived, and The Future US is planning an advertising campaign to inform US voters about misinformation.


Technical Solutions

Technical solutions to content-based election interference are challenging, not least because AI-generated content (especially text) can be very difficult to identify reliably; a simple illustration of one detection heuristic, and its limitations, follows the list below. However, there are some useful corporate initiatives:

  • In February 2024, a group of 20 leading technology companies (including Microsoft, Meta, Google, Amazon, IBM, Adobe, Arm, OpenAI, Anthropic, Stability AI, Snap, TikTok and X) announced an agreement to combat election-related misinformation.
  • Microsoft has announced a set of technical "tools and tactics" for dealing with election interference.
  • Former Google CEO Eric Schmidt has proposed a 6-point plan for fighting election misinformation.
  • In February 2024, Meta announced a team focused on addressing disinformation and other AI-related harms in connection with the June 2024 European Parliament elections.
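
As a concrete illustration of why detection is hard, the sketch below applies a crude perplexity heuristic: text that a language model finds very "predictable" is sometimes flagged as possibly machine-generated. This is a minimal sketch under our own assumptions (GPT-2 as the scoring model and an arbitrary threshold), not anyone's production detector; heuristics like this produce frequent false positives and are easily evaded by light paraphrasing.

```python
# Illustrative sketch only: a crude perplexity-based heuristic for flagging
# text that *might* be machine-generated. Requires the `transformers` and
# `torch` packages. Real detectors are more sophisticated and still unreliable.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

def perplexity(text: str) -> float:
    """Return the perplexity of `text` under GPT-2 (lower = more 'predictable')."""
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        # Passing labels=input_ids makes the model return the average cross-entropy loss.
        outputs = model(**inputs, labels=inputs["input_ids"])
    return torch.exp(outputs.loss).item()

if __name__ == "__main__":
    THRESHOLD = 25.0  # arbitrary, illustrative cut-off
    sample = "Voting is an important part of the democratic process."
    score = perplexity(sample)
    # A low score only *suggests* machine generation; it proves nothing on its own.
    print(f"perplexity={score:.1f}, flagged={score < THRESHOLD}")
```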


Start-ups have developed tools that assist in identifying misinformation, e.g.:

  • Blackbird AI provides solutions to "identify and measure the manipulation of public and social perception".
  • Tremau provides tools that help Internet platforms moderate restricted content, including content subject to the EU Digital Services Act.


Government and Private Entities

Governments and election bodies around the world are taking a variety of steps to address election interference, as are a large number of private entities (including NGOs and political groups).