Tech firms unite against deceptive AI in elections

05:42 PM February 19, 2024

Artificial intelligence promises a new technological revolution that will enhance our productivity and push human potential further than ever. However, AI can also become a powerful tool for spreading falsehoods and disrupting essential systems. In particular, malicious actors could use it to spread fabricated, disparaging content about election candidates.

Elections are the backbone of democracies because they allow us to change our societies based on a majority vote. That is why, at the Munich Security Conference, 20 prominent technology companies announced the “Tech Accord to Combat Deceptive Use of AI in 2024 Elections,” a joint pledge to fight AI-generated election misinformation.

How will tech firms fight against AI in elections?


The Tech Accord to Combat Deceptive Use of AI in 2024 Elections brings together 20 large tech corporations, including Adobe, Google, IBM, Microsoft, and OpenAI. The signatories outlined eight commitments:

  1. Developing and implementing technology to mitigate risks related to Deceptive AI Election Content, including open-source tools where appropriate.
  2. Assessing models in scope of this accord to understand the risks they may present regarding Deceptive AI Election Content.
  3. Seeking to detect the distribution of this content on their platforms.
  4. Seeking to appropriately address this content when detected on their platforms.
  5. Fostering cross-industry resilience to Deceptive AI Election Content.
  6. Providing transparency to the public regarding how each company addresses it.
  7. Continuing to engage with a diverse set of global civil society organizations and academics.
  8. Supporting efforts to foster public awareness, media literacy, and all-of-society resilience.

READ: Mis/disinformation tagged as No.1 threat to global stability


The accord covers AI-generated audio, video, and images that fake the appearance, voice, or actions of political candidates and other key figures. Its scope also extends to AI-generated content that spreads false information about voting logistics.

What are the accord’s critical pillars?

Microsoft Vice Chair and President Brad Smith explained the accord further in a blog post, listing and elaborating on three critical pillars:

  1. First, the accord’s commitments will make it harder for bad actors to use legitimate tools to create deepfakes, deploying safety measures such as metadata and watermarking in AI services and AI-generated content.
  2. Second, the accord brings the tech sector together to detect and respond to deepfakes in elections. Microsoft will deploy its AI for Good Lab and Threat Analysis Center to improve detection, and its Microsoft-2024 Elections webpage will let political candidates report deepfakes of themselves.
  3. Third, the accord will help advance transparency and build societal resilience to deepfakes in elections. Microsoft will publish an annual transparency report and support public awareness campaigns that help people spot deceptive AI in elections.
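The “metadata and watermarking” measures in the first pillar amount to provenance signals cryptographically bound to generated content, so platforms can later check whether a file really came from a given AI service and whether it has been altered. The sketch below is purely illustrative: real provenance standards such as C2PA Content Credentials use public-key signatures and embedded manifests, whereas this toy version uses a shared HMAC key, and all names (`attach_provenance`, `verify_provenance`, the demo key) are assumptions, not any signatory’s actual API.

```python
import hashlib
import hmac
import json

# Hypothetical signing key held by the AI provider (illustration only;
# real systems use public-key cryptography, not a shared secret).
SECRET_KEY = b"demo-signing-key"

def attach_provenance(content: bytes, generator: str) -> dict:
    """Build a manifest binding generator metadata to the content bytes."""
    manifest = {
        "generator": generator,
        "sha256": hashlib.sha256(content).hexdigest(),
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    manifest["signature"] = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return manifest

def verify_provenance(content: bytes, manifest: dict) -> bool:
    """Check that the signature is valid and the content is unmodified."""
    claimed = {k: v for k, v in manifest.items() if k != "signature"}
    payload = json.dumps(claimed, sort_keys=True).encode()
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, manifest["signature"])
            and hashlib.sha256(content).hexdigest() == manifest["sha256"])

image = b"...synthetic image bytes..."
manifest = attach_provenance(image, generator="example-ai-model")
print(verify_provenance(image, manifest))         # True: untampered
print(verify_provenance(image + b"x", manifest))  # False: content altered
```

The design point this illustrates is why such labels are hard to fake or strip silently: any edit to the content changes its hash, invalidating the signature, which is what lets platforms in pillar two detect tampered or unlabeled synthetic media.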

READ: China suspected of using AI on social media to sway US voters, Microsoft says

It might seem that artificial intelligence only harms societies, but it can also improve government services. Check out my other article to learn more about this AI application.


TOPICS: AI, elections, misinformation

© Copyright 1997-2024 INQUIRER.net | All Rights Reserved
