YouTube will require creators to disclose AI-generated content

Follow tech news long enough and you'll notice how often people gush about the pace of artificial intelligence. After all, ChatGPT only launched last year, yet it sparked a remarkably fast AI revolution worldwide.

This year, text-generating chatbots gave way to video-generating ones, impressing many and horrifying others. In response, YouTube has added new rules for AI-generated videos.

READ: US regulates AI voice calls

The globally acclaimed video hosting platform announced it will require creators to disclose whether their clips contain AI-generated assets.

How will YouTube’s new rules work?


The official YouTube support page says the platform will require creators to disclose realistic content that viewers could mistake for a real person, place, or event.

Specifically, content makers must disclose whether they used generative AI or other synthetic media to create such content. Its Help Center shares the following examples:

Meanwhile, YouTube doesn’t require creators to disclose content that is unrealistic or animated, such as special effects. Here is another list of examples from the Help Center:

AI disclosures will become an additional step when uploading content, but the other steps remain unchanged. Here's how uploading YouTube content will work following this development:

  1. Go to YouTube Studio.
  2. Follow the steps to upload content.
  3. Select Yes to the disclosure questions in the Details section under Altered Content.
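For creators who upload programmatically rather than through YouTube Studio, the same disclosure can be set via the YouTube Data API. Below is a minimal sketch of how an upload request body with the disclosure flag might be assembled; the `containsSyntheticMedia` status field is my assumption about the API's disclosure flag, so verify it against the current `videos.insert` reference before relying on it:

```python
def build_upload_body(title: str, description: str, altered_content: bool) -> dict:
    """Assemble a videos.insert request body with the AI-disclosure flag.

    `containsSyntheticMedia` is assumed to be the Data API's equivalent of
    the "Altered Content" question in YouTube Studio -- check current docs.
    """
    body = {
        "snippet": {"title": title, "description": description},
        "status": {"privacyStatus": "private"},
    }
    if altered_content:
        # Answering "Yes" to the disclosure question in the Details section
        body["status"]["containsSyntheticMedia"] = True
    return body


# Example: a clip that contains realistic AI-generated elements
body = build_upload_body("Demo clip", "Includes AI-generated scenes", True)
print(body["status"]["containsSyntheticMedia"])
```

An actual upload would pass this body to `videos().insert` through an authenticated `google-api-python-client` service object, together with the media file.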

If a creator made a YouTube Short using the platform's Dream Track, Dream Screen, or other built-in AI effects, they don't need to disclose anything themselves: YouTube labels content made with its own generative tools automatically.

Disclosing AI content adds an “Altered or synthetic content” label to the video. YouTube says the labels will roll out to all viewers in the weeks ahead.
