YouTube removed over 8 million videos in just 3 months

05:53 PM, April 26, 2018

Image: AFP

YouTube has released a transparency report showing a high volume of inappropriate content being uploaded; however, automated flagging is speeding up the removal process.

It’s easy for the internet to get cluttered with spam and inappropriate content, which means major clean-up work for big internet companies that receive massive amounts of uploads and traffic. YouTube is one of them — the company removed eight million videos in three months.


Seeking more transparency and less spam, Google, which purchased YouTube in 2006, has published an update regarding the ongoing removal of content that violates its policy. The company has released astonishing figures, along with a quarterly report on how Community Guidelines are being enforced.

The eight million videos that have been removed from the popular video sharing platform were “mostly spam or people attempting to upload adult content,” according to Google, “and represent a fraction of a percent of YouTube’s total views during this time period.”


Machines were the first to flag 6.7 million of those videos, and of these, 76% were removed before they received a single view. Automated flagging allows the company to act at scale, and it claims the technology is paying off with high-speed removals both in high-risk, low-volume areas (like violent extremism) and in high-volume areas (like spam).
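A quick back-of-the-envelope check of the report’s figures (using the rounded numbers cited above, so the results are approximations, not YouTube’s exact counts):

```python
# Rounded figures from the transparency report
total_removed = 8_000_000      # videos removed in the quarter
machine_flagged = 6_700_000    # videos first flagged by automated systems

# Share of removals that originated from machine flagging
machine_share = machine_flagged / total_removed
print(f"Machine-flagged share: {machine_share:.1%}")  # ~83.8%

# Of the machine-flagged videos, 76% were removed before a single view
removed_before_view = int(machine_flagged * 0.76)
print(f"Removed before any view: {removed_before_view:,}")  # 5,092,000
```

In other words, roughly five million videos came down before anyone ever watched them.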

More than half of the “violent extremism” videos removed now have fewer than 10 views at the time of takedown, up from eight percent at the beginning of 2017.

Although the deployment of machines may suggest a lesser need for humans, that has not been the case for YouTube. Its systems still rely on human review, and the company has accordingly been busy hiring.

“At YouTube, we’ve staffed the majority of additional roles needed to reach our contribution to meeting that goal. We’ve also hired full-time specialists with expertise in violent extremism, counterterrorism, and human rights, and we’ve expanded regional expert teams,” stated the company’s official blog.

As for this year’s goals, the company is committed to bringing the total number of people working on addressing violative content to 10,000 across Google. Furthermore, there are plans to refine reporting systems and add more data, including data on comments, speed of removal, and policy removal reasons.

For anyone interested in reviewing the numbers, here is the transparency report.




© Copyright 1997-2020 INQUIRER.net | All Rights Reserved
