British police AI keeps tagging desert photos as ‘nudes’

02:11 PM December 19, 2017

INQUIRER.net stock photo

More and more institutions have started to see the benefits of integrating artificial intelligence into their operations. Not only does it help speed up various processes, it also frees up personnel to handle other tasks.

The British Metropolitan Police Service (Met) uses AI to scan suspects’ phones and computers for images of child abuse. But the AI is not perfect and has been known to flag pictures of deserts as nude photos, reports The Telegraph.


“Sometimes it comes up with a desert and it thinks it’s an indecent image or pornography,” said Mark Stokes, head of digital and electronics forensics.

He added, “For some reason, lots of people have screen-savers of deserts and it [the AI] picks it up thinking it is skin color.”
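The Met has not published how its classifier works, but the failure Stokes describes is easy to reproduce with a naive skin-tone heuristic: sandy desert colors fall inside the RGB ranges that simple skin-detection rules look for. The sketch below is purely illustrative (the rule and the color swatches are assumptions, not the Met’s actual system):

```python
# Illustrative only: the Met's real classifier is not public.
# This uses a crude RGB skin-color rule, a common baseline heuristic,
# to show why desert sand can be mistaken for skin.

def looks_like_skin(r, g, b):
    """Crude per-pixel skin test on RGB values (0-255)."""
    return (r > 95 and g > 40 and b > 20
            and r > g and r > b
            and (r - min(g, b)) > 15)

def skin_pixel_ratio(pixels):
    """Fraction of pixels the rule labels as skin."""
    hits = sum(1 for p in pixels if looks_like_skin(*p))
    return hits / len(pixels)

# Hypothetical swatches for demonstration.
desert = [(210, 180, 140), (194, 154, 108), (222, 184, 135)]  # sandy tones
sky    = [(100, 150, 235), (90, 140, 220), (120, 160, 240)]   # blue tones

print(skin_pixel_ratio(desert))  # every sandy pixel passes the "skin" rule
print(skin_pixel_ratio(sky))     # no blue pixel does
```

A desert screensaver dominated by sand-colored pixels would score high on a detector like this, which is consistent with Stokes’ explanation that the system “picks it up thinking it is skin color.”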


The Met decided to let AI handle sorting through illicit photos to spare police officers psychological trauma. According to the Met, forensic specialists suffer continual psychological strain from looking through indecent images throughout their careers.

“You can imagine that doing that year-on-year is very disturbing,” said Stokes.

The Met has drawn up plans to enlist help from Silicon Valley companies such as Google, Microsoft and Amazon to train its AI to better detect abusive and indecent images. Alfred Bayle/JB







© Copyright 1997-2020 INQUIRER.net | All Rights Reserved
