British police AI keeps tagging desert photos as ‘nudes’
More and more institutions have started to see the benefits of integrating artificial intelligence into their operations. Not only does it help speed up various processes, but it also frees up personnel to handle other tasks.
The British Metropolitan Police Service (Met) uses AI to scan for images of child abuse on suspects’ phones and computers. But the AI is not perfect and has been known to flag pictures of deserts as nude photos, reports The Telegraph.
“Sometimes it comes up with a desert and it thinks it’s an indecent image or pornography,” said Mark Stokes, head of digital and electronics forensics.
He added, “For some reason, lots of people have screen-savers of deserts and it [the AI] picks it up thinking it is skin color.”
The Met decided to let AI handle sorting through illicit photos to spare police officers the psychological trauma. According to the Met, forensic specialists suffer continual psychological strain from looking through indecent images throughout their careers.
“You can imagine that doing that for year-on-year is very disturbing,” said Stokes.
The Met has drawn up plans to seek help from tech companies like Google, Microsoft and Amazon to train its AI for better detection of abusive and indecent images. Alfred Bayle/JB