More and more institutions have started to see the benefits of integrating artificial intelligence into their operations. Not only does it help speed up various processes, but it also frees up personnel to handle other tasks.
The British Metropolitan Police Service (Met) uses AI to scan for images of child abuse on suspects’ phones and computers. But the AI is not perfect and has been known to flag pictures of deserts as nude photos, reports The Telegraph.
“Sometimes it comes up with a desert and it thinks it’s an indecent image or pornography,” said Mark Stokes, the Met’s head of digital and electronics forensics.
He added, “For some reason, lots of people have screen-savers of deserts and it [the AI] picks it up thinking it is skin color.”
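The article does not describe how the Met’s system actually works, but the “skin color” remark hints at why sandy scenes cause false positives: simple color-based skin detection treats warm tan tones much like skin tones. Below is a minimal illustrative sketch, not the Met’s method, using a commonly cited naive RGB skin-color rule; the function names, the synthetic “desert screensaver” pixels, and the 50% flagging threshold are all assumptions for demonstration only.

```python
# Illustrative sketch only: a naive skin-tone heuristic of the kind that can
# confuse sandy desert scenes with skin. This is NOT the Met's actual system.

def looks_like_skin(r, g, b):
    """Crude RGB skin test: warm, reddish pixels with enough color contrast."""
    return (r > 95 and g > 40 and b > 20 and
            max(r, g, b) - min(r, g, b) > 15 and
            abs(r - g) > 15 and r > g and r > b)

def skin_pixel_ratio(pixels):
    """Fraction of pixels the heuristic classifies as skin."""
    hits = sum(1 for (r, g, b) in pixels if looks_like_skin(r, g, b))
    return hits / len(pixels)

# Hypothetical desert screensaver: mostly sandy tan pixels plus some blue sky.
desert = [(210, 180, 140)] * 700 + [(135, 206, 235)] * 300

ratio = skin_pixel_ratio(desert)
print(f"'skin' pixel ratio: {ratio:.0%}")   # the tan pixels (~70%) pass the test
print("flagged as indecent?", ratio > 0.5)  # naive threshold -> false positive
```

Running the sketch flags the desert image because sand-colored pixels fall inside the same warm color range the rule associates with skin, which is why more robust training data from larger tech companies would be needed to tell the two apart.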
The Met decided to let AI handle sorting through illicit photos to spare police officers the psychological trauma. According to the Met, forensic specialists suffer continual psychological strain from looking through indecent images throughout their careers.
“You can imagine that doing that for year-on-year is very disturbing,” said Stokes.
The Met has drawn up plans to enlist the help of Silicon Valley companies such as Google, Microsoft and Amazon to train its AI to better detect abusive and indecent images. Alfred Bayle/JB