AI “Sex-Offender Detector” Mistakes Deserts for Porn


Over the past year, we have seen a boom in artificial intelligence (AI) technology. From the good (choosing viable IVF embryos) to the bad (designing inspirational posters) to the strange (coming up with quaint – sometimes rude – British place names), its success has been a bit of a mixed bag.

The latest news in the world of AI comes from London, UK, where the city’s Metropolitan Police have announced plans to use artificial intelligence to scan electronic devices for images of child abuse. They say the technology should be ready for use within “two to three years”.

As of right now, however, there is just one small hiccup – the software mistakes desert landscapes for nude photographs. Presumably, it’s because of the color.

The Met already uses a less refined form of image recognition software. The problem, as Mark Stokes, the Metropolitan Police’s head of digital and electronic forensics, explained in an interview with The Telegraph, is that while the software can pick up evidence of certain forms of criminal activity (guns, drugs, and money), it has a much harder time identifying pornographic images and videos.

This means it’s up to police officers themselves to go through the indecent images and grade them for different sentencing – a grueling and psychologically stressful exercise, especially given the scale of the task. According to The Telegraph, the Met had to search through 53,000 devices for incriminating images in 2016. In February, a leading police officer called the levels of recorded child sex offenses in the country “staggering”.

“You can imagine that doing that for year-on-year is very disturbing,” Stokes told The Telegraph.

Fortunately, technological improvements could mean the task is passed on to machines, which won’t be psychologically affected by the job. The British police force is currently working with Silicon Valley providers (such as Google and Amazon) to create AI technology advanced enough to identify abusive imagery, but there are still a few glitches to iron out.

“Sometimes it comes up with a desert and it thinks it’s an indecent image or pornography,” Stokes explained at the (ISC)2 Secure Summit in London. That’s problematic because desert landscapes are a popular screensaver and wallpaper choice.

“For some reason, lots of people have screen-savers of deserts and it picks it up thinking it is skin color.”
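To illustrate why a color-driven filter can trip over sand, here is a minimal, purely hypothetical Python sketch of a naive skin-tone detector built on classic hand-tuned RGB thresholds. It is not the Met’s or its vendors’ actual system; it simply shows that a flat desert-sand image lands squarely inside a typical “skin color” range.

```python
import numpy as np

def skin_pixel_fraction(rgb_image: np.ndarray) -> float:
    """Return the fraction of pixels matching a simple RGB skin-color rule.

    Uses classic hand-tuned thresholds of the kind found in toy skin-detection
    demos -- illustrative only, not any production classifier.
    """
    r = rgb_image[..., 0].astype(int)
    g = rgb_image[..., 1].astype(int)
    b = rgb_image[..., 2].astype(int)
    spread = rgb_image.max(axis=-1).astype(int) - rgb_image.min(axis=-1).astype(int)
    is_skin = (
        (r > 95) & (g > 40) & (b > 20)   # bright, warm pixel
        & (spread > 15)                  # channels not too uniform (not gray)
        & (np.abs(r - g) > 15)
        & (r > g) & (r > b)              # red-dominant, as skin tones tend to be
    )
    return float(is_skin.mean())

# Synthetic "desert wallpaper": a flat field of desert-sand pixels (RGB 237, 201, 175).
desert = np.zeros((480, 640, 3), dtype=np.uint8)
desert[:] = (237, 201, 175)

print(f"Pixels flagged as skin-colored: {skin_pixel_fraction(desert):.0%}")  # prints 100%
```

Real systems rely on learned classifiers rather than hand-tuned color rules, but the sketch shows how easily sandy hues can fall inside a “skin” color range and confuse a filter that leans too heavily on color.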

Source: The Telegraph UK
