New Artificial Intelligence (AI) Software Can Automatically Detect New Child Sexual Abuse Photos and Videos on Online Networks

London, December 4, 2016: Artificial intelligence software developed by scientists can automatically detect new child sexual abuse photos and videos on online networks and help prosecute offenders.

According to a PTI report, "There are hundreds of searches for child abuse images every second worldwide, resulting in hundreds of thousands of child sexual abuse images and videos being shared every year."

Researchers, including those from Lancaster University in the UK, said the people who produce child sexual abuse media are often abusers themselves.


"Spotting newly produced media online can give law enforcement agencies the fresh evidence they need to find and prosecute offenders," they said.

The new toolkit uses artificial intelligence to automatically detect new or previously unknown child sexual abuse media.

Author Claudia Peersman from Lancaster University said, "Identifying new child sexual abuse media is critical because it can indicate recent or ongoing child abuse."


Peersman said, "And because originators of such media can be hands-on abusers, their early detection and apprehension can safeguard their victims from further abuse."

A number of tools are already available to help law enforcement agents monitor peer-to-peer networks for child sexual abuse media, but they mainly rely on identifying known media.

As a result, these tools cannot assess the thousands of results they retrieve and are unable to spot new media as it appears.


According to PTI, "The Identifying and Catching Originators in P2P (iCOP) Networks toolkit uses artificial intelligence and machine learning to flag new and previously unknown child sexual abuse media."

The new approach combines automatic filename and media analysis techniques in an intelligent filtering module, allowing the software to identify new criminal media and distinguish it from other media being shared, such as adult pornography.
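The report does not describe iCOP's internals, so the following is only a minimal sketch, assuming a generic two-signal filtering design: a filename classifier (character n-grams with logistic regression via scikit-learn) whose score is combined with a placeholder media-content score. The training filenames, the score_media stub, the weights and the threshold are all hypothetical illustrations, not the actual iCOP implementation.

```python
# Hypothetical two-signal filtering sketch: combine a filename-based score
# with a media-content score and flag files whose combined score is high.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Illustrative labelled filenames: 1 = suspicious naming pattern, 0 = benign.
train_names = ["holiday_photos_2015.zip", "family_bbq.mp4",
               "lecture_notes.pdf", "codedname_series_01.avi"]
train_labels = [0, 0, 0, 1]

# Filename analysis: character n-gram TF-IDF features + logistic regression.
filename_model = make_pipeline(
    TfidfVectorizer(analyzer="char", ngram_range=(2, 4)),
    LogisticRegression(),
)
filename_model.fit(train_names, train_labels)

def score_media(path: str) -> float:
    """Placeholder for a visual/content classifier score in [0, 1]."""
    return 0.0  # a real system would run an image/video model here

def flag_candidates(paths, name_w=0.5, media_w=0.5, threshold=0.5):
    """Combine filename and media scores; return paths above the threshold."""
    name_scores = filename_model.predict_proba(paths)[:, 1]
    return [p for p, s in zip(paths, name_scores)
            if name_w * s + media_w * score_media(p) >= threshold]

print(flag_candidates(["family_bbq.mp4", "codedname_series_02.avi"]))
```

The point of the sketch is only the design choice the article describes: neither signal alone has to be decisive, because the filtering module weighs the filename evidence together with the media analysis before flagging a file for review.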

The new system will be particularly useful to law enforcers because it can reveal who is sharing known child sexual abuse media and show other files being shared by those people.

– by NewsGram team with PTI inputs
