Video-sharing giant YouTube removed as many as 7.8 million videos between July and September in its effort to address violative content on the platform.
According to YouTube's latest "YouTube Community Guidelines Enforcement" report, 81 per cent of these videos were first detected by machines, and of those, 74.5 per cent had never received a single view.
"When we detect a video that violates our guidelines, we remove the video and apply a strike to the channel. We terminate entire channels if they are dedicated to posting content prohibited by our Community Guidelines or contain a single egregious violation, like child sexual exploitation," the company said in a statement on Friday.
The company has been using a mix of human reviewers and technology to battle inappropriate content on its platform, and in 2017 it began applying more advanced machine learning (ML) technology to flag content for review by its teams.
A majority of attempted abuse comes from bad actors trying to upload spam or adult content. Over 90 per cent of the channels and over 80 per cent of the videos removed in September were taken down for violating YouTube's policies on spam or adult content.
"Looking specifically at the most egregious but low-volume areas like violent extremism and child safety, our significant investment in fighting this type of content is having an impact.
"Over 90 per cent of the videos uploaded in September and removed for Violent Extremism or Child Safety had fewer than 10 views," the company added. (IANS)