
YouTube, Telegram respond to IT Ministry’s notice on child sexual abuse material

NewsGram Desk

Google-owned YouTube and encrypted messaging platform Telegram on Saturday responded to the IT Ministry’s notice to remove any kind of child sexual abuse material (CSAM) or face stringent action, saying they have a “zero-tolerance policy” on CSAM and related content.

A YouTube spokesperson said that no form of content that endangers minors is acceptable to the platform.

“We have heavily invested in the technology and teams to fight child sexual abuse and exploitation online, and we take swift action to remove it as quickly as possible,” the spokesperson said.

“In Q2 2023, we removed over 94,000 channels and over 2.5 million videos for violations of our child safety policies. We will continue to work with experts inside and outside of YouTube to provide minors and families the best protections possible,” the YouTube spokesperson added.

The Google-owned platform said it has a strong record of successfully fighting child sexual exploitation.

YouTube also works with the wider industry, offering its expertise and technology, including CSAI Match, its proprietary technology for combating Child Sexual Abuse Imagery, to smaller partners and NGOs.

YouTube also restricts live features, disables comments and limits recommendations on videos that feature minors in potentially risky situations or are categorised as “made for kids”.

Encrypted messaging platform Telegram said it is "always committed" to upholding legal and ethical standards on its platform, particularly in addressing issues related to Child Pornography (CP), Child Sexual Abuse Material (CSAM), and Rape and Gang Rape (RGR) content on the Indian internet.

In a statement, Telegram said that in response to reports of CP/CSAM/RGR content, it maintains a “zero-tolerance policy towards any unlawful activities conducted by users on our platform”.

“We take immediate and stringent action as prescribed by the law of the land in response to any such violations. When CP/CSAM/RGR content is reported, we initiate prompt actions to remove the offending material. Our average response time for removal is 10-12 hours, well within the permissible time limit of 24 hours as stipulated by the aforementioned regulations,” said the company’s spokesperson.

Telegram said it also has a specialised team working to remove all illegal content that falls under the child abuse category.

“Telegram is committed to complying with Section 79 of the Information Technology Act (IT Act) and any related notices issued by State/Union Territory Law Enforcement Agencies (LEAs) for the immediate removal of CP/CSAM/RGR content,” said the company.

The Ministry of Electronics and IT sent the notices to the social media intermediaries on Friday, emphasising the importance of prompt and permanent removal of, or disabling of access to, any CSAM on their platforms.

"The rules under the IT Act lay down strict expectations from social media intermediaries that they should not allow criminal or harmful posts on their platforms. (Unsplash)

"The rules under the IT Act lay down strict expectations from social media intermediaries that they should not allow criminal or harmful posts on their platforms. If they do not act swiftly, their safe harbour under section 79 of the IT Act would be withdrawn and consequences under the Indian law will follow,” said Union Minister of State for Electronics & IT, Rajeev Chandrasekhar.

The ministry warned the three social media intermediaries that any delay in complying with the notices will result in the withdrawal of their safe harbour protection under Section 79 of the IT Act, which currently shields them from legal liability.

X Corp (formerly Twitter) had yet to respond to the Indian government’s notice.

(IANS/SR)
