Facebook to Take Safety Measures Against Online Suicide Challenges

New Delhi, India, September 8, 2017: At a time when reports of suicides linked to the Blue Whale challenge internet game are sending shock waves through the country, Facebook on Friday said it is working with suicide prevention partners to collect phrases, hashtags and group names associated with online challenges encouraging self-harm or suicide.

"We offer resources to people that search for these terms on Facebook," the social media giant said.

The Blue Whale challenge is said to psychologically manipulate players into performing increasingly dangerous, self-destructive tasks over 50 days before finally taking the "winning" step of killing themselves.

Facebook said it also removes content that violates its Community Standards, which do not allow the promotion of self-injury or suicide.

Starting on World Suicide Prevention Day on September 10, Facebook said it would also connect people in India with information about support groups and suicide prevention tools in News Feed.

"Facebook is a place where people connect and share, and one of the things we have learnt from the mental health partners and academics we have worked with on this issue is that being connected is a protective factor in suicide prevention," said Ankhi Das, Director of Public Policy for Facebook in India, South and Central Asia.

Additional resources about suicide prevention and online well-being will also be added to its Safety Center, Facebook said.

With these resources, people can access tools to resolve conflict online, help a friend who is expressing suicidal thoughts or get resources if they are going through a difficult time.

"We care deeply about the safety of the millions of people in India who use Facebook to connect with the people who matter to them, and recognize there's an opportunity with these tools and resources to connect someone who is struggling with a person they already have a relationship with," Das said.

Facebook's Safety Center also offers guidance for parents, teenagers, educators, and law enforcement officials to start a conversation about online safety, with localized resources and videos available.

People can also reach out to Facebook when they see something that makes them concerned about a friend's well-being.

"We have teams working around the world, 24/7, who review reports that come in and prioritize the most serious reports of suicide. For those who reach out to us, we provide suggested text to make it easier for people to start a conversation with their friend in need," Facebook said.

"We provide the friend who has expressed suicidal thoughts information about local help lines, along with other tips and resources," it added. (IANS)

NewsGram (www.newsgram.com)