Instagram (Finally) Bans Graphic Images Of Self-Harm
The social media platform made the decision after objections were raised in Britain, following the suicide of a teen whose father said the photo-sharing platform had contributed to her decision to take her own life.
Instagram chief Adam Mosseri said the platform is making changes to its content rules "to keep the most vulnerable people who use Instagram safe."
"We need to do more to consider the effect of these images on other people who might see them. This is a difficult but important balance to get right," he said in a statement on Thursday.
The company said it won't allow graphic images of self-harm. It also said it won't show non-graphic, self-harm related content in search or through hashtags, nor will it recommend such content to its users.
However, Mosseri said the company won't ban non-graphic, self-harm content entirely because "we don't want to stigmatise or isolate people who may be in distress and posting self-harm related content as a cry for help."
The call for changes was backed by the British government after the family of 14-year-old Molly Russell found material related to depression and suicide on her Instagram account after her death in 2017.
Her father, Ian Russell, said he believed social media apps "helped kill my daughter". He wrote to Facebook (which owns Instagram), Snapchat, Pinterest, Apple and Google, pleading with them to act.
If you need help in a crisis, call Lifeline on 13 11 14. For further information about depression, anxiety, suicidal thoughts or self-harm, contact beyondBlue on 1300 22 4636 or talk to your GP, local health professional or someone you trust.