Instagram extends ban on self-harm images to drawings and memes

  28 October 2019

Instagram has announced plans to extend its ban on self-harm- and suicide-related images to drawings, cartoons and memes.

In February, the social media company stated it was committed to removing all images related to self-harm from the platform.

The move followed the death of 14-year-old Molly Russell, who took her own life in 2017 after viewing graphic self-harm content on Instagram and Pinterest.

Her father, Ian Russell, has stated he believes Instagram was partially responsible for her death.

Mr Russell has said he is “really pleased” by the firm’s latest commitment, explaining that he believes his daughter “entered that dark rabbit hole of depressive suicidal content”.

He claimed that the algorithms used by some online platforms “push similar content” to users depending on what they have already been looking at.

“I think Molly probably found herself becoming depressed. She was always very self-sufficient and liked to find her own answers,” Mr Russell told BBC News.

“I think she looked towards the Internet to give her support and help. She may well have received support and help, but what she also found was a dark, bleak world of content that accelerated her towards more such content.”

Mr Russell outlined that some of the images Molly came across were “as simple as little cartoons”, while others were “much more graphic and shocking”.

According to Instagram, the platform has removed twice as much self-harm- and suicide-related content since the beginning of 2019.

The Facebook-owned company said that between April and June 2019, 834,000 pieces of such content were removed from the platform, 77 per cent of which were not reported by Instagram users.

Instagram chief Adam Mosseri explained that the ban on self-harm- and suicide-related drawings, cartoons and memes will be rolled out gradually.

“It will take time to fully implement... but it’s not going to be the last step we take,” Mosseri stated.

The chief added: “Nothing is more important to me than the safety of the people who use Instagram.”

He said Instagram aims to “strike the difficult balance” between allowing its users to share their mental health stories on the platform “while also protecting others from being exposed to potentially harmful content”.

Andy Burrows, head of child safety online policy at the NSPCC, said that while Instagram has taken “positive” steps towards protecting its users, other companies within the technology industry have been “slow to respond”.

“That is why the government needs to introduce a draft bill to introduce the duty of care regulator by next Easter and commit to ensuring it tackles all the most serious online threats to children,” Burrows said.

 

The Independent

