Instagram Will Ban Self-Harm Imagery

At this point, it seems to be culturally crystallized that social media is truly evil, slowly turning us into dysmorphic, clinically depressed narcissists. The catalyst might be a combination of growing public awareness of the lawlessness of the tech lords running our apps and study after study revealing the devastating effects of social media on our health and wellbeing.

Instagram and its partners appear to be aware of their precarious reputation and are taking steps to improve it. Recently, the company behind the app's AR filters announced that it would ban face filters that resemble plastic surgery in order to promote "well-being" and foster a "healthy AR ecosystem for creators and our entire community." Recent reports have linked Snapchat and Instagram filters, as well as face-tuning apps, to rising rates of cosmetic surgery. In addition, the app has barred diet sponcon and related advertisements from reaching users under 18.

Now, Instagram is trying to protect users' mental health through a different strategy. This weekend, according to the BBC, Instagram announced that it will ban all images that depict self-harm or suicide, including drawings, cartoons, videos and memes.

The policy expands a ban passed in February after the death of British teenager Molly Russell, who committed suicide last year at age 14 after viewing graphic content online. At the time, the UK government threatened to ban Instagram if it didn't address the kinds of harmful content that, according to Russell's family, contributed to the teen's suicide.

Russell's family found their daughter's Instagram and Pinterest accounts full of images and drawings glorifying self-harm and suicide. "I have no doubt that Instagram helped kill my daughter," her father Ian told BBC News, which surveyed images tagged "#suicide" and "#selfharm," including live videos of cutting and photos of bridges captioned "jump."

Instagram cracked down on hashtags, suicide-related accounts and searchable terms related to self-harm at the time. The new measure targets fictional images, like drawings, memes and cartoons, which were among the kinds of content Russell's parents discovered on her social media accounts.

"I think Molly entered that dark rabbit hole of depressive suicidal content," he said according to The Telegraph. "Some were as simple as little cartoons — a black and white pencil drawing of a girl that said 'Who would love a suicidal girl?'. Some were much more graphic and shocking."

According to Instagram, the platform has doubled the amount of self-harm and suicide content it removes since the winter. They've apparently removed 834,000 pieces of content, the majority of which were found by the site itself rather than reported by users.

"It will take time to fully implement," Instagram chief Adam Mosseri told BBC News. "But it's not going to be the last step we take."

Photo via Getty