Social media is a double-edged sword; a tired, worn-out sentiment, but one that continues to ring true nonetheless. More than a decade into the social media era, we are still only beginning to understand these platforms' potential. The ability to connect with so many other individuals has given us the power to discover new communities, amplify marginalized voices, and organize whole new movements, but it has also created feedback loops that amplify fringe views, foster a new mob mentality, and even influence elections. Social media is a tool whose power ultimately lies in how it is used, but as the generations raised on these platforms come of age, it becomes imperative that we understand its many perils.
Following the recent suicide of 14-year-old Molly Russell, the United Kingdom is taking a hard look at its relationship with Instagram. According to a recent BBC documentary, Russell's parents say that their daughter showed no obvious signs of mental health issues leading up to her death, but they later discovered through her social media history that she had been viewing pages depicting self-harm.
The Russells have now sought to hold tech companies responsible for their daughter's death, claiming that the community Instagram fostered contributed to a "fatalistic" view of depression rather than a supportive one. Pointing to how "algorithms can push negative content" and possibly contribute to an increased risk of self-harm or suicide, the case is a grim illustration of the destructive effect social media's echo chamber can have.
Now the UK's health secretary, Matt Hancock, is exploring ways to hold companies like Instagram accountable for harmful content. "Lots of parents feel powerless in the face of social media. But we are not powerless. Both government and social media providers have a duty to act," Hancock told the BBC. The health secretary has since sent a letter to Twitter, Snapchat, Pinterest, Apple, Google, and Facebook calling for urgent changes, adding that "Pinterest has a huge amount to answer for."
Instagram has responded to calls for a purge of harmful content by explaining that it "doesn't remove certain content" because it can be helpful for those dealing with similar mental health issues to share their stories, which can aid the recovery process. Instagram has already worked with the National Suicide Prevention Lifeline and the National Eating Disorder Association to create pop-ups for potentially triggering hashtags, as well as an anonymous reporting system that provides at-risk users with resources such as local support hotlines.
And while many studies have shown a connection between mental health and social media, a blanket ban on harmful content is no substitute for a robust state-supported mental health infrastructure that includes staffed hotlines and adequate hospital beds for those in crisis. As we have seen before, purges are rarely well received or well executed, and there is no guarantee that one in the UK would be effective. Hancock has stated that "it's not where I'd like to end up" but that "ultimately parliament does have that sanction," leaving all cards on the table.
If you are having thoughts of suicide, call the National Suicide Prevention Lifeline at 1-800-273-8255 (TALK). You can find a list of additional resources at SpeakingOfSuicide.com/resources.