Instagram knows that nothing is more important than the safety of the people who use its platform, especially those who are most vulnerable. Self-harm and suicide are complex, sensitive topics that people care deeply about, and while there are several views on how a platform should approach them, there is no question that they matter.
It is difficult to imagine what people going through such situations, as well as their friends and families, experience. Instagram also recognizes that it is not enough merely to keep these people in mind. The platform owes its popularity to everyone who uses it, including those who may be at risk of self-harm or suicide, and for that reason it wants to do everything it can to keep its users safe.
Two things are true of all online communities, and they conflict with each other. The first is the tragic reality that some young people are negatively influenced by what they see online, which can lead them to hurt themselves.
At the same time, many young people turn to the internet for support with the struggles they face. Some share images of their healed scars and the stories behind them; others talk about how they recovered from eating disorders. Often, these online support networks are the only way people with the same struggles can share their experiences.
The social media giant wants to strike the tricky balance between allowing individuals to share their mental health experiences and protecting others from exposure to harmful content. To do so, it consulted academic experts and health organizations, including the Samaritans in the UK and the National Suicide Prevention Lifeline in the US.
Instagram also emphasized that content helpful to some can be harmful to others: the same image that helps one person cope with mental health struggles might trigger another. This is why Instagram does not allow people to share media that promotes or encourages suicide or self-harm.
The platform has also strengthened its approach to content related to self-harm and suicide. Last February, it prohibited the sharing of graphic images of self-harm and built new technology to find and act on that type of content. Instagram also worked to ensure that such content, and the accounts sharing it, are not recommended.
As a result, the platform has been able to act on twice as much content as before. In the three months after the policy change, it removed, reduced the visibility of, or applied sensitivity screens to more than 834,000 pieces of content, and it found 77 percent of that content before it was reported. Still, Instagram is aware that its work is never done, even with the evident progress the new policy has shown.
In the months since, Instagram has further expanded its policy, aiming to prevent more types of suicide and self-harm content from circulating on the platform. Fictional depictions of suicide and self-harm are now also prohibited, including memes and media from comics, drawings, or films that use graphic imagery.
Media that does not show self-harm and suicide but includes associated materials or methods will also be removed from the platform.
Accounts that share content similar to what is mentioned above will not appear in search results and recommendations such as Explore.
In addition, the platform will surface more localized helpline resources to people. In the UK, these include the Samaritans and PAPYRUS; in the US, The Trevor Project and the National Suicide Prevention Lifeline.
There is no doubt that these complicated issues cannot be solved by a single company or a single set of policies. This is why the platform consults experts on whether allowing certain content is helpful or harmful.
The experts Instagram consulted said that letting people share their stories can be a significant means of support, and that preventing them from doing so could stigmatize mental health and keep others from identifying and responding to a cry for help.
Furthermore, the social media giant is aware that getting the approach right is not a matter of a single policy change or a one-time update. To make sure it is taking the right steps, Instagram meets monthly with academics and experts on self-harm and suicide. It is also working with MIND, a Swedish mental health organization, to understand the role social media and technology play in people's lives.
Instagram is also working with the Samaritans in the UK on an industry-wide effort to shape new guidelines for helping people in distress.
Outside Europe, Instagram also uses additional technology that helps it find people who need help and support. The platform wants to bring this technology to Europe, but there are important considerations under EU law, so it is now working with its European regulators to make that possible.
Ultimately, the platform remains committed to keeping everyone on Instagram safe, protecting people while still allowing them to get the support that can make a difference when they need it most.
Date: September 1, 2020 / Categories: News / Author: Joy P