Facebook Inc. announced that it will ban images related to self-harm in a bid to counter rising suicide rates. The platform has faced criticism for allowing people to publish violent and dangerous content, including images of suicide.
Facebook is tightening its policy and will apply the same moderation to Instagram, ensuring that such content does not appear in the app's Explore section on either mobile or desktop.
The announcement came on World Suicide Prevention Day, during National Suicide Prevention Week. Around the same time, Twitter Inc. announced that content related to self-harm would no longer be treated as abusive, a move intended to reduce the stigma around suicide.
There have been multiple reports of suicide-related content being published on Facebook. Several young people have livestreamed their own suicides, notably a case in India in which a young man posted a video of himself recording his last words before jumping from a balcony to his death.
In another case, a girl posted a poll on Instagram asking her followers whether she should kill herself. The majority voted yes, and she took her own life.
Facebook will deploy a team of moderators to review content, including live videos, for signs of suicide or self-harm.
The World Health Organization reports that close to 800,000 people die by suicide every year, and many more attempt it.
Facebook already surfaces helpline information for people struggling with mental health issues in response to searches involving the term “suicide.”
There are also numerous Facebook groups for people who are suicidal or at risk of self-harm.
Facebook is one of the most popular social networking platforms in the world, and the company has worked to combat serious issues on its services, including cyberbullying and hate speech.