Technology
Instagram expands self-harm ban to memes, cartoons
San Francisco, Oct 28
To curb more types of self-harm and suicide content, Facebook-owned Instagram has extended its ban on graphic self-harm imagery to include memes and cartoons.
The move is Instagram's response to the public outcry over the death of British teenager Molly Russell, who killed herself in 2017 after viewing graphic content on the photo-sharing platform.
"We have expanded our policies to prohibit more types of self-harm and suicide content. We will no longer allow fictional depictions of self-harm or suicide on Instagram, such as drawings or memes or content from films or comics that use graphic imagery.
"We will also remove other imagery that may not show self-harm or suicide, but does include associated materials or methods," Adam Mosseri, Head of Instagram, wrote in a blog post on Sunday.
According to Instagram, nothing is more important to it than the safety of people who use the platform, particularly the most vulnerable.
"Accounts sharing this type of content will also not be recommended in search or in our discovery surfaces, like 'Explore'. And we'll send more people more resources with localised helplines, like the Samaritans and PAPYRUS in the UK or the National Suicide Prevention Lifeline and The Trevor Project in the US," Mosseri said.
After Russell's death, her family discovered that Instagram and Pinterest had "suggested" disturbing posts to her about anxiety, depression, self-harm and suicide, according to reports.
