Technology
Instagram expands self-harm ban to memes, cartoons
San Francisco, Oct 28
In a bid to curb more types of self-harm and suicide content, Facebook-owned Instagram has extended its ban on graphic self-harm imagery to include memes and cartoons.
The move is Instagram's response to the public outcry over the death of British teenager Molly Russell, who killed herself in 2017 after viewing graphic content on the photo-sharing platform.
"We have expanded our policies to prohibit more types of self-harm and suicide content. We will no longer allow fictional depictions of self-harm or suicide on Instagram, such as drawings or memes or content from films or comics that use graphic imagery.
"We will also remove other imagery that may not show self-harm or suicide, but does include associated materials or methods," Adam Mosseri, Head of Instagram, wrote in a blog post on Sunday.
According to Instagram, nothing is more important to it than the safety of the people who use the platform, particularly the most vulnerable.
"Accounts sharing this type of content will also not be recommended in search or in our discovery surfaces, like 'Explore'. And we'll send more people more resources with localised helplines, like the Samaritans and PAPYRUS in the UK or the National Suicide Prevention Lifeline and The Trevor Project in the US," Mosseri said.
After Russell's death, her family discovered she had been "suggested" disturbing posts on Instagram and Pinterest about anxiety, depression, self-harm and suicide, according to reports.