San Francisco: In a bid to reduce minors' exposure to self-harming, provocative and disturbing content on its platform, Instagram has launched a "sensitivity screen" feature that blurs questionable pictures and video thumbnails on the app until the viewer opts in. The new feature, which has already reached users in India, blocks images of cutting and self-harm that could pop up in search, recommendations or hashtags and put minors at risk of physical harm, Vogue.co.uk reported on Wednesday.
Adam Mosseri, Head of Instagram, announced the rollout of "sensitivity screens" in an op-ed for The Telegraph, expressing grief over the suicide of British teenager Molly Russell, whose parents blamed the photo-sharing app for exposing their daughter to self-harm and suicide-related content.
"We are not yet where we need to be on issues of suicide and self-harm. We need to do everything we can to keep the most vulnerable people who use our platform safe," Mosseri wrote. The announcement came after UK Health Secretary Matt Hancock warned Instagram owner Facebook to improve protections for young people on its apps or face legal action.