Social media firm Facebook on Wednesday said it has updated its child safety policies and is testing new tools to keep users from sharing content that victimises children. The new policies are applicable to the India market as well and will be implemented in a phased manner.
The platform will remove profiles, pages, groups and Instagram accounts that are dedicated to sharing otherwise innocent images of children with captions, hashtags or comments containing inappropriate signs of affection or commentary about the children depicted in images, said Antigone Davis, the global head of safety for the company.
The company said content that isn’t explicit and doesn’t depict child nudity is harder to define. "Under this new policy, while the images alone may not break our rules, the accompanying text can help us better determine whether the content is sexualizing children and if the associated profile, page, group or account should be removed," the company stated.
The social media company has also started using Google's Content Safety API to better prioritize content that may contain child exploitation for its content reviewers to assess and take down.
The company has also started testing two new tools: one aimed at potentially malicious searches for child exploitative content, and the other at the non-malicious sharing of such content.
The first is a pop-up shown to people who search on its apps for terms associated with child exploitation. The pop-up offers ways to get help from offender diversion organizations and shares information about the consequences of viewing illegal content.
The second is a safety alert that informs people who have shared viral meme content that exploits children about the harm it can cause, and warns that it is against the company's policies and that there are legal consequences for sharing this material.
"Accounts that promote this content will be removed. We are using insights from this safety alert to help us identify behavioural signals of those who might be at risk of sharing this material, so we can also educate them on why it is harmful and encourage them not to share it on any surface — public or private," David added.
The social media giant said it also relies on users to report such content, and has made it easier to flag content that violates its child exploitation policies. The firm has added the option to choose 'involves a child' under the 'Nudity & Sexual Activity' reporting category in more places on Facebook and Instagram. These reports will be prioritized for review.
Noting that safety is core to Facebook's mission, Karuna Nain, director of global safety policy at Facebook, said the company has trebled its safety workforce over the past few years.
"We have around 35,000 people who focus on safety and security across the company. Around 15, 000 of these people focus on our community operations or our review teams. Typically that team reviews about two million pieces of content in a day. We are invested in safety for the long haul," she added.