In its maiden monthly compliance report mandated by the new IT rules, Facebook said it took down over 30 million pieces of content across 10 violation categories in India between May 15 and June 15. Facebook-owned Instagram took down about two million pieces across nine categories during the same period.
The report also provided details of the content actioned on Instagram. Facebook describes 'content actioned' as content such as posts, photos, videos, or comments on which it takes action for going against its standards. Facebook had earlier said this would be an interim report.
The new IT rules require large digital platforms (those with over 5 million users) to publish a compliance report every month, detailing the complaints received and the action taken on them.
Facebook will publish its next report on July 15, containing details of user complaints received and action taken.
Facebook removed 25 million pieces of content identified as 'spam'.
It took down 1.8 million pieces of content under 'adult nudity and sexual activity', 99.6% of which was actioned proactively.
It also actioned 589,000 pieces of content related to suicide and self-injury.
On Instagram, the highest number of pieces actioned (699,000) fell under the suicide and self-injury category, with a proactive rate of 99.8%.
This was followed by 668,000 pieces of 'violent and graphic content'.
'Adult nudity and sexual activity' led to action on 490,000 pieces of content.
Google was the first to publish a transparency report in accordance with the new IT rules, on June 30.
That report covered complaints received and actioned between April 1 and 30.
Google said there would be a two-month lag for reporting to allow sufficient time for data processing and validation.
The total number of complaints received by Google in the reported period was 27,762, of which 96.2% were related to copyright.
The number of removal actions taken by Google based on these complaints was 59,350.