Facebook proactively "actioned" about 31.7 million content pieces across 10 violation categories in the country during August, the social media giant said in its compliance report on Friday.
Instagram, Facebook's photo-sharing platform, proactively took action against about 2.2 million pieces across nine categories during the same period.
In the previous reporting period, June 16 to July 31, Facebook had proactively "actioned" over 33.3 million content pieces across 10 violation categories in the country, while Instagram had taken action against about 2.8 million pieces across nine categories.
On Friday, Facebook said it had received 904 user reports for its platform through the Indian grievance mechanism between August 1 and 31.
"Of these incoming reports, Facebook provided tools for users to resolve their issues in 754 cases. These include pre-established channels to report content for specific violations, self-remediation flows where they can download their data, avenues to address account hacked issues etc," it added.
Instagram received 106 reports through the Indian grievance mechanism over the same period.
"Over the years, we have consistently invested in technology, people and processes to further our agenda of keeping our users safe and secure online and enable them to express themselves freely on our platform.
"We use a combination of Artificial Intelligence, reports from our community and review by our teams to identify and review content against our policies," a Facebook spokesperson said.
In accordance with the IT Rules, the company has published its third monthly compliance report, covering the 31-day period from August 1 to August 31, the spokesperson added.
"This report will contain details of the content that we have removed proactively using our automated tools and details of user complaints received and action taken," the spokesperson said.
Of the roughly 31.7 million content pieces actioned on Facebook during August 2021, spam accounted for 25.9 million, violent and graphic content for 2.6 million, adult nudity and sexual activity for 2 million, and hate speech for 242,000.
Other categories under which content was actioned include bullying and harassment (90,400), suicide and self-injury (677,300), dangerous organisations and individuals: terrorist propaganda (274,200), and dangerous organisations and individuals: organised hate (31,600).
"Actioned" content refers to the number of pieces of content (such as posts, photos, videos or comments) where action has been taken for violation of standards. Taking action could include removing a piece of content from Facebook or Instagram or covering photos or videos that may be disturbing to some audiences with a warning.
The proactive rate, which indicates the percentage of all content or accounts acted on that Facebook found and flagged using technology before users reported them, ranged between 80.6 and 100 per cent in most of these categories.
The proactive rate for removal of content related to bullying and harassment was 50.9 per cent as this content is contextual and highly personal by nature. In many instances, people need to report this behaviour to Facebook before it can identify or remove such content.
Under the new IT rules, large digital platforms (with over 5 million users) will have to publish periodic compliance reports every month, mentioning the details of complaints received and action taken thereon. The report is to also include the number of specific communication links or parts of information that the intermediary has removed or disabled access to in pursuance of any proactive monitoring conducted by using automated tools.
For Instagram, about 2.2 million pieces of content were actioned across nine categories during August 2021. This includes content related to suicide and self-injury (577,000), violent and graphic content (885,700), adult nudity and sexual activity (462,400), and bullying and harassment (270,300).
Other categories under which content was actioned include hate speech (37,200), dangerous organisations and individuals: terrorist propaganda (6,300), and dangerous organisations and individuals: organised hate (2,300).
(Only the headline and picture of this report may have been reworked by the Business Standard staff; the rest of the content is auto-generated from a syndicated feed.)