YouTube pulled down over 8 million videos during Oct-Dec 2017

Video-streaming platform YouTube pulled down more than 8 million videos in the last quarter of 2017, it said in a Google Transparency Report.

By: Tech Desk | New Delhi | Updated: April 24, 2018 6:53:16 pm

In its first-ever transparency report, YouTube has revealed that it pulled down over 8 million objectionable videos in the last three months of 2017. According to YouTube, most of these videos were spam or contained adult content, and the report adds that they “represent a fraction of a percent of YouTube’s total views during this time period.” The total count of objectionable videos removed by YouTube in the October-December 2017 period stands at 8,284,039.

In addition, the video-streaming service revealed that nearly 6.7 million of these videos were first flagged by a machine rather than a human. YouTube says that 76 per cent of these machine-flagged videos were removed before any user watched them. Over 1.13 million videos were flagged by individual Trusted Flaggers, YouTube revealed. The Trusted Flagger program allows individual users, government agencies, and NGOs to notify YouTube of Community Guidelines violations on the platform.

Breaking down the reports from Trusted Flagger program members and others, 1,131,962 came from individual Trusted Flaggers, 402,335 from other users, 63,938 from NGOs and 73 from government agencies. YouTube’s report also showed that India recorded the highest number of human flags among all countries.

Other countries with a large volume of human flags include the US, Brazil, Russia and Germany. Among the reasons cited for the removal of videos, YouTube said over 30 per cent of these videos were flagged as ‘sexually explicit’. Other top reasons for removing videos include ‘spam or misleading’ content, reported in 26.2 per cent of all cases, and ‘hateful or abusive’ content, claimed for 15.6 per cent of videos.

Among these videos, only 1.6 per cent, or 490,667 videos, were pulled down because they ‘promoted terrorism’. In the report, YouTube stated that flagged videos which do not violate its terms may either be restricted, that is, made available only to age-appropriate audiences, or left ‘live’ when the guidelines are not violated. Google’s video platform is also rolling out a Reporting History dashboard for each user. This will allow each YouTube user to individually check the status of videos they have flagged to the company for review against its Community Guidelines.