The vast majority of videos removed from YouTube toward the end of last year for violating the site’s content guidelines had first been detected by machines instead of humans, the Google-owned company said on Monday.
YouTube said it took down 8.28 million videos during the fourth quarter of 2017, and about 80 percent of those videos had initially been flagged by artificially intelligent computer systems.
The removals came as YouTube tries to address criticism of violent and offensive content on its platform. The company published its first quarterly moderation report amid growing complaints about its perceived inability to tackle extremist and abusive material.
The figure does not include videos removed for copyright or legal reasons.
Sexually explicit videos attracted 9.1 million reports from the site's users, while 4.7 million videos were flagged for hateful or abusive content.
Most complaints came from users in India, the US and Brazil.