YouTube, a unit of Alphabet Inc's Google, said on Thursday it took down more than 58 million videos and 224 million comments during the third quarter for violating its policies, in an effort to demonstrate progress in suppressing problem content.
Government officials and interest groups in the United States, Europe and Asia have been pressuring YouTube, Facebook Inc and other social media services to quickly identify and remove extremist and hateful content that critics say incites violence.
Hateful and violent videos are a sliver of the content YouTube removes
YouTube removed 7.8 million videos and 1.6 million channels in the third quarter of this year, mostly for spreading spam or posting inappropriate adult content, the company said in a report Thursday.
The Community Guidelines Enforcement Report comes amid growing questions — including in a congressional hearing Tuesday — about how YouTube monitors and deletes problematic content from the platform, including videos depicting violent extremism and hateful, graphic content. Such videos remain a small percentage of the overall number that YouTube deletes, but their prevalence has been the subject of news reports and congressional scrutiny.
YouTube removed 58 million videos in latest quarter
YouTube removed 58 million videos between July and September this year. More than 7.8 million of those were taken down for directly violating community guidelines; the other 50.2 million were removed when YouTube took down the 1.67 million channels that hosted them.
The online video platform said 72 percent of the videos removed for violating guidelines in the latest quarter were “spam or misleading,” 10.2 percent were removed out of concern for “child safety” and 9.9 percent were removed for including “nudity or sexual content,” according to its latest report.