YouTube Moves to Make Conspiracy Videos Harder to Find

Whether it is a video claiming the earth is flat or that the moon landing was faked, conspiracy theories are not hard to find on Google’s YouTube. But in a significant policy change, YouTube said on Friday that it planned to stop recommending them.

After years of criticism that YouTube leads viewers to videos that spread misinformation, the company said it was changing what videos it recommended to users. In a blog post, YouTube said it would no longer suggest videos with “borderline content” or those that “misinform users in a harmful way” even if the footage did not violate its community guidelines.
https://www.nytimes.com/2019/01/25/technology/youtube-conspiracy-theory-videos.html

Also see:

YouTube is changing its algorithms to stop recommending conspiracies
YouTube said Friday that it is retooling the recommendation algorithm that suggests new videos to users, to prevent it from promoting conspiracies and false information, reflecting a growing willingness to quell misinformation on the world’s largest video platform after several public missteps.

In a blog post that YouTube plans to publish Friday, the company said that it was taking a “closer look” at how it can reduce the spread of content that “comes close to — but doesn’t quite cross the line” of violating its rules. YouTube has been criticized for directing users to conspiracies and false content when they begin watching legitimate news.
https://www.washingtonpost.com/technology/2019/01/25/youtube-is-changing-its-algorithms-stop-recommending-conspiracies/
