Google Training Ad Placement Computers to Be Offended

Over the years, Google trained computer systems to keep copyrighted content and pornography off its YouTube service. But after seeing ads from Coca-Cola, Procter & Gamble and Wal-Mart appear next to racist, anti-Semitic or terrorist videos, its engineers realized their computer models had a blind spot: They did not understand context.

Now teaching computers to understand what humans can readily grasp may be the key to calming fears among big-spending advertisers that their ads have been appearing alongside videos from extremist groups and other offensive messages.

Also see:

Google exec says advertising problem is 'very, very, very small'
Google’s chief business officer, Philipp Schindler, has claimed that the company’s problem with adverts running against extremist material on YouTube affects “very, very, very small numbers,” but said the company has implemented a wide range of features to try to solve it anyway.
