Your relationship with Google, especially as an advertiser, is one-sided. It’s no great secret that they are among the biggest arbitrage players in the ad-tech world: by representing both the buy and sell sides, they enable price-fixing and preferential inventory for those who pay to play, creating an imbalanced ecosystem. Yet we continue to pump money into their coffers.
Sure, I had my reservations about Google’s altruism from my experience in ad-tech – who doesn’t? – but it wasn’t until I read “Something is Wrong on the Internet” by James Bridle that their ethical ambiguity stood out in stark relief.
The article tackles the issue of bots, which are viewing children’s videos by the tens of millions, inflating view counts to increase advertising revenue – shady, right? Well…it gets worse. These bots are also generating titles for videos using popular search terms – Peppa the Pig, for example, has around 10.5 million related videos on YouTube Kids, only a fraction of which come from the verified Peppa the Pig account. The titles of these videos often read as a kaleidoscopic word salad, ranging from innocuous but strange (“Anna Shower Noodle Bath Spiderman w/ Good Joker”) to flat-out disturbing (“Drowned Head Spiderman and Elsa Under Pool Sponges”).
Some of these videos are unlicensed knockoffs – regurgitated animation repackaged for ad revenue (children aren’t discerning, so creators can get away with it) – but others are made by internet trolls and designed to frighten children. The New York Times reported on how seemingly innocuous knockoff videos, like one a child had viewed based on PAW Patrol, could have shockingly adult themes hidden within them, like death and intense scenes of mourning.
Worse yet, some of these videos are being created by algorithmic bots, and the results can be downright terrifying. This linked video is one example of how disturbing bot-created content can be, especially to children – the crude animation, confusing and often violent themes, and nightmarish sound effects could traumatize young viewers. And let’s not forget how YouTube is constructed – the combination of artificially inflated view counts and popular search terms means these videos filter higher and higher up ‘watch next’ lists, becoming increasingly available to button-mashing toddlers.
So what do we do, both as parents and as ad buyers in the space? My first suggestion is not to trust Google on name recognition alone – at the end of the day, they’re here to make money, sometimes at the expense of the web-browsing experience and user privacy, and YouTube is no different. Next, be wary of what they pass through: create a tight whitelist of sites you trust and want to run on, and a playlist of videos you’re comfortable with your children watching. And third, be wary of bots – brand protection through the myriad pre-bid solutions on the market is better than going it alone.