The panic over an internet after cookies is growing, observes Faris Yakob, but their use was predicated on a strange and deeply flawed idea of how advertising works in the first place – a rethink is long overdue.

Google recently made a remarkable announcement about the future of digital advertising. In 2020 they announced that they would remove support for third-party cookies from Chrome, the world’s biggest browser, with two-thirds of the market. In March 2021, they elaborated that they would not build or allow any alternate individual identifiers. They went on to say that “people shouldn’t have to accept being tracked across the web in order to get the benefits of relevant advertising. And advertisers don’t need to track individual consumers across the web to get the performance benefits of digital advertising.” 

That is quite a statement coming from the world’s leading tracking-based advertising company. It sent shockwaves through the media-industrial complex. The Trade Desk, a software platform for buying digital advertising that uses tracking data, saw its stock price drop more than 20% following the announcement. It is still valued by the stock market as worth more than WPP and Omnicom combined, though, and on a fraction of the revenue, because investors like scalable businesses (i.e. software-based) with lots of room to grow, and because the stock market has not been trading on fundamentals, let’s say, for some time. 

It’s not just Google, although the media pundits have done a dramatic double take, watching them reposition by hiring the Chief Product Officer of privacy-focused browser Brave and having him write the blog post announcing their new commitment to privacy and abhorrence of the unnecessary surveillance that they made their fortune on. Apple has long championed user privacy as a point of principle (and profit), because they prefer to maintain complete control of the data on their platform and have no interest in the advertising business. It’s also a useful club for them to wield in their escalating battle with other technology companies whose revenue is almost entirely based on targeted advertising. With iOS 14, Apple will turn off its Identifier for Advertisers (IDFA) by default and introduce App Tracking Transparency (ATT). That means every app will ask you to opt in to being tracked, and the assumption is that the majority of people will choose not to. Defaults matter. 

In his keynote speech at the Computers, Privacy and Data Protection Conference (CPDP) (technologists love acronyms) Tim Cook echoed Google’s announcement, saying “technology does not need vast troves of personal data stitched together across dozens of websites and apps in order to succeed. Advertising existed and thrived for decades without it.”

I remember being surprised by the reaction to the Netflix documentary The Social Dilemma. I assumed everyone already understood all this, but that’s the Fachidiot problem, as the Germans call it. 

A Fachidiot is a ‘subject matter idiot’, blinded by his knowledge, unable to understand the world writ large. People didn’t understand the implications of scandals like Cambridge Analytica, or how adtech works in general, and the documentary helped clarify some elements, whilst eliding others. Regardless, the tide seems to have turned away from what is sometimes called ‘surveillance capitalism’. Cook went further, laying the blame for our increasingly polarized societies, the decline of trust in institutions and the media, and even the recent attempted coup in the USA at the feet of personalized tracking, adtech, and targeted advertising. “A social dilemma cannot be allowed to become a social catastrophe,” he said. That’s a lot for the industry to consider. 

Whatever you believe about the broader negative externalities of adtech, third-party cross-site tracking is going away, and so will much of Apple’s app tracking and ad targeting. Arguably this isn’t a very significant loss, considering how inaccurate that third-party data is. A 2019 analysis by MIT researchers of over 90 third-party audiences across 19 data brokers revealed that the cookies identified gender correctly only 46% of the time on average. Since that is less accurate than guessing randomly, it seems hard to justify the additional budget. 

In a post-cookie world, advertisers will have to rely on first-party data from visitors to their own sites and CRM platforms, or target based on the context of the content their ads appear around. This is somewhat satisfying, since I personally dislike being stalked across the web and have railed against so-called personalized advertising for many years. I have tried to suggest that media planning move away from the high-frequency algorithmic arbitrage it seemed to settle on as its most profitable business model over the last decade. Advertising works at a socio-cultural level, which makes personalized advertising more akin to credit card offers in the mail (and no one wants those).

It was twenty years ago that the film Minority Report bedazzled the advertising industry into personalization. As Tom Cruise fled the pre-crime police force he once worked for, a series of holographic posters identified him via his retina, suggesting he drink a Guinness to calm down, or grab a Lexus to facilitate his getaway. It’s a disturbingly dystopian scene, one which the advertising industry nonetheless seized on as a robust vision of its future in the early digital age. That future, it seems, is now past.