Given that Google attracts more than 136bn visits a month — as much as the next 12 most popular websites combined — it is easy to see how a US federal judge concluded last year that the company ran a monopoly in online search and had abused its market dominance.
Last week, Judge Amit Mehta heard the final arguments about what remedies to impose. His conclusions are expected in August. The US justice department, which launched the antitrust lawsuit, is demanding that Google be broken up. It wants to force the company to sell its Chrome browser, ban the massive payments it makes to Apple, Samsung and Mozilla to be their default search engine, and share data with competitors. That would surely be a triumph for free-market competition, motherhood and apple pie.
Yet, as ever with regulatory interventions, the skill lies in ensuring that any corrective action anticipates tomorrow’s challenges rather than just trying to fix today’s problems.
That’s especially tough right now. As the judge acknowledged, the digital economy is evolving fast thanks to the light-speed diffusion of artificial intelligence. Some suggest that AI might even undermine the financial viability of the open, human, ad-supported web on which Google has gorged. However much it may (or may not) abuse its market dominance, Google remains the strongest champion of the existing web and many users would miss it if it were gone.
Since being invented 36 years ago, the web has emerged as one of humanity’s most precious resources, giving any online user anywhere access to almost all of the world’s knowledge. It has also become the foundation on which many thousands of digital businesses have been built. But it has its darker deformities and there is a risk that AI might further degrade it in two insidious ways.
First, the big AI companies are in effect strip-mining websites for content to train their models while paying almost nothing in return to sustain the ecosystem. At present, Google accounts for about 90 per cent of the global search market and directs a torrent of traffic (and advertising) to content sites. Even though they may (understandably) complain about the terms, some content creators fear the trade itself is now in jeopardy: far fewer users may visit their sites at all.
AI models aggregate and summarise the web’s content rather than encouraging users to visit the original content source. That may destroy the financial incentives for creating new content. “I think that is a risk,” Laura Chambers, chief executive of the technology company Mozilla, tells me. “How can we ensure that the internet remains healthy?”
The obvious survival mechanism for creators is to build walled gardens by erecting more paywalls around their content, or to move to closed channels off the open web. For many companies, it may make sense to deepen direct-to-consumer business models via social media or their own apps. But both trends would further devalue the richness, utility and universality of the web.
The second threat is that AI models are increasingly flooding the web with machine-generated slop. For the first time in a decade, bots have overtaken humans on the web, accounting for an estimated 51 per cent of total traffic, according to the US data and cyber security group Imperva. The ability of generative AI to create plausible content at minimal cost means this trend is only likely to accelerate.
AI mediation could push the web further towards a “dark forest”: an increasingly hostile space populated by predatory bots that seize on any living thing. We may not like our current surveillance capitalism, in which users are tracked and targeted with ads. Subordination capitalism would be still worse.
Yet there are more positive visions of the web’s future, as recently sketched out by Kevin Scott, chief technology officer of Microsoft, which runs Bing, a very distant second in search.
As bots increasingly interact with one another, new open protocols are being developed to enable interoperability on this agentic web. That creates the possibility of a different web architecture and a “new deal”, in which “everybody’s incentives are aligned, where the creators and consumers have their interests balanced and there aren’t a bunch of weird intermediaries constraining how utility and value gets exchanged”, Scott told The Verge.
No one yet knows exactly how to build such a glorious future. But whatever Judge Mehta can do to help nudge us in that direction would be appreciated.