The recent outcry over Google’s AI Overviews and their alleged “appropriation” of journalistic content has produced a storm of headlines, policy discussions, and even legal threats. The Guardian reports that AI summaries cause a “devastating” drop in traffic to news sites. Studies from Pew Research Center and SEO firm Authoritas bolster the case: when Google serves users a synthetic summary instead of the familiar ten blue links, people stop clicking through. They read the AI’s synthesis and move on. Traffic plummets. Newsrooms panic.
But before we follow media executives into the courtroom or the graveyard, it’s worth asking a more fundamental question: what, exactly, are AI systems accused of stealing? Are they reproducing journalism, or replacing something more superficial? And if machines are capable of the kind of mental labor that journalism once demanded exclusively of humans, does the complaint hold water at all? Or are we watching not a theft, but a transfer of function—a shift in how knowledge is gathered, synthesized, and distributed in the 21st century?
To ask these questions is to look beneath the surface of the traffic metrics and into the philosophy of information labor itself.
The Studies: What They Actually Show
The studies making the rounds are not inconsequential. Authoritas, a UK-based SEO analytics company, modeled how Google’s AI Overviews impact visibility and estimated that when an AI box appears at the top of a search page, the previously top-ranked organic link may lose up to 79% of its traffic for that query. The loss is driven primarily by position: the AI box swallows the prime real estate, and everything below is pushed “hundreds of pixels” down—out of sight, especially on mobile.
Pew Research, meanwhile, offers something rarer and more empirical. By tracking the real-world search behavior of 900 U.S. adults over the course of a month, Pew was able to quantify just how dramatically behavior shifts when AI Overviews appear. The numbers are stark. Clickthrough rates on traditional search results drop from 15% to 8% when AI is present. The rate of clicking a link within the AI summary itself? Just 1%. Worse, in 26% of such searches, users simply abandon the session after reading the summary—suggesting not just disinterest in the original source, but satisfaction with the AI response.
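To put those percentages side by side, here is a minimal back-of-the-envelope calculation using the Pew figures above; the data is theirs, the arithmetic is mine:

```python
# Back-of-the-envelope arithmetic on the Pew figures cited above.
ctr_without_ai = 0.15  # clickthrough on traditional results, no AI Overview
ctr_with_ai = 0.08     # clickthrough on traditional results, AI Overview shown
ctr_ai_links = 0.01    # clicks on source links inside the AI summary itself

# Relative decline in traditional clickthroughs when the AI box appears.
relative_drop = (ctr_without_ai - ctr_with_ai) / ctr_without_ai
print(f"Traditional clicks fall by {relative_drop:.0%}")           # 47%

# Even crediting the rare clicks inside the summary, total clickthrough
# recovers only one of the seven percentage points that were lost.
total_with_ai = ctr_with_ai + ctr_ai_links
print(f"Total clickthrough with AI present: {total_with_ai:.0%}")  # 9%
```

Read that way, publishers keep roughly six of every ten clicks they would otherwise have received on affected queries, and that is before counting the 26% of sessions abandoned outright.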
For news publishers, this is the stuff of nightmares: their work ingested, distilled, and displayed by a platform that reaps the ad revenue while cutting off their oxygen supply.
But all of this rests on an implicit assumption: that what is being extracted is journalism.
What Is Journalism, Really?
Is journalism the output—the finished product, the string of sentences with a byline attached? Or is it a process—a human process of curiosity, verification, synthesis, and context-building? If it’s the former, then yes: a 70-word AI summary based on the first few paragraphs of a news article can be said to steal something essential. But if it’s the latter, we have to ask whether most of what gets ingested by large language models actually qualifies.
Much of what populates today’s digital news ecosystem is not original reporting. It is aggregation, regurgitation, SEO-targeted listicles, headlines designed to provoke rather than inform. When an AI system ingests five articles that themselves summarize the same press release, has it stolen anything? Or has it merely completed the loop?
The uncomfortable truth is that a significant portion of what is labeled journalism today is already machine-replicable. Summarization, rewriting, even low-level reporting tasks can be performed by contemporary AI systems with uncanny fluency. So the notion of “stealing” begins to look shaky. What is being taken is not necessarily something irreplaceable or deeply human. It’s output that already skirts the edge of automation.
The Programming Analogy
Consider, by way of comparison, the world of software development. Ten years ago, writing code was the sacred core of the profession. Today, tools like GitHub Copilot and Grok-4 can generate functional code from natural language prompts. Entire apps are scaffolded in minutes. Does this mean programming is being stolen? Or that it is being redefined?
The answer lies upstream. Programming, like journalism, is more than just its output. Good software depends on system analysis, requirements elicitation, stakeholder interviews, contextual awareness, and a deep understanding of the problem space. A script that compiles is not a solution. The real value lies in defining the right thing to build.
Similarly, journalism’s beating heart is not in the recap or the summary, but in the initial act of discovery: the phone call to a reluctant source, the FOIA request, the analysis of a leaked dataset, the skeptical line of inquiry that challenges the dominant narrative. These are things AI cannot yet do on its own. But once it can—once artificial systems can not only summarize but investigate, verify, and contextualize on their own initiative—then we must admit the game has changed.
And perhaps it already has. AI systems today can ingest terabytes of documents, cross-reference inconsistencies, identify statistical anomalies, and highlight buried patterns faster than any human reporter. If given access to raw sources—contracts, filings, satellite imagery, sensor logs—they can construct narratives that resemble what a human journalist might produce after weeks of labor.
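To make “cross-reference inconsistencies” less abstract, here is a minimal sketch of the kind of anomaly flagging involved; the companies, figures, and threshold are all invented for illustration:

```python
# Toy illustration of machine cross-referencing: flag reported figures
# that diverge sharply from the same entity's own record.
# All names and numbers below are invented.
from statistics import median

filings = {
    "AcmeCo": [12.1, 11.8, 12.4, 31.0, 12.0],  # e.g., quarterly reported emissions
    "BetaCorp": [8.2, 8.0, 8.3, 8.1, 8.4],
}

def flag_outliers(values, rel_threshold=0.5):
    """Return indices of values deviating from the median by > rel_threshold."""
    m = median(values)
    return [i for i, v in enumerate(values) if abs(v - m) / m > rel_threshold]

for company, reported in filings.items():
    hits = flag_outliers(reported)
    if hits:  # AcmeCo's fourth quarter stands out; BetaCorp's record is consistent
        print(f"{company}: quarter(s) {hits} diverge from the company's own record")
```

A human reporter still has to ask why the fourth-quarter figure jumped; the machine’s contribution is finding it in the haystack.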
What they lack, for now, is initiative. No algorithm wakes up with a hunch. No model pounds the pavement or calls a whistleblower. But those gaps are closing.
Google as Mirror, Not Thief
So what is Google actually doing with its AI Overviews? It is synthesizing public information into readable answers. It is providing convenience. It is, in effect, giving people what they came for. If the summary suffices, was the long-form article ever necessary? If not, perhaps the complaint lies not with Google, but with the shallowness of the underlying content.
The studies don’t lie: traffic is down. Clicks are down. But those are symptoms, not causes. The root issue is that for many queries, the user’s need can be satisfied with 100 well-formed words. If those words are assembled by AI from freely available sources, then the value proposition of traditional journalism comes under pressure.
That pressure is not unfair. It is evolutionary.
Just as programmers had to move up the stack—from syntax to systems thinking—journalists must move up as well. They must become investigators, analysts, ethicists, explainers. The machine can handle the summary. It cannot yet handle meaning.
And that’s the inflection point. If your journalism is easily summarized, it will be. If it resists summarization—because it is deep, original, surprising, or complex—then it retains its value. Not everything worth knowing fits in a box at the top of a search result.
The Future of Journalism is Post-Journalistic
The real question, then, is not whether Google is unfair, but whether journalism is evolving. The notion of being “the first draft of history” still matters, but the tools have changed. We already use AI to transcribe interviews, clean audio, identify faces in crowds, scrape datasets. The barrier to full automation is no longer technical; it is conceptual.
What happens when an AI system is trained not just to summarize news, but to create it? When it scans court filings and identifies a pattern of abuse? When it cross-references environmental data with corporate reports and spots fraud? When it sends queries to databases, calls APIs, and composes an exposé? At that point, it is not assisting journalism. It is journalism.
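As a sketch of that scenario only, with every record, pattern rule, and “lead” invented for illustration, the shape of such a pipeline might look like this:

```python
# A deliberately schematic pipeline: gather records, detect a recurring
# pattern, draft a lead. Every source and rule here is hypothetical.
from collections import Counter

def gather_records():
    # Stand-in for querying court dockets or a regulatory API.
    return [
        {"defendant": "LandlordX", "complaint": "unsafe housing", "year": y}
        for y in (2019, 2020, 2021, 2022, 2023)
    ] + [{"defendant": "OtherCo", "complaint": "unsafe housing", "year": 2021}]

def find_patterns(records, min_repeats=3):
    # A "pattern", crudely defined: the same defendant facing the same
    # complaint again and again.
    counts = Counter((r["defendant"], r["complaint"]) for r in records)
    return [(pair, n) for pair, n in counts.items() if n >= min_repeats]

for (defendant, complaint), n in find_patterns(gather_records()):
    print(f"Lead: {defendant} named in {n} '{complaint}' cases since 2019.")
```

Everything hard still sits outside the script: deciding which dockets matter, verifying the records, knocking on the defendant’s door.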
In such a world, the distinction between content creator and content consumer blurs. The value of the journalist is not in typing fast or interviewing well, but in setting the parameters of inquiry: what matters, what deserves scrutiny, what frame we bring to the data. AI can get us facts. Humans must still determine what those facts mean.
Until they can’t.
Conclusion: Not Theft, but Transition
So no, Google is not stealing journalism. It is reflecting journalism’s weaknesses. The AI summaries at the top of your search page are not acts of appropriation. They are litmus tests. If your work can be replaced by a paragraph, perhaps it should be.
But if it cannot—if it defies reduction, if it invites reflection, if it changes what we think is true—then it will still matter. And people will still find it. Because in the end, even the most powerful AI cannot summarize what it does not understand. Not yet.
The challenge for journalism is not to beat the machine, but to transcend the level at which the machine operates. The same was true for programmers. And musicians. And translators. And artists.
The question is not “How do we stop AI?” but “What can we become that it cannot yet be?”
That is not a retreat. It is a renaissance.
Let the summaries go.
Write the stories that cannot be summarized.