From the Financial Times to the Front Lines: David Buttle on AI-Resilient Publishing
Addressing AI Licensing’s Two-Tier System
David Buttle spent over a decade at the Financial Times. Now he runs DJB Strategies, a consultancy helping publishers navigate what he sees as an existential shift in how content reaches audiences. His focus: AI platforms, search evolution, and the regulatory changes reshaping journalism economics.
The FT’s success came down to something simple. “Remaining laser focused on serving a defined need for a defined audience group,” Buttle says. That clarity, knowing exactly who you’re serving, gave the publication an advantage no amount of technology could replicate. “The FT puts the reader at the centre of its thinking. The editorial is laser-focused on how you super serve that reader, and the business part sits around that.”
The FT recognised early that consumer subscription revenue was plateauing. “You’re competing for share of how many subscriptions any person will have - even a news junkie is probably only going to have two or three.” The answer was B2B growth, where the FT’s usage-based pricing - charging only for staff who actually consume content above certain thresholds - makes commercial sense.
The Distribution Problem Nobody’s Solving
When Buttle left the FT after running public affairs and platform strategy, he noticed something. “There are a lot of businesses focusing on deploying AI technologies in operational domains or editorial settings. But not many people are thinking about how, as a content business, you need to adapt to continue succeeding in a new distribution paradigm.”
His quarterly Platform Strategy & Risk Monitor tracks hundreds of developments across AI, search, and regulation. Publishers without dedicated public affairs functions struggle to separate signal from noise. Buttle’s job is to provide “actionable advice on how they ought to be thinking about responding.”
The immediate problem is AI licensing. The deals being struck - opaque, often economically net negative when you account for off-platform summarisation costs - create a two-tier system. Large publishers get modest revenue and prominent placement. Everyone else gets scraped without compensation.
“This is troubling, especially when deals drive both revenue and prominence,” Buttle says. “In search and social, there was an algorithm you could optimise towards. These deals aren’t like that - it’s a group of people in a private organisation making decisions about who they want to deal with.”
Worse, most deals don’t include usage data. “Unless that’s rectified, there’s a risk that rights holders won’t be able to attach pricing and value to their content properly.”
Buttle’s particularly concerned about AI browsers, tools that hit publisher websites to retrieve information users never see. “It cuts to the heart of the definition of what is a human visitor versus a non-human visitor.” If an AI browser summarises content but never scrolls down the page, is that a human visit? As Opera launches Neon and The Browser Company releases Dia, these aren’t theoretical questions.
Standardisation will probably come, likely driven by Google under regulatory pressure. The UK’s CMA is working now on conduct requirements that are likely to include granular controls for publishers over the use of their content across search and AI. But it may arrive too late. Apple’s testimony that Google searches from Safari dropped for the first time in April caught attention. “AI tools will be used in place of conventional search in many instances in the future, although at the moment this is contained to a small subset of users, principally on desktop,” Buttle says.
Google launched AI Mode, knowing it would cannibalise search ad revenue. Why? “It has no choice but to protect its core business from the threat represented by OpenAI.”
What Actually Works
Publishers face fundamental choices. “Businesses built around scale models - free to access, ad-funded scale publishing - are structurally challenged by this distribution environment.” The future belongs to serving defined audiences with direct relationships you can monetise through reader payment.
Content type matters. “If you just need a piece of information about how to do something, and that’s what you’re publishing, it’s going to be hard to make that work apart from via licensing. But highly visual multimedia content, analysis, opinion-led, personality-led content - that stuff you still need to go to a destination to read.”
Building an AI-resilient business means two things at this moment in time: creating owned-and-operated experiences that continue to draw loyal, direct, monetisable audiences, and alongside that, experimenting with licensing. The licensing market isn’t liquid yet, but “having a head start on understanding where you want to fit into that world puts you on a better strategic footing.”
This requires brutal honesty about which revenue streams AI can replace. “If your content can be easily reconstructed from public information, you’re in trouble,” Buttle warns. Publishers need strategic principles: under what circumstances do you license what content, to whom, and what do good contractual terms look like?
Asked if he’s optimistic or pessimistic about 2026, Buttle pauses. “The industry will probably get smaller. But the businesses that thrive will be the ones creating the most customer value, the most audience value. It’s going to be a period of disruption, but there are paths through this.”
Those paths depend on which part of the sector you occupy and whether you can build something new while keeping your existing business running. For publishers still waiting for clarity before acting, Buttle’s message is blunt: the future’s already here.