News Publishers Fight Back: Inside the Make It Fair Campaign
Owen Meredith, CEO of the News Media Association, on why the publishing industry is pushing back against AI companies scraping content without permission
For almost three years, AI companies have been scraping news content to train their large language models, mostly without asking for permission. Some major UK, European, and American publishers have done direct deals with the various LLM companies; however, this is the exception, and only the largest publishers have the weight to negotiate such deals. The publishing industry’s response in the UK is the Make It Fair campaign, which aims to get the LLM and tech companies to pay for the journalism and content that they use.
Owen Meredith is the CEO of the News Media Association, which represents around 35 publishers producing approximately 900 titles. I talked to him about why this campaign matters and what’s at stake for the industry.
How we got here
The campaign came about because of a government consultation on copyright and AI launched in late 2024. The proposal was an opt-out system: AI developers would get free access to copyrighted material unless rights holders explicitly opted out. Meredith thinks this is both unfair and unworkable.
“Whilst in theory you can opt out or identify some kind of opt-out on owned and operated platforms, you then need technical compliance from AI developers.”
“There’s so much downstream and syndicated replication of content that you don’t have control over every single platform that your content appears on within that supply and discovery chain.”
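In practice, the main opt-out mechanism publishers have today is robots.txt, and it only works if a crawler chooses to honour it. Here is a minimal sketch in Python of how one might check whether a site’s robots.txt nominally disallows self-declared AI crawlers; the publisher domain is a placeholder and the crawler list is illustrative, not exhaustive.

```python
from urllib.robotparser import RobotFileParser

# Illustrative (non-exhaustive) list of self-declared AI crawler tokens.
AI_CRAWLERS = ["GPTBot", "Google-Extended", "CCBot", "ClaudeBot"]

# Placeholder publisher domain used purely for illustration.
rp = RobotFileParser()
rp.set_url("https://example-publisher.co.uk/robots.txt")
rp.read()

for agent in AI_CRAWLERS:
    allowed = rp.can_fetch(agent, "https://example-publisher.co.uk/news/some-article")
    print(f"{agent}: {'allowed' if allowed else 'disallowed'}")
```

Nothing in that file binds the crawler: a developer that ignores it, or that picks up the same article from a syndication partner whose robots.txt says nothing, faces no technical barrier, which is Meredith’s point about downstream replication.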
The campaign launched to coincide with the consultation’s closure in February 2025, with the message splashed across the front pages of every national and local newspaper in the country. What started as a news industry initiative quickly attracted support from across the creative sector, presenting a unified voice to the government about the importance of copyright protection.
“We wanted to come up with a campaign that everybody could get behind and have a unifying message that was simple to understand, resonated with the consumer, and would achieve our political aim with the government,” says Meredith.
The traffic argument doesn’t hold up anymore
Tech companies have long argued that they drive traffic to publisher sites for free, so why should they pay? That argument is falling apart. Reuters research shows referral traffic from search engines is down by about a third over the last six months, and senior executives expect it to fall by two-thirds over the next five years.
Google’s AI Overviews are keeping users within its own “walled garden ecosystem,” using indexed news content but preventing the click-through that publishers rely on. Meredith argues that publishers should have a choice about which platforms they appear on and how they’re monetised.
“If developers want access to rich, high-quality content and data, they should be paying for it in the same way that any other business pays for the component parts of their end product,” he states firmly.
What’s it worth?
The News Media Association published research suggesting news content is worth about £1 billion to Google and Meta. That’s not a bill; it’s what the content adds to their platforms. For AI specifically, Meredith says it’s not his job to put a price on content. That’s for publishers and developers to negotiate between themselves.
However, the government’s prolonged consultation process has created damaging confusion. “They’ve created a level of ambiguity, and because it’s now been nearly 12 months since they started that process, they’ve yet to put any clarification back into the marketplace,” Meredith notes. “The embryonic licensing market that did exist has been sort of choked off as a result of government policy and government inaction.”
What Meta did in Canada could happen here
Meta blocked news content in Canada and Australia when faced with regulation. They’d spent over a decade getting publishers to build communities on Facebook, only to pull the plug.
“To then pull the rug out and say, ‘We’ve now captured all of your users, off you go’—that is incredibly unfair business practice,” Meredith says.
Young people still read news brands
There’s a lot of talk about younger audiences moving to TikTok and Instagram, but the Newsworks Youth study tells a different story: 11 million 18-34-year-olds (73% of that age group) access news brands each month, more than use TikTok (10 million) or Snapchat (7 million). The full report is available on the Newsworks website.
“Nine out of 10 young people under 25 read news online. Seven out of 10 access that via a news brand platform,” Meredith says. “On average, under-25s consume six news articles a day.”
The trust gap matters too. Content from news publishers is 85% more trusted than social media content. As people become more sophisticated about what they read online, they’re recognising the value of fact-checked journalism.
The attribution problem
There’s also a reputational problem. AI chatbots hallucinate, sometimes attributing false information to major news brands. “The consumer experience will be weakened if consumers cannot trust the content they read,” Meredith says.
Getting proper commercial relationships in place between AI firms and publishers would help everyone, including consumers who need to trust what they’re reading and where it came from.
What winning looks like
If the campaign works, the UK keeps its copyright regime intact and creates a proper licensing market for news content. That’s good for publishers, good for AI developers who get access to quality content, and good for the UK economy.
The Competition and Markets Authority is now designating platforms with strategic market status, which should help rebalance things. But the industry is still waiting for the government to clarify its copyright policy. In the meantime, publishers are working on technical solutions to stop unauthorised scraping and trying to negotiate the commercial deals that should have existed from day one.
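What do those technical solutions look like? One common approach, sketched below in Python purely as an illustration (this is not any specific publisher’s setup, and the list of crawler tokens is an assumption, not a complete inventory), is to refuse requests from crawlers that identify themselves with known AI user-agent strings. It only deters crawlers that declare themselves honestly.

```python
# A minimal WSGI middleware sketch: return 403 for requests whose
# User-Agent contains a known AI-crawler token. The token list is
# illustrative only, and scrapers can evade this by changing their
# user agent, so it is a deterrent rather than a guarantee.

BLOCKED_TOKENS = ("GPTBot", "CCBot", "ClaudeBot", "Bytespider")

def block_ai_crawlers(app):
    def middleware(environ, start_response):
        user_agent = environ.get("HTTP_USER_AGENT", "").lower()
        if any(token.lower() in user_agent for token in BLOCKED_TOKENS):
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Automated crawling for AI training is not permitted."]
        return app(environ, start_response)  # pass normal traffic through
    return middleware
```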
The question was never whether AI companies need news content. It’s whether they’ll pay for it.