Young voters don’t trust political ads. A new campaign wants to change that before May
With UK local elections and elections to the Welsh and Scottish Parliaments taking place on 7 May, the advertising industry has launched an updated version of its political advertising literacy guide aimed at 16–24-year-olds. The campaign is a collaboration between Media Smart, the industry’s long-running education programme, and the Advertising Association.
The timing reflects a specific tension that has been building since the 2024 General Election, when the guide was first produced. Young voters are not simply uninformed about political advertising — they are actively suspicious of it. Research from Next-Gen Media’s Youth Panel, surveying more than 2,000 UK-based 16–24-year-olds in April 2026, found that 58% do not trust the political advertising they see. That figure alone is a problem for any campaign strategist. But the more striking number is this: 84% of respondents said they were worried about being manipulated by AI-generated imagery in political advertising.
That concern is not irrational. The rules governing political advertising in the UK differ from those governing commercial advertising. The Advertising Standards Authority, which regulates commercial ads, has no remit over election campaign material. Political parties are instead subject to electoral law, administered by the Electoral Commission, and to platform-specific policies that vary considerably. Stephen Woodford, CEO of the Advertising Association, makes the point directly: “Election advertising is not regulated by the ASA, unlike commercial ads.” For a generation that has grown up with a degree of consumer protection baked into the digital environments they use, that gap is not immediately obvious, and the current media environment makes it harder to spot.
What the guide covers
The 10-point guide, updated for 2026, explains how political advertising operates across different digital platforms, what transparency obligations exist, and how to evaluate what you see.
It includes specific guidance on AI-generated content and on imprints, the legal requirement for political material to identify who produced and paid for it. The imprint issue is not trivial: 46% of those surveyed said they did not know what an imprint is, meaning nearly half of the target audience cannot perform even the most basic authenticity check on the political content in their feeds.
“Political content is increasingly shaped by AI and digital tools, which can make it harder for young people to tell what is real and who is behind the political advertising they see. By updating this campaign, we want to help young voters better understand the rules that apply to election advertising and encourage the use of critical thinking skills when looking at political adverts.”
Rachel Barber-Mack, Executive Director of Media Smart
The campaign material was developed by Livity, a youth-focused creative agency that worked on the original 2024 version. This edition also includes expanded guidance for Scotland, reflecting the multi-jurisdictional nature of the May elections: three separate electoral processes running simultaneously, which creates a particular challenge for any educational resource trying to be accurate across the board.
Taking the message off-platform
The campaign’s distribution mechanism is worth noting. Rather than relying entirely on social media, itself a contested, algorithmically unpredictable environment for public information campaigns, Media Smart has partnered with Next-Gen Media to run six creative executions across a network of digital out-of-home screens at universities and colleges. The network claims a daily reach of around 250,000 young people. Guy Thurlow, Co-Founder of Next-Gen Media, describes the rationale: “Through our digital out-of-home network, we can bring these important campaign messages directly into trusted, real-world environments like universities and colleges.”
The choice of physical environments as the delivery vehicle for a digital literacy message has a certain logic. Universities and colleges carry institutional credibility that social platforms do not, and the audiences in those spaces are, in many cases, engaging with an election for the first time.
The Advertising Association’s involvement reflects a broader industry position that has been developing over the past few years. The trade body has consistently argued that advertising self-regulation — the system it helped establish more than 50 years ago — works precisely because it builds audience trust. Political advertising sits outside that system by design, which creates a structural tension: the same industry that promotes responsible commercial communication has no formal mechanism to apply those standards to political content. Supporting media literacy education is, in part, how the sector responds to that gap without overstepping into territory that belongs to electoral regulators.
For Media Smart, the campaign extends the work the programme has been doing since it was incorporated into the Advertising Association in 2023. The organisation’s resources — which cover everything from influencer marketing to scam advertising — have been downloaded nearly 200,000 times over the past decade, and the programme claims to have reached 12 million young people. It is funded by a cross-industry coalition of advertisers, agencies, media owners, trade organisations and technology companies.
The political advertising guide is available to download via Media Smart’s website. The out-of-home campaign is now running across the Next-Gen Media network, ahead of the 7 May polling dates.
The underlying numbers from the Youth Panel survey speak to a broader erosion of trust in information environments that the advertising and media industries will need to keep engaging with well beyond this election cycle. A generation entering voting age already sceptical of the political content it encounters — and specifically worried about AI manipulation — presents a problem that no single educational guide can resolve. But the industry’s capacity to reach young people at scale, with resources grounded in actual platform mechanics rather than generalised media warnings, is a more substantive intervention than the format suggests.