The Controversy Unfolds
In recent weeks, ScotRail faced widespread backlash after unveiling an artificial intelligence-generated announcer voice named Iona. At first glance, the move seemed like a technological upgrade, an attempt to modernize passenger services and highlight the potential of AI-driven solutions. Yet the voice immediately drew criticism for a striking reason: it sounded almost identical to Scottish voice actor Gayanne Potter, whose voice had long been associated with train announcements across Scotland.
Potter alleged that her voice had effectively been cloned without her permission. The voice she built her career upon, a core part of her professional identity, had been replicated by a machine, threatening her livelihood and raising profound questions about consent in the digital age. Within days, the story shifted from a simple software trial to a wider debate about ownership, intellectual property, and the human cost of unchecked AI adoption.
The backlash was swift and decisive. Passengers criticized ScotRail’s choice on social media, industry experts weighed in on the legal grey areas of voice replication, and colleagues in the voiceover industry called the move exploitative. Facing mounting pressure, ScotRail announced it would discontinue use of Iona. While the decision marked a victory for Potter, the controversy sparked a conversation far bigger than one company’s experiment.
The Significance of Voice Identity
At the center of this case lies the issue of voice identity. For professional voiceover actors, the voice is more than just a biological feature; it is a cultivated tool, refined through years of training and experience. It is also deeply personal. Just as a writer has ownership over their words and a painter over their brushstrokes, a voice actor has ownership over their sound.
Potter’s reaction reflected this reality. Having worked extensively in broadcasting and voiceover work, she understood the value of her voice not only as an income source but also as part of her creative legacy. By introducing an AI-generated clone without her consent, ScotRail inadvertently challenged the very concept of voice ownership. Was her voice simply a set of frequencies that could be replicated at will, or was it intellectual property deserving the same protections as visual likenesses or written works?
This question is not new, but the ScotRail case thrust it into the public spotlight. In recent years, advances in text-to-speech synthesis and deep learning have made it possible to mimic voices with minimal audio samples. While this technology has legitimate uses, such as accessibility tools for people who lose the ability to speak, it also creates risks of exploitation, impersonation, and job displacement.
Industry Implications: Beyond ScotRail
Though ScotRail has ended its trial, the controversy highlights challenges that extend across industries. The entertainment sector, advertising, and even audiobook narration are already grappling with the rise of AI-generated voices. Contracts for voice actors increasingly include clauses addressing AI training, with unions pressing for stronger safeguards.
The ScotRail case illustrates what can happen when organizations deploy AI voices without clear ethical frameworks. While the company claimed it never intended to cause harm, the result was reputational damage, legal scrutiny, and public distrust. Other organizations considering similar technology are now on notice: transparency and consent are non-negotiable.
The incident may also accelerate calls for new legislation. At present, many legal systems lack explicit protections for voice identity. Laws around image rights and likeness already exist in several countries, but voice rights remain a developing area. As AI tools become more sophisticated, governments will face growing pressure to close these gaps and protect performers from unauthorized replication.
Human vs. Artificial: What’s at Stake
A key takeaway from the ScotRail controversy is that AI cannot replace the human qualities of performance. Train announcements may seem like a straightforward service, but the familiar cadence, warmth, and reassurance of a human voice carry emotional resonance. Passengers in Scotland associated Potter’s voice with dependability, and the abrupt switch to an artificial clone felt disingenuous.
This distinction extends to all forms of voice work. AI can replicate tone and inflection, but it struggles to capture the lived experience behind a performance. Subtle elements such as the smile in a narrator’s delivery, or the tension in a dramatic pause, often come from human instinct rather than algorithms. For many fans and professionals, this is what makes voice acting a craft rather than a function.
The risk of overreliance on AI voices is not just economic; it is cultural. If companies prioritize cost-cutting over artistry, audiences may eventually face a world of homogenized performances, stripped of the individuality that makes characters and announcements memorable. The ScotRail case illustrates that audiences notice, and they care.
A Broader Conversation About Consent and Technology
In the aftermath of ScotRail’s decision, one theme became clear: consent must be at the heart of AI voice use. For voice actors, this means contracts should explicitly outline how recordings can and cannot be used. For companies, it means recognizing that innovation without permission can lead to backlash and harm. For audiences, it means understanding the value of human artistry in a world increasingly mediated by technology.
Potter’s victory may serve as a precedent for other performers facing similar issues. It demonstrates that public pressure, combined with professional advocacy, can hold organizations accountable. But it also signals a need for proactive solutions. Industry bodies, unions, and legislators will need to work together to ensure that AI serves as a complement to human talent, rather than a means of exploitation.
The ScotRail AI announcer was only one experiment, but the questions it raises are universal. As AI tools become more embedded in daily life, society must decide where to draw the line between innovation and infringement. For now, the voice of Iona has been silenced, but the conversation it sparked has only just begun.