Artificial intelligence (AI) now poses a significant threat to Australia’s creative industries, prompting calls for stronger government regulation, a Senate committee heard this week.
The Select Committee on Adopting Artificial Intelligence conducted a series of public hearings in Canberra, where representatives from the local creative sector expressed serious concerns. Key issues discussed included rampant copyright infringement linked to generative AI development, the displacement of human creative workers by AI, and insufficient legal safeguards.
Claire Pullen, CEO of the Australian Writers’ Guild (AWG), emphasised to the committee that generative AI models are detrimental to the livelihoods of creative professionals. She asserted that these models “rely on infringing Australian intellectual property”, undermining the ability of creative workers to earn a living. Pullen advocated for a regulatory framework enabling the detection of intellectual property infringements affecting the guild’s members.
A collaborative submission from organisations representing writers, screen editors, production designers and cinematographers underscored the necessity for clear guidelines. The submission called for “unambiguous guidelines to encourage the use of only safe and responsible AI, reinforced by rigorous, forward-looking legislation to provide strong protections” for the Australian creative sectors.
The Issue of Digital Cloning in the Voice Acting Industry
Representatives from the Australian Association of Voice Actors (AAVA) also presented their case to the committee, revealing the coercive tactics employed by large organisations and production companies. These tactics involve compelling voice actors to sign contracts that permit the creation of digital clones of their voices for future use, often without their consent.
Simon Kennedy, President of AAVA, detailed how these organisations exploit the financial vulnerabilities of voice actors, using essential work opportunities as leverage to secure consent for digital cloning. Kennedy described the constant and sometimes aggressive pressure faced by voice actors, noting that contracts predating generative AI technology are being misused to justify training language models and creating digital clones.
One prominent example involved Audible, owned by Amazon, which allegedly includes clauses in its contracts allowing for the creation of digital versions of voice actors’ voices. Kennedy explained that these clauses leave room for the voices to be cloned for future products without any discussion of equitable compensation, effectively replacing human actors with their digital counterparts.
Cases of Unauthorised Use and Industry Concerns
The AAVA provided several instances of unauthorised use of digital voice clones. One case involved an online animation series that purportedly cloned the voices of contracted performers without their knowledge or consent. Two voiceover artists had their contracts terminated prematurely, only to discover that their AI-generated voices were being used in new episodes.
The organisation also highlighted an Australian radio network investing in technology to replace human voice actors, a move seen as a betrayal of an industry that has relied on voice actors for over a century to deliver quality and authenticity.
In another instance, a voice actor contracted with an AI text-to-speech company found their voice being used in pornography ads, despite contractual assurances that it would not be employed for explicit content. This misuse exemplifies the broader issue of AI technology being exploited without adequate oversight or consent.
Calls for Comprehensive Reforms
In response to these issues, the AAVA urged the committee to recommend a series of reforms aimed at safeguarding the creative sectors from AI-related harms. The organisation warned that without stringent regulations, the voice-acting industry could become dominated by a few technology companies, with dire consequences for professionals and the broader community.
The AAVA stressed the need for a balanced approach that protects the rights and livelihoods of creative workers while fostering responsible AI development. They concluded that unchecked AI advancements could lead to catastrophic outcomes for all Australians who rely on their voices for their livelihood and personal expression.
As the Senate committee deliberates on these matters, calls for stronger regulation continue across the creative industries, underscoring the urgency of addressing the challenges AI poses and of securing a sustainable, fair future for all stakeholders involved.