In response to mounting concerns about the potential impact of Artificial Intelligence (AI) on Australia’s cultural landscape, the Media, Entertainment & Arts Alliance (MEAA) has called for immediate legislative action. The union, representing Australia’s cultural workforce, has warned that AI could threaten jobs in the creative sector and undermine public trust in the media.
In a detailed submission to the parliamentary inquiry into AI adoption, MEAA highlighted how the work of Australian creatives and journalists is being used without their consent or compensation to train AI systems. MEAA’s Federal President, Michael Balk, emphasized members’ unease about the potential for AI technologies to devalue original work and spread misinformation.
“Artificial Intelligence represents the most significant shift in the relationship between work and production since the Internet’s advent,” Balk stated. He cautioned that unchecked AI could degrade the authenticity of artistic and media content, jeopardizing public trust, job security, and working conditions for creatives and journalists.
MEAA’s call for action includes legislation that mandates disclosure of the data used to train AI and enshrines creators’ right to consent to, and be compensated for, the use of their work. This move aims to prevent AI platforms from exploiting creative outputs without due recognition and payment.
A recent survey of MEAA members found that three-quarters were extremely concerned about the theft of their intellectual or creative work. In addition, 74% expressed significant concern over the potential spread of misinformation, 70% were worried about the proliferation of harmful content, 66% feared a loss of human-led creativity, and 59% were anxious about AI-related job losses.
In its submission, MEAA urged the government to implement new laws and enhance oversight to ensure transparency and accountability, mitigating the threats of misinformation and disinformation. The union also called for updates to industrial relations laws to guarantee that workers are consulted regarding any use or intended use of AI in their workplaces.
AI has already featured in several copyright infringement allegations in Australia. Notably, a group of voice artists accused AI developers of cloning their voices without consent. There have also been instances of fake Indigenous art being produced and sold online, depriving First Nations creatives of rightful recognition and compensation. These incidents underscore the urgent need for legislative reform.
The media sector has not been immune to these challenges. Revelations have surfaced about a senior media executive involved in setting up websites that used AI to rewrite and republish plagiarized stories. MEAA journalist members have also raised ethical concerns that the use of generative AI in newsrooms could compromise adherence to the MEAA Journalist Code of Ethics.
Balk highlighted the critical importance of Australia’s unique culture, shaped by storytellers, artists, actors, dancers, and musicians, and of public interest journalism to a healthy democracy. “Since the dawn of human history, technological change has influenced artistic and cultural expression and news reporting,” he said. However, he stressed that these creative processes have always relied on human imagination and technical skill.
“What would the world be like without the creative workers who tell our stories, inform our communities, and hold our institutions accountable?” Balk pondered. “Unless we address the very real risks posed by AI, we might soon find out.”
MEAA’s appeal to the government is clear: protect Australia’s cultural and creative sectors from the unchecked growth of AI. The proposed regulations aim to safeguard the integrity of creative works and maintain public trust in the media, ensuring that the contributions of human creators remain at the forefront of artistic and journalistic endeavors.