New laws are urgently needed to regulate the burgeoning use of Artificial Intelligence (AI) amid mounting concern that Australia's unique culture will be threatened by the loss of creative sector jobs and that public trust in the media will be undermined, says the union for Australia's cultural workforce.
In a submission to the Parliamentary inquiry into the adoption of AI, the Media, Entertainment & Arts Alliance warns that the work of Australian creatives and journalists is being systematically scraped to train AI, without their knowledge, consent, or compensation.
MEAA urges the government to introduce legislation requiring the disclosure of any data used to train AI and enforcing creators' right to consent to, and be compensated for, the use of their work, to prevent its theft by AI platforms.
MEAA Federal President Michael Balk said there was great unease among members about the rise of AI technologies and their potential to devalue the original work of artists, creators, and journalists, and to mislead and misinform audiences.
"Artificial Intelligence presents the most profound change in the relationship between work and production since the advent of the Internet," he said.
"If left unchecked, the increased use of AI tools poses a profound threat to the credibility and authenticity of artistic and media content presented to audiences, undermining public trust, along with the loss of jobs and the degradation of conditions in creative and journalistic work.
"MEAA members hold grave concerns that policymakers have been too slow to respond to the profound challenges that AI presents to our cultural and working lives.
"We are calling on the government to take urgent action to protect their hard work and livelihoods from AI theft."
A recent survey of MEAA members revealed that three-quarters were extremely concerned about theft of intellectual or creative work. Other key concerns included the potential spread of misinformation (with 74% reporting extreme concern); proliferation of deliberately harmful content (70%); potential loss of human-led creativity (66%); and AI-related job losses (59%).
In its submission to the parliamentary inquiry, MEAA has called for new legislation and greater government oversight to boost transparency and accountability and to mitigate the threat of misinformation and disinformation.
Industrial relations laws must also be updated to ensure that workers are consulted on any use or intended use of AI in the workplace.
AI has already been at the centre of numerous allegations of copyright theft in Australia, including accusations by a group of voice artists that their voices were cloned and used by AI without their consent.
AI is also driving the production of fake Indigenous art, which is being sold online without the consent of, or compensation to, First Nations creatives, further highlighting the need for legislative reform.
The issue is also prevalent in the media, with recent revelations that a senior media organisation executive was involved in setting up websites that republished stories that had been plagiarised from elsewhere and rewritten using AI.
MEAA journalist members have also raised ethical concerns about how the use of generative AI for editorial and production purposes in newsrooms affects their ability to comply with the MEAA Journalist Code of Ethics.
Mr Balk said Australia's unique culture had benefited from the contributions of storytellers, artists, actors, dancers, and musicians, while the health of our democracy depended on the commitment of journalists who deliver public interest journalism that informs the public and holds those in power to account.
"Since the beginning of human history, technological change has influenced how we engage in artistic and cultural expression and how we tell stories and report current affairs," he said.
"While this has always been the case, we must be conscious that these creative processes have always required the imagination and technical skill of the human at their centre.
"What would the world be like without the creative workers that tell our stories, inform our communities and hold our institutions accountable?
"Unless we deal with the very real risks that AI poses, we could be about to find out."