AI Deepfakes Threaten Democracy. 'Personality Rights' Could Help

How much is your voice worth?

Author

  • Wellett Potter

    Lecturer in Law, University of New England

It could be as little as A$100. That's how much ABC News Verify recently spent to clone federal senator Jacqui Lambie's voice - with her permission - using an easily accessible online platform.

This example highlights how artificial intelligence (AI) apps that create a synthetic replica of a person's image and/or voice - in the form of deepfakes or voice cloning - are becoming cheaper and easier to use.

This poses a serious threat not only to the functioning of democracy (especially around elections), but also to a person's identity.

Current copyright laws in Australia are inadequate when it comes to protecting people if their image or voice is digitally cloned without their permission. Establishing "personality rights" could help.

Detecting what's fake is difficult

Deepfake technology is able to produce content which seems increasingly real. This makes it harder to detect what is fake and what is not. Indeed, several people for whom the ABC played the voice clone of Senator Lambie did not initially realise it was fake.

This shows how unauthorised deepfakes and voice cloning can be easily used to generate misinformation. They can also be extremely damaging to individuals.

This was highlighted back in 2020, when one of Australia's first political deepfake videos was released. It featured the then Queensland premier Annastacia Palaszczuk claiming the state was "cooked" and in "massive debt".

The video received around 1 million views on social media.

What laws cover this?

In Australia, defamation, privacy, image-based abuse laws, passing off and consumer protection laws might be applicable to situations involving deepfake video or audio clips. You may also be able to lodge a complaint with the eSafety Commissioner.

In theory, copyright law can also protect a person's image and voice. However, its application is more nuanced.

First, a person whose likeness has been cloned by an AI platform often does not own the source material. This material could be an image, video or voice recording which has been copied and uploaded. Even if your image and voice are depicted, if you are not the owner of the source material, you cannot sue for infringement.

Using Senator Lambie as an example, the ABC only needed 90 seconds of original voice recording to create the AI clone. Senator Lambie's voice itself cannot be copyright-protected. That's because copyright can only attach to a tangible expression, say in written or recorded form. It cannot attach to speech or unexpressed ideas.

As the ABC arranged, recorded and produced the original 90-second recording, the broadcaster could hold copyright in it as a sound recording. It is a fixed, tangible expression of Senator Lambie's voice. However, unless the senator and the ABC made an agreement, Senator Lambie would have no economic rights, such as the right of reproduction, in the original voice recording. Nor would she have any rights to the clone of her voice.

In fact, the AI-generated clone itself is unlikely to be protected by copyright, as it is considered authorless under Australian copyright law. Many AI-generated creations are currently unable to be protected under Australian copyright, due to a lack of original, identifiable human authorship.

Moral rights - including the right of attribution (to be credited as the performer), the right against false attribution and the right of integrity - are also limited in scope. They could apply to the original audio clip, but not to a deepfake.

What are 'personality rights'?

In most jurisdictions in the United States, there exist what are commonly known as "personality rights". These rights include the right of publicity, which acknowledges that an individual's name, likeness, voice and other attributes are commercially valuable.

Celebrities such as Bette Midler and Johnny Carson have successfully exercised this right to prevent companies using elements of their identity for commercial purposes without permission.

However, personality rights might not always apply to AI voice clones, with some lawyers arguing that only actual recorded voices are protectable, not clones of voices. This has led to states such as Tennessee introducing legislation to specifically address AI-generated content. The Ensuring Likeness, Voice, and Image Security Act, passed in 2024, addresses the misappropriation of an individual's voice through generative AI use.

Urgent steps are needed

There has been longstanding scholarly debate about whether Australia should introduce statutory publicity rights.

One of the challenges is overlap with pre-existing laws, such as Australian consumer law and tort law. Policymakers might be hesitant to introduce a new right, as these other areas of the law may provide partial protection. Another challenge is how to enforce these rights if an AI-generated deepfake is created overseas.

Australia could also consider introducing a law similar to the "No Fakes Bill" currently being debated in the US. If passed, this bill would allow people to protect their image and voice through intellectual property rights. An approach like this deserves serious consideration in Australia too.

Deepfakes are becoming increasingly common, and are now widespread during elections. Because of this, it's important that Australians remain vigilant to them in the lead-up to this year's federal election.

And let's hope that whoever wins that election takes urgent steps to better protect everyone's image and voice.

The Conversation

Wellett Potter does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

/Courtesy of The Conversation. This material from the originating organisation/author(s) may be of a point-in-time nature, and edited for clarity, style and length. Mirage.News does not take institutional positions or sides, and all views, positions, and conclusions expressed herein are solely those of the author(s).