After becoming mainstream in 2023, generative artificial intelligence (AI) is now transforming the way we live.
This technology is a type of AI that can generate text, images and other content in response to prompts. In particular, it has transformed the way we consume and create information and media.
For example, millions of people now use the technology to summarise lengthy documents, draft emails and increase their productivity at work. Newsrooms have also started experimenting with generative AI, and film companies are using it to create digital doubles of actors, and even "digital clones" of actors who have died.
These transformations are bound to increase in the coming months and years. So too are the many concerns and controversies surrounding the use of generative AI.
In the face of these complex and rapid developments, we surveyed more than 4,000 Australians to better understand their experiences with and attitudes toward generative AI. Released today, our results paint a complicated picture - and underscore the vital importance of improved media literacy programs.
Who is using generative AI in Australia?
Between January and April this year, we surveyed a representative sample of 4,442 adult Australians. We asked people a range of questions about their media use, attitudes and abilities, including a series of questions about generative AI.
Just under four in ten (39%) adults have experience using text-based generative AI services such as ChatGPT or Bard: 13% use these services regularly, while a further 26% have tried them.
An additional three in ten (29%) adults know of these services but have not used them, while 26% are not at all familiar with these services.
Far fewer Australians are using image-focused generative AI services such as Midjourney or DALL-E. These kinds of services can be used to create illustrations or artworks, adjust or alter photographs, or design posters.
Only 3% use these services regularly and 13% have tried or experimented with them. Half (50%) of adults are not at all familiar with image-based AI services, while 28% have heard of these services but have not used them.
Some groups are much more likely to be using generative AI.
Regular use is strongly correlated with age: younger adults are much more likely to use generative AI regularly than older adults. Adults with a high level of education are also much more likely to be using this technology, as are people with a high household income.
Australians are worried about generative AI
Many Australians believe generative AI could make their lives better.
But more Australians agree that generative AI will harm Australian society (40%) than disagree (16%).
This is perhaps why almost three quarters (74%) of adult Australians believe laws and regulations are needed to manage risks associated with generative AI.
Just one in five (22%) adults are confident about using generative AI tools, although 46% say they want to learn more about the technology.
Significantly, many people said they don't know how they feel about generative AI. This indicates many Australians don't yet know enough about this technology to make informed decisions about its use.
The role for media literacy
Our survey shows the more confident people are about their media abilities, the more likely they are to be aware of generative AI and confident using it.
Adult media literacy programs and resources can be used to increase people's media knowledge and ability. These programs can be created and delivered online and in person by public broadcasters and other media organisations, universities, community organisations, libraries and museums.
Media literacy is widely recognised as being essential for full participation in society. A media literate person is able to create, use and share a diverse range of media while critically analysing their media engagement.
Our research shows there is a need for new media literacy resources to ensure Australians are able to make informed decisions about generative AI. For example, this kind of education is crucial for helping adults develop the digital know-how to determine whether images are real and can be trusted.
In addition, media literacy can show people how to apply critical thinking when engaging with generative AI. For example, if a person uses an AI tool to generate images, they should ask themselves:
- why has the AI tool created the image in this way, and does it reinforce social stereotypes or biases?
- could I use a different prompt to encourage the AI to create a more accurate or fairer representation?
- what would happen if I experimented with different AI tools to create the image?
- how can I use the advanced features within an AI tool to refine my image to produce a more satisfactory result?
- what kind of data has the AI been "trained on" to produce this kind of image?
Without interventions, emerging technologies such as generative AI will widen existing gaps between those with low and those with high confidence in their media abilities.
It's therefore urgent for Australian governments to provide appropriate funding for media literacy resources and programs. This will help ensure all citizens can respond to the ever-changing digital media landscape - and fully participate in contemporary society.