Generative artificial intelligence (GenAI) is on the verge of transforming education, and homework will be no exception, with some even sounding its death knell. But experts say that's premature: GenAI is more likely to change how homework works than to make it obsolete.
That transformation will likely include a shift to homework practices that encourage deeper engagement and application of knowledge rather than mere retrieval, according to Lynn Gribble, Associate Professor in the School of Management and Governance at UNSW Business School. She says GenAI represents an opportunity to repurpose homework as a more efficient vehicle for preparing students for a technologically advancing world.
"We need to move from asking students just to look something up or write something to getting them to do something with what they know," she says. "We need to understand at the core of a discipline or practice what it is to be able to do that well."
Risks and guardrails
The integration of GenAI into education comes with substantial risks and represents a paradigm shift that requires careful consideration and adaptation. Recognising this, the federal government recently released a report, following a parliamentary inquiry, with 25 recommendations for managing GenAI's risks and opportunities, including making the use of GenAI in education a national priority, taking steps to ensure equal access and integrating AI literacy in school curricula.
According to Jihyun Lee, a Professor in the School of Education at UNSW Arts, Design & Architecture, GenAI can be an "excellent assistant" for those willing and able to use it. But it's already creating challenges for educators, particularly in these early days as the technology develops.
"Thus far, with the uncertainty and less-than-perfect AI performances, the workload of teaching professionals to address the AI impact has increased," she says.
"For example, many educators have reverted to in-person, paper-and-pencil tests in the classroom. I am not sure if AI can handle routine tasks accurately without close human supervision," she says. "Researchers have also shown that AI increases intimidation and cognitive load for lower-ability students, and thus GenAI in the current form is not useful for every student."
Introducing this technology into classrooms, and particularly into homework practices, comes with other substantial risks, including concerns about equitable access and data bias. From a learning standpoint, students who rely solely on AI for answers may also miss out on developing critical thinking and research skills.
As A/Prof. Gribble says, generative AI is "designed to give you something that's plausible; it's never been designed to be truthful or accurate".
Citing GenAI's risks and the early stage of its development, the parliamentary report recommended the government "create safeguards for all users, especially minors, monitor current pilot programs and evaluate the different approaches to using GenAI education tools in schools, including as a study buddy".
The report also called for working with key partners to promote fit-for-purpose GenAI tools based on the Australian curriculum, with local and inclusive inputs. Other recommended safeguards included providing GenAI literacy and training to educators, students, parents and guardians, and policy makers; working with the eSafety Commissioner in supporting educators on how to use GenAI ethically, safely and responsibly; and identifying unacceptable risks in the education sector.
Insights for educators
One way for educators to retain value from homework in a GenAI-assisted reality is to refocus homework and other assignments away from rote learning tasks and towards critical thinking, A/Prof. Gribble says. That may mean allowing, or even encouraging, students to use GenAI but requiring them to demonstrate what they learned from that process. These tools can also act as tutors, offering explanations and helping with complex subjects that might be beyond one's expertise.
A/Prof. Gribble considers what work a student who has completed her class should be able to do, and focuses assignments on building those skills, even if students use GenAI along the way. "I want a student to be able to explain how context impacts organisations, and how it impacts them in organisations," she says.
This will require "real, face-to-face, authentic assessment and evaluation, and getting students to show us the process of how they might do something," she says.
While essays, reading, problem sets and other at-home learning methods will likely change in nature, they need not become obsolete. Instead, educators can set engagement rules that allow students to use GenAI as a tutor or 'reading buddy' but require them to demonstrate understanding in class discussion, or ask them to critique the AI's output, checking for errors or offering counterpoints.
Key challenges ahead
Parents and guardians should also be prepared to support their children in a GenAI era by understanding these evolving technologies, as well as school expectations and policies around how GenAI is used at home, Prof. Lee says.
"Although it may seem obvious, parents should be aware of which AI tools their children are using, as well as how and for what purposes," she says.
For students, an important part of learning will be recognising the different skill sets that will matter in an AI-assisted world, A/Prof. Gribble says, noting that while GenAI can assist in many tasks, it is not a substitute for human creativity and insight.
"Question. Fact-check. Where's the human in the loop? Where are the morals and the ethics – is this what a good person would do or say?" She says that the challenge for education will be integrating GenAI in a way that complements rather than replaces traditional methods, emphasising the human element of both teaching and learning.
"What we need to make sure as educators is that we are the storytellers; that we are the people inviting people to see how knowledge, knowledge application and critical thinking – being able to unpack assumptions – makes the world a better place," A/Prof. Gribble says.