Is AI the death of homework? Education in a new landscape

As the education system grapples with the impact of generative AI, experts say it’s about transformation, opportunity – and, of course, risk

Whether educators, parents and students are ready (or not) for generative artificial intelligence (AI), this fast-evolving technology is already reshaping learning methods and traditional approaches to education.

And its impact on one well-known aspect of education – homework – is a force all three will have to reckon with, but experts say it’s premature to declare the death of homework. Instead, it’s better to view this as a transformative moment of opportunity, akin to previous generations of students gaining access to calculators and then smartphones and Google in their pockets. “This is not bad – it’s just a different way of being right,” said Lynn Gribble, Associate Professor in the School of Management and Governance at UNSW Business School. “We need to move from ‘good’ and ‘bad’ to a different way of being.”

The integration of AI into education also comes with substantial risks and represents a paradigm shift that requires careful consideration and adaptation. While generative AI offers exciting possibilities, it also necessitates a re-evaluation of traditional homework and assessment methods, according to Jihyun Lee, a Professor in the School of Education at UNSW Arts, Design & Architecture.

“GenAI will continue to transform the concepts of learning process, learning outcomes, assessment of learning and homework,” she said. “The future of learning will be centred on process-oriented activities, documenting how students learn with and/or without AI.”

UNSW Business School's Associate Professor Lynn Gribble says the greatest risk with generative AI is "trusting the machine", as artificial intelligence is not sentient or all-knowing. Photo: UNSW Business School

The key to making AI work for the education system will be recognising its risks, including significant equity and access concerns. Setting up guardrails is essential – as is redesigning how education, and homework in particular, interacts with technology to ensure students benefit from the presence of AI in their learning environments rather than fall behind because of it. “For whatever good it can do, is there an unintended consequence?” A/Prof. Gribble said. “If it’s always working everything out for you, then the real question is, do you know how to do that work for yourself?”

The new age of homework

While homework may feel like part of the fabric of education – particularly for primary school students – it’s also been contentious for many years, spurred by debate over its utility and purpose. So, what might the advent of generative AI mean for the future of homework in an education landscape where its place was already being questioned?

As A/Prof. Gribble explained, however, ChatGPT and other generative AI tools are unlikely to render homework irrelevant. Instead, the conventional approach to homework will have to transform, capitalising on AI’s opportunities while building systems to deal with its risks.

First, it’s about recognising what homework is designed to do; for example, homework that’s “about skilling and drilling is different from homework that’s about independent study”, A/Prof. Gribble said. “Is it reconfirming what you have learnt during the day, or is it an opportunity for parents to see what their children are learning?”

It’s then about determining where AI can help, using its strengths where it improves things, and keeping the crucial “human in the loop” where that is beneficial. There will be a place for both in the emerging education landscape, including the realm of homework.


“AI has a fantastic potential to explain things that might be outside of your skillset,” A/Prof. Gribble said – and that could be parents trying to help with homework they themselves don’t understand, students trying to optimise their learning processes or educators seeking resources to teach more effectively. AI tools can act as tutors, offering explanations and helping with complex subjects that might be beyond parents’ or students’ own expertise.

Risks and drawbacks

However, the convenience of generative AI comes with a caveat. If students rely solely on AI for answers, they risk missing out on developing critical thinking and research skills. And significant risks of data bias remain inherent in this developing technology.

“It’s not so good if we have a student who knows nothing and says, ‘I used a large language model instead of going to Google’,” A/Prof. Gribble said. “If you go to Google, you’re doing research and then you’re using that research to formulate your own opinion, because it doesn’t give you a narrative. ChatGPT or any Gen AI will give you a narrative.”

Moreover, generative AI is “designed to give you something that’s plausible; it’s never been designed to be truthful or accurate”, she added. It also tends to produce generic, less-nuanced content without “burstiness”, or the vibrancy and individuality of human writing, which is problematic when students must demonstrate deep understanding and unique perspectives.


“So, it is transforming homework in terms of parents or laypeople being able to help their young charges,” A/Prof. Gribble said. “The drawback is that if I don’t know what I don’t know, and I just accept what the AI says as the truth, the whole truth and nothing but the truth, then I may actually be learning the wrong thing.

“The greatest risk is, ‘I trusted the machine’,” she added. “Never trust the machine – it’s not sentient, it’s not all-knowing; it’s trapped in a moment of time. Even Google is: if there was an earthquake, it would take five minutes for Google to catch up.”

Equity concerns: the digital divide

Another essential consideration is that not all students have equal access to technology, so its elevation in education contexts can exacerbate existing inequity. “There are some people who don’t have access to the Internet and a computer at home,” as A/Prof. Gribble noted. “Education is bigger than what happens in a classroom, right?

An essential consideration with the use of AI is that not all students have equal access to technology, which could potentially exacerbate existing inequities.

“Equal access is problematic because the government has not provided free Internet access. If you come from a family that values education, it’s easier than if you come from one that doesn’t,” she added.

Relying on schools to provide access is also problematic: “You would find that schools in socioeconomically disadvantaged areas are struggling to put textbooks in the room, let alone computers and Internet access.”

This digital divide can disadvantage students from lower socioeconomic backgrounds. “Like other resources, equal access to AI tools will be a continual issue,” Prof. Lee said. “If equal access to educational resources is not guaranteed in Australian society, we can only expect that the existing gap of digital disparity will be widened with additional types of digital technologies.”

The role of educators and parents

For educators, a key challenge in the evolving technological landscape will be integrating AI in a way that complements rather than replaces traditional teaching methods. And, crucially, educators must focus on inspiring and engaging students in ways that AI cannot.


“What we need to make sure as educators is that we are the storytellers; that we are the people inviting people to see how knowledge, knowledge application and critical thinking – being able to unpack assumptions – makes the world a better place,” A/Prof. Gribble said.

This approach includes fostering critical thinking and understanding the broader implications of students’ learning. “Students need to recognise the different skill-sets” that will matter in an AI-assisted world, A/Prof. Gribble said, noting that while AI can assist in many tasks, it is not a substitute for human creativity and insight.

“Question. Fact-check. Where’s the human in the loop? Where are the morals and the ethics – is this what a good person would do or say? It’s virtue-based care ethics, coming from a space of, does somebody get harmed in this?” she said. “You can’t just say the machine’s responsible – we can’t pass off that responsibility. Humans must recognise their role in humanity and the humanness of it.”

For parents, understanding AI is crucial to effectively supporting their children. “Although it may seem obvious, parents should be aware of which AI tools their children are using, as well as how and for what purposes,” Prof. Lee said.

A/Prof. Gribble said parents should “stop, and then go. ‘Do I really understand AI?’ And if you don’t, learn it now, not next week; stop and take a basic AI 101 course; understand what AI is and how it works. And then you can support your children.”

The government can also play an important role in the use of AI by setting policies and establishing guardrails to protect children and the education system. Photo: Adobe Stock

The challenges of regulation

There is also a role for the government to set policies and establish guardrails to protect children and the education system, but this comes with its own challenges. With generative AI already changing education, policymakers are struggling to keep pace with the risks and opportunities it presents.

In addition, “government policies and regulation are inherently value-laden processes,” Prof. Lee said. “This means that public backing is necessary when labelling certain things as ‘bad for you’. Public backing should also be somewhat unanimous and rely on common sense. Since public use of AI is new and its full impact is not yet known, establishing AI-related policies may be challenging. For example, the notion of AI being inherently ‘bad’ is unlikely to gain the same public consensus as clearly harmful products like cigarettes.”

A/Prof. Gribble agreed that prohibition would not work, meaning the education system must work around these technological developments. “We need to instead come from an education, understanding, inclusivity, responsibility, sustainability perspective, and when we do that, the world changes,” she said. “Regulations and laws, along with policies, need to consider the intended and unintended consequences, and big business needs to step up and do what it can to ensure that the right things happen, not just that the law alone is followed.”


In Australia, the federal government recently released a report following an inquiry into the use of generative AI in the education system, making 25 recommendations for schools, government and other stakeholders to manage generative AI’s risks and opportunities. These include making the use of generative AI in education a national priority, creating safeguards and mandatory guardrails, taking steps to ensure equal access, regulating EdTech companies, and integrating AI literacy in school curricula.

An era of authenticity

In this evolving landscape, the focus should be on leveraging AI to enhance education while ensuring it does not undermine essential skills and critical thinking. By embracing both the benefits and challenges of generative AI, educators, parents, and students can navigate this new era of learning more effectively, ensuring that technology serves as a valuable complement to human ingenuity rather than a replacement.

“Graduates and students need to recognise the different skill sets, the same as with art or music,” A/Prof. Gribble said. “Can AI produce art and music? It can, but it’s not the same. We need to recognise there’s a place for both – it’s not an either/or; it’s now an ‘and’.”

She advocated for a shift to homework practices that encourage deeper engagement and application of knowledge rather than mere retrieval. “We need to move from asking students just to look something up or write something to get them to do something with what they know,” A/Prof. Gribble said. “I would hope that universities, too, will move on this more participatory way of thinking, ‘how do I demonstrate knowledge?’”


For instance, she has incorporated the reality of generative AI into how she assesses students, asking what a student from her class should be able to do with the knowledge obtained there. “We need to understand at the core of a discipline or practice what it is to be able to do that well.”

These shifting priorities mean the next 10 to 15 years will likely be an era in which authenticity is at a premium, A/Prof. Gribble added. That’s why there will always be space and a need for humans in the loop, and why parents and educators should continue fostering the human aspect of education. “It will be about having conversations, coming back to the dinner table and talking about, ‘What did you learn today?’”
