Three useful things for educators to know about ChatGPT
Students need to be able to distinguish between what AI tools like ChatGPT can produce and the critical thinking required for good-quality work
What role should AI tools like ChatGPT play in education? And how can educators use AI tools to develop essential human skills like critical thinking and problem-solving? From enhancing students’ learning experiences to creating tools that can detect AI-assisted cheating, the education sector is perhaps one of the first big industries to ask these questions in an honest, tangible way.
How educators should tackle AI-assisted cheating while ensuring students can access the latest technology to accelerate their learning was discussed during a recent virtual expert panel held by UNSW Sydney. The UNSW expert panellists included Sam Kirshner, Associate Professor in the School of Information Systems at UNSW Business School, and Cath Ellis, Professor in the School of the Arts and Media.
1. Is the risk worth the reward when it comes to ChatGPT?
What makes ChatGPT so impressive is its ability to explain complex information clearly and succinctly, even with nuanced and contested ideas. Unfortunately, while this is undoubtedly a pro of the tool, it also leads to one of its most significant risks.
“A lot of the things we ask students to do to demonstrate to us their learning, their knowledge, and their skills and their ability to apply that knowledge and skills, they can now ask a tool to do for them,” said Prof. Ellis. “Most of the time, it'll do a good enough or better job of it. And most of the time, it'll get it right. That's the simple fact of it.”
This means teachers may find it increasingly difficult to tell the difference between what has been written by AI and what has been written by a student, which she said may increase the risk of “false positives and false accusations of academic misconduct”.
At the same time, history is littered with examples of new tools entering the education space that, while carrying their own risks, have also accelerated learning.
“I think it goes beyond just the idea of assessment and cheating,” said A/Prof. Kirshner. “There's been a very slow and gradual progression of technology entering the education space. First, with the wide availability of information on the internet, including websites like Wikipedia, which have pretty much made the textbook market completely obsolete,” he said.
Now, with AI tools like ChatGPT, he said, not only is there less reason to buy expensive textbooks each term, but there are also increasingly fewer reasons to attend university if the prospective student's end goal is accumulating knowledge and information.
“Take the example of learning Python. Even over the last ten years, if I wanted to learn Python, I didn't need to go to UNSW, or another university, to do a computer science degree. I could go on YouTube, I could use LinkedIn learning or countless other ways of receiving a great education on that set of skills,” continued A/Prof. Kirshner.
“With COVID and moving to Zoom, our lecture attendance is now going from around 75 per cent, or 80 per cent, in week one down to about 10 per cent by the end of the course, mainly because there’s not much additional value in a lecture beyond what you can get by just watching the recording. This is the straw that’s going to break the camel’s back in terms of assessing and how we approach education more broadly; we need to critically reflect on what we are doing as a university and where we want to go.”
2. ChatGPT still requires discernment and human judgement
While ChatGPT is an impressive tool, it isn’t perfect. It can be convincing without being accurate. This means students need to be able to distinguish between good work and work that isn’t there yet, and they can only do this by having the skills to do the work themselves, from the ground up.
The ability to discern and think critically will be a crucial skill going forward, explained Prof. Ellis. “The media is full of instances of professionals reporting on how they’ve asked ChatGPT to do their job for them – including journalists getting it to write the introduction to a piece. But every time someone’s done that, they’ve been able to look at it and judge the quality: saying ‘it was good’ or ‘it was terrible’ or ‘it was okay but not great’,” she explained.
“But the thing is, all those individuals already had the evaluative judgement skills to know the difference between not yet good enough, good enough, and good. Before you can exercise that evaluative judgement, you have to have quite a good knowledge of what those differences look like in order to write good prompts that get it to do what you want.”
So there remains a gap between what the AI chatbot can produce and the critical thinking required to assess its responses. And this is perhaps where the value of higher education can shine.
3. Assessing students’ processes versus results
Even as new tools emerge that can detect whether an AI chatbot crafted a piece of writing, the ability to write well and succinctly still requires some level of human judgement. But there is also a flip side: this powerful tool is freely available, so why shouldn’t students use it?
One possible way to ensure AI tools like ChatGPT enhance education might be to focus more on the process by which students get answers, not just the results. This was an idea proposed by A/Prof. Kirshner, who said he was planning to use the chatbot in his classrooms. Educators, he said, will really have to ask themselves how they will assess learning and how they want to educate in the future.
“You can get this tremendous head start and then really refine and be persuasive and make good arguments,” he said. “Or, you could be like, we don't care about the end result; all we care about is your process… your reflections, and what you end up creating… whether it's a report or some type of artefact that doesn't matter.”
Should educators use AI tools to accelerate learning? Prof. Ellis said: “We're asking students to climb Mount Everest here. University is hard, and a degree in our discipline will be difficult. But do we need to know that the students can get to base camp under their own steam every time?
“Maybe we only need them to show us they can get to base camp by trekking once. And then, after that, they can get the helicopter to base camp,” she said.
Augmenting the experience of education and accelerating learning through tools like ChatGPT is something A/Prof. Kirshner plans to do in his own lessons, with the aim of meaningfully incorporating these tools into the curriculum to give students the best possible learning experience.
“If before the journey was climbing Mount Everest, maybe now we're going to the moon,” he said.
Cath Ellis is a Professor in the School of the Arts and Media, and Sam Kirshner is an Associate Professor in the School of Information Systems at UNSW Business School. For more information about the impact of AI across different sectors, visit the UNSW AI Institute, the flagship UNSW Research Institute in artificial intelligence, data science and machine learning.