AI in assessments: Will it revolutionize education or threaten academic integrity?

As exam season starts, universities are being encouraged to teach students how to use artificial intelligence (AI) tools to give them an edge in assessments. The University of Bath is weighing the challenges and opportunities that AI applications such as ChatGPT present. Although ChatGPT handles multiple-choice questions very well, it struggles with more complex questions that require critical thinking. It also repeats the exact phrasing of the question in its introductions and conclusions, and it makes up fake sources for academic work that could easily fool students.

One example, from a final-year assessment, reads: “Why is it important to understand the timing of exercise in relation to nutrition status in people with overweight?” (“people with overweight”, according to James, is the technical term.) And there are tell-tale signs that the answer ChatGPT gave was not written by a student.

The Quality Assurance Agency, which reviews standards at UK universities, has advised institutions to equip students with AI skills that will help them in the workplace. According to the agency, it is essential that universities explain to new and returning students in September how and when AI tools such as ChatGPT should be used. The tool is a new addition to students’ toolboxes and can help them generate ideas for assignments, particularly neurodivergent students and those for whom English is not their first language.

Marketing lecturer Kim Watts called ChatGPT “another tool in the toolbox” that can help get “students started on things”. However, she stressed that the tool should not replace critical thinking: submitting work produced by ChatGPT would not demonstrate any learning or critical thinking. Universities must adapt their courses, where appropriate, to teach students how and when to use AI tools.

Re-reported from a story originally published at https://www.bbc.com