Universities at a crossroads

Universities should be having an active conversation with their students and staff about how to capitalise on the benefits and minimise the drawbacks of using AI tools, such as ChatGPT, according to a new paper from the University of Surrey.
The study recommends that institutions address unethical behaviours arising from the use of AI tools, such as cheating or becoming overly reliant on technology, through clear regulations around the use of AI. It also suggests that investment in both staff and student training can help universities take advantage of these technological breakthroughs.
Professor Sorin Krammer, author of the paper from the University of Surrey's Business School, said:
"The rise of large language models (LLMs) leaves higher education facing two choices – resisting AI or embracing the technology and adapting to it. I am afraid that it is already clear that one of these choices is unsustainable, as the digital genie has already been out of the bottle since early 2023.”
"Young people all over the world are embracing this technology, which makes universities duty-bound to develop strategies, policies and educational environments that make the use of this technology both safe and enjoyable, without hindering academic integrity or personal growth."
Other recommendations from Professor Krammer's paper include:
Development of formal academic policies: Universities should establish new overarching policies regarding the use (or prohibition) of AI in education. This may include planning for infrastructure, curriculum development, and staff training to integrate AI tools effectively.
Maintaining strong ethical standards while allowing for diversity in employing AI: Penalties for AI-related academic misconduct must be strongly enforced. Combining traditional and new assessment methods may be a way forward to reduce AI-enabled cheating, while smaller classes can explore greater customisation of assignments.
Collecting scientific evidence: More empirical research is needed to better understand and quantify the benefits and pitfalls of AI technologies in education. Such research should focus on finding the appropriate role for AI in educational delivery, complementing the strengths of human instructors.
The study, Is there a glitch in the matrix? Artificial intelligence and management education, has been published in Management Learning.
The University of Surrey is also examining the potential of AI-driven marking tools to enhance student learning outcomes and free up academic time for teaching and research, and has developed and launched an innovative tool called KEATH.ai (Key Evaluation and Assessment Tools Hub). The tool quickly grades student work and generates feedback, giving academics more time to provide value-adding feedback, interact with students, and pursue research. AlphaGalileo/SP