What do Scottish teachers really think about using artificial intelligence in schools?

[Image caption: The use of AI is being considered by Scottish education authorities.]

Nearly half of Scottish educators fear artificial intelligence (AI) will undermine the integrity of pupil assessments, a new survey has found.

The study of more than 500 teachers, lecturers and school leaders revealed mixed feelings about the use of the emerging technology, with 57 per cent excited about the potential for a positive impact on education, but 61 per cent expressing concerns.

Just over half of those surveyed believed the current stance of the Scottish Qualifications Authority (SQA) was appropriate for now, while a significant majority believed it would not be suitable in the long term.

Concerns have been growing about the opportunities for plagiarism that generative AI tools, such as ChatGPT or Google Bard, create for pupils and students. However, some staff have been embracing the tools, and the technology is being encouraged by local authorities and schools in some areas of Scotland.

The SQA’s current position is that learners must not submit AI outputs as their own work, and they cannot reference AI as a source. It has now been announced that this policy will continue for the next school year.

A survey was carried out by the SQA in November and December, with 519 respondents taking part.

These included 118 lecturers and 151 teachers, with 49 identifying themselves as principal teachers, 29 as heads of department or faculty heads, 23 as curriculum managers, 20 as assessors or verifiers and 19 as senior leaders.

Almost a quarter said they already used AI regularly at work, for the likes of lesson planning, designing materials and administrative tasks.

Asked to select the most urgent issues that needed to be addressed for generative AI to be suitable for widespread use in Scottish education, the top three choices were the ability to detect AI-produced content, a personal lack of understanding or knowledge, and a lack of guidance from the education system.

One of those surveyed said AI was “great for coming up with lesson ideas, quizzes, homework exercises and cuts down hugely on workload”.

Another said: “I have used it to proofread, find typos and fix them, change the localisation of the language, translation for parents who have English as an additional language.

“I have used it to make language more concise, to differentiate for pupils, generation of quizzes, production of flashcards. I have used it to write articles for the school website and for Twitter. I have also experimented with prompts for report writing, although have never used the output in a report.”

Others reported using the technology to assist with grading and feedback, writing references and to detect plagiarism.

While 79 per cent of respondents felt learners should be prepared for a future where AI skills are valued, 18 per cent believed they should be actively discouraged from using AI tools, and 49 per cent agreed that use of AI by learners would undermine assessment.

“I have been using AI for a while now and it is without a doubt a powerful tool with the capability to transform education for the better,” one respondent wrote.

“That said, its implementation within the classroom should be given due care and consideration as there is also a real risk that developmental harm could be done, such as stunting the development of core metaskills.”

A teacher said: “I am concerned that teachers use AI to set tasks, pupils use it to generate answers and nobody has actually taught or learned anything. As an English teacher, I am concerned about assessment of writing and how we verify what is a pupil’s own work.”

Another asked: “How will the SQA monitor this going forward? The situation will potentially become impossible to police, leading to wholesale abuse of exam/assessment arrangements and therefore devaluing any qualification associated with the process.”

It was suggested that qualifications may need to be adapted to ensure their continued integrity, while many called for new training, guidance and policies to support teachers and lecturers.

The research has emerged as the Scottish Government considers whether to reduce the number of exams sat by pupils, as was recommended in the Hayward report, with a greater emphasis on coursework.

One survey respondent said: “I think the SQA need to act quicker on this. The integrity of qualifications is at risk if students are able to write essays and submit them as part of a folio for external SQA assessment.

“The guidance alone will not stop students using AI. What else can be done to address this? The only way I can see is to remove written folios and move to exams.”

SQA senior researcher Jamie Lawson said: “The findings are in line with other similar pieces of research that have been conducted elsewhere, this sense that overall practitioners are taking a position that AI is here, that it is going to change the game in some way, and that we need to be prepared to handle that.

“And, you see it in our research and the other similar pieces of research, the division between practitioners who for the most part are very positive about it and the impact it can have on their work, and some, a smaller number, but a reasonable amount of practitioners, who are very cautious about it, and are slightly resistant about incorporating it into education.”

Martyn Ware, director of policy, analysis and standards at SQA, said: “Our stance continues to reflect similar positions from assessment specialists and other awarding bodies from across the UK.

“Going forward, however, we will continue to monitor developments in GenAI over the coming year, and tailor our approach to the unique needs of our awarding principles.

“We will work with our partners across Scottish education, supporting teachers and lecturers where we can, listening to their views and sharing their insights, so we can work effectively together to protect the credibility of assessments and qualifications.”