New AI guidance for schools
AI toolkits for schools, students' perceptions of AI, AI literacy in middle schools, social media time limits, decarbonising websites, the power of play, Pac-Man and much more...
What’s happening
This week the UK government published its support materials for using AI in education settings, which Michelle worked on with the Chiltern Learning Trust, led by Christian Turton. The resources are aimed at all educators from early years to further education, with an additional toolkit for school leaders, and they focus on the safe and effective use of AI. There are video presentations, all with full transcripts, plus summaries, activities and a workbook for each module. The toolkits are suitable for use with groups, such as in-school CPD sessions, or can be worked through solo.
We’ve placed a heavy emphasis throughout the resources on the need to always have a human in the loop when using AI in education settings. While AI can save time (seriously, who really wants or needs to write another ‘there have been cases of headlice in the school’ letter from scratch?), it always needs human oversight, and there are situations and contexts where it should not be used at all. We cover the basics of what generative AI, machine learning and LLMs are and do, and discuss the risks of anthropomorphising AI tools. We also look at the essential areas educators need to be aware of when using generative AI, from data protection to copyright, online safety to critical thinking.
Despite the BBC news story headline, we don’t suggest using AI for marking, but in one of the modules we offer a whole raft of use cases where AI might play a helpful role, such as adapting resources for learners with special educational needs. For example, text-to-text large language models can quickly break a complex task down into chunks, listing or bullet-pointing each step, to make it more accessible for some learners.
Scott Hayden, head of teaching, learning and digital at Basingstoke College of Technology, offers a great example of how an automotive teacher uses an LLM to make his lessons more accessible to a student who has caring responsibilities:
“He is able to generate podcast overviews of his slides, his handouts and the videos he creates in the workshop. And one learner in particular, who can't always be present in the lessons, because he's a young carer and has a part-time job, is able to listen to those podcast overviews when he is travelling to his part-time job or perhaps on the bus home from college. So the idea of adapting and personalising resources is particularly interesting.”
There are more practical examples throughout the resources, along with some great mini-interviews with leading thinkers and practitioners in the field, such as Rose Luckin, Miles Berry, Scott Hayden, Jane Waite, Matthew Wemyss and Chris Loveday.
The resources are available on the DfE website, as well as being sent to all schools, and there’s more coverage in Schools Week - AI guidance for schools: 9 key findings for leaders - and TES - DfE sets out how schools can ‘safely’ use AI.
AI roundup
National AI skills drive
As well as our DfE AI resources, London Tech Week saw a slew of other government announcements relating to AI and education. The Prime Minister launched a national skills drive, which included “TechYouth”, a three-year, £24 million programme of government funding to give one million students across every secondary school in the UK the chance to learn about technology and gain access to new skills training and career opportunities. There will also be an online platform to educate students about the potential of computing and tech careers, along with investment in a national skills programme to bring digital skills and AI learning into classrooms and communities. Meanwhile, the science, innovation and technology secretary, Peter Kyle, pointed to the benefits of AI for dyslexic students. He himself is dyslexic, has found AI to be a helpful tool and argues it could ‘level up’ education for dyslexic children.
According to the report, Kay Carter, the chief executive of the Dyslexia Association, said AI was already levelling the playing field for dyslexic pupils. If AI could manage tasks such as memorising facts and rapid recall of information, “the focus [of education] may shift to problem-solving [and] critical thinking, talents which some of those with dyslexia naturally excel at,” she said. She cautioned that AI was not a replacement for good teaching but “allows dyslexic students better access to their own learning”.
AI and the Future of Education webinar recording
Last week we highlighted the AI and the Future of Education: Disruptions, Dilemmas and Directions roundtable and its international group of speakers – Bryan Alexander, Helen Beetham, Laura Hilliger, Ian O'Byrne, Karen Louise Smith and Doug Belshaw. They had each responded to UNESCO’s call for critical reflections that consider the longer-term educational implications of AI, and all their individual contributions are available here. The recording from the webinar is now available.
Students' perceptions of AI in 2025
Thanks to Michelle Cannon, lecturer in digital arts and media in education at UCL, for sharing this insightful Jisc report on students' perceptions of AI in 2025: how students view artificial intelligence, how they are using it (as part of everyday life), and their concerns and hopes. The ‘emerging issues’ are particularly interesting:
Some students noted that over-reliance on AI was starting to have a negative effect on the quality of their work, causing them to re-evaluate its use.
Many students are starting to report a feeling of anxiety about the speed of AI developments.
Students are starting to be concerned that their data could be used to predict their behaviours on a larger scale in future.
As deepfakes become ever more realistic, student concern about their impact on the world is growing.
Some students are starting to report that they are using AI applications for relationship and mental health advice and support.
Show me how you think
The first of Jisc’s emerging issues is explored in more depth in a post by Viktoria Mileva. She looks at how AI tools make it easy for students to outsource tasks that once required deep engagement, such as researching, analysing and synthesising information. This risks students increasingly relying on AI to do the thinking for them, reducing their own cognitive effort and engagement with complex problems. She proposes a move from knowledge-recall assessments to tasks that require students to show their thinking process and decision-making when using AI, for example through annotated prompts, reflection journals and oral defences. Mileva signposts these AI literacy frameworks:
Rachel Horst’s dimensions of AI literacy, which sees AI as part of an entangled landscape of learning, technology and society, and David Winter’s framework, which shows how AI can support your thinking, not just replace it.
Perspectives on AI literacy in middle school classrooms
Our colleague Katarina Sperling is one of the authors of a newly published review of AI Literacy in Middle School Classrooms: exploring international approaches to AI literacy for 9- to 13-year-olds. The paper shows that AI literacy is emerging “as a new form of civic knowledge” and, like digital literacy in general, AI education shouldn’t place the responsibility for this complex technological and societal system on individual students and teachers – it should be a collective responsibility.
Quick links
Schools and colleges are invited to apply to participate in a DfE EdTech Impact Testbed Pilot to evaluate innovative educational technologies. The initiative aims to assess how new tools can enhance teaching and learning in schools and colleges across England.
The government considers social media time limits for children.
Spending review 2025: What’s in it for schools? Schools Week has a summary.
We’ve always liked Scott Stonham’s work on digital carbon footprints for Jisc. Here’s a recording of a lunch and learn event he held – a journey towards a low-carbon website: ‘decarbonising one website at a time’.
‘Learn to code’ schemes backfire as computer science graduates face unemployment, says this article in Futurism (thanks to Ben Williamson for sharing).
Everything to play for is a new report by the Play Commission advocating for a rights-based approach to ensure children can play. Pair with this story of an Oxford primary school bringing play back into the classroom and How school became more sedentary - and why it matters (more on play next week!).
When children and young people's reading for enjoyment is at an all-time low, how can we support a love of reading? A new report from the National Literacy Trust proposes solutions.
We’re reading, listening…
Hidden spaces and dangerous places
In her new book Logging Off: The Human Cost of Our Digital World, Adele Zeynep Walton says it's time to reclaim online spaces. On Radio 4’s Start the Week she explores how the price of the connections and conveniences of online life has been the mental health of a generation. She says that social media platforms and digital technology are making us vulnerable, and it is time these spaces were governed and regulated.
Against technofeudal education
Matt Seybold warns of the risks posed by edtech business models and ambitions in the education sector and highlights the alignment of many companies with reactionary political forces. He calls on educators to take collective and individual action to protect educational spaces from technofeudal capture, emphasising solidarity, awareness and practical resistance to the encroachment of surveillance-driven, profit-oriented edtech.
Give it a try
The history of Pac-Man
Need cheering up? Take a look back at Pac-Man’s long history (thanks to Cliff Manning’s More Than Robots newsletter).
Connected Learning is by Sarah Horrocks and Michelle Pauli