AI: 'more human interactions the prize'
AI principles, AI as exam tutor, Pac-Man clones, Roblox danger, heads on social media, media literacy, the murky world of Facebook, and much more...
What’s happening
We’ll move quickly into the news, links and resources this week, after super-long newsletters over the last couple of weeks featuring insights from Emma Darcy and our report from UCL’s digital futures and inequalities conference (which has now been published as a blog post by UCL).
But first, a very quick roundup of some UK government activity around AI this week. DfE minister Stephen Morgan let the TES know the “latest government thinking about AI in education”, namely:
“a shared understanding that technology in education should be focused on automating tasks, with more human interactions the prize.”
He details the various projects on the go. These include the AI Product Safety Expectations in Education framework, bespoke AI tools for education, and the teacher and leader toolkits focusing on safety and effectiveness that Michelle is working on. He emphasises the need to ensure that all schools have equal access to avoid increasing the digital divide.
Meanwhile, UK PM Keir Starmer has been setting out his plans for the civil service to use AI more extensively, with government officials to be told that:
“No person’s substantive time should be spent on a task where digital or AI can do it better, quicker and to the same high quality and standard.”
Last month the government published its AI Playbook for UK Government, and we’re interested in its 10 core principles for AI use in government and public sector organisations. Not all will apply equally strongly to education but they seemed to us to be a useful starting point for heads and other education leaders looking to develop AI policies:
Principle 1: You know what AI is and what its limitations are
Principle 2: You use AI lawfully, ethically and responsibly
Principle 3: You know how to use AI securely
Principle 4: You have meaningful human control at the right stage
Principle 5: You understand how to manage the AI life cycle
Principle 6: You use the right tool for the job
Principle 7: You are open and collaborative
Principle 8: You work with commercial colleagues from the start
Principle 9: You have the skills and expertise needed to implement and use AI
Principle 10: You use these principles alongside your organisation’s policies and have the right assurance in place
Finally, a Freedom of Information request has revealed what Peter Kyle, the science and technology secretary, has been asking ChatGPT.
AI roundup
AI as exam tutor
In this LinkedIn post, Darren Coxon looks at how Google AI Studio can talk through an A Level Economics paper, showing how to answer each question. It’s a more user-friendly alternative to simply uploading PDFs to an AI model, and you can chat to the AI system in real time while sharing your screen.
Darren says, “It's this user-friendliness, coupled with the AI's engaging, supportive tone, that makes it such a threat to the entire exam prep industry. Students can chat through anything they need support on, and Google AI Studio doesn't even need to be explicitly taught to take on the role of a tutor - it does it by default. It's still a little slow, doesn't always read PDFs perfectly (enlarging them on screen helps), and is for over-18s only right now. But this will all change (and let's face it, under-18s are already using these tools as they've seen them demo'd on TikTok).”
Infinite monkeys
OpenAI says it has trained an AI that’s ‘really good’ at creative writing and novelist Jeanette Winterson has weighed in with a take that has surprised many. Responding to Sam Altman's example of AI-generated fiction, a short story prompted with 'Short Story. Metafiction. Grief', she writes:
“What is beautiful and moving about this story is its understanding of its lack of understanding. Its reflection on its limits… Humans will always want to read what other humans have to say, but like it or not, humans will be living around non-biological entities. Alternative ways of seeing. And perhaps being. We need to understand this as more than tech. AI is trained on our data. Humans are trained on data too – your family, friends, education, environment, what you read, or watch. It’s all data. AI reads us. Now it’s time for us to read AI.”
New job anyone?
The Raspberry Pi Computing Education Research Centre is looking for research associates to investigate how young people learn about AI, how the subject can be introduced at school level, and what the core concepts, pedagogical strategies and potential tools might be.
Cloning – or coding – Pac-Man
In the week that Anthropic’s CEO Dario Amodei said AI will be writing 90% of code within six months, and could automate software development entirely within a year, the Guardian asked seven people to build a clone of the 1980 arcade bestseller Pac-Man using Grok. The article examines the hype surrounding AI’s ability to create games, contrasting it with the often flawed and rudimentary reality. Gary Marcus writes in his newsletter this week:
“The idea that coders (and more generally, software architects) are on their way out is absurd. AI will be a tool to help people write code, just as spell-checkers are a tool to help authors write articles and novels, but AI will not soon replace people who understand how to conceive of, write and debug code.”
Quick links
“This isn’t a hidden underbelly requiring deep investigation; it’s the routine experience awaiting children on hangout spaces like this on Roblox every day.”
What happens when adults and children share a virtual playground? Roblox is exactly that – but one with minimal safeguards, where adults and children can talk to each other without supervision. Revealing Reality set up a 42-year-old male avatar to explore a ‘Voice Chat Hangout’ on Roblox, accessible to users aged 13 and older. The quote above is what the researchers saw, often within just a few minutes of entering the hangout. Roblox’s CEO David Baszucki, interviewed on the BBC Today programme yesterday, said that if parents were worried, they should not let their children use the platform.
Heads concerned: it’s the Association of School and College Leaders (ASCL) conference today, and a survey they’ve released finds that nearly three-quarters of secondary school teachers and half of primary school teachers have reported online bullying between students through social media. One in 10 secondary teachers said they knew of artificial intelligence-generated deepfake images or audio being used maliciously against students or staff. Manny Botwe, ASCL president, will call for social media firms to be “brought to heel”. Meanwhile…
Gutting of social media ban bill: a Labour private member’s bill to ban social media for under-16s has been watered down to get government support. The BBC reports that Data Protection and Telecoms Minister Chris Bryant said his department had asked the University of Cambridge to run a feasibility study into the impact of smartphones and social media – and that it would work to "roughly the same timetable" as the bill was calling for.
Meeting digital and technology standards in schools and colleges: DfE has updated its guidance on how schools and colleges can meet IT service and digital equipment standards.
Media literacy 10 years on: media education organisation IAME is hosting a webinar on media literacy and information disorder next Tuesday (18 March) with Swedish academics and experts on information literacy Jutta Haider and Olof Sundin. More details, plus a previous webinar on mobile phones in schools.
A new education reality: the European EdTech Alliance’s newsletter offers some insights into the changing global political landscape, including international tensions and concerning shifts in US education policy, such as the closure of the Office of Educational Technology (the EEA team has saved and stored US resources). It details the resilience of Ukraine’s education system, with remote learning and shelter schools. The newsletter also covers the continuing discussion on screen time in schools, with varying national policies.
We’re reading, listening…
School swap UK to USA
Sarah's not usually a fan of reality TV shows, but having worked in and with Lambeth schools and children for many decades, and as a former trustee at the brilliant Elmgreen School, she was drawn to this well-reviewed Channel 4 programme about students from Elmgreen in south London swapping places with teenagers from Mena, Arkansas in the US. The programme doesn't shy away from issues of racism, prejudice and disadvantage, and the young people and teachers are impressive. Next week's episode addresses social media use in the respective school communities.
Inside the murky world of Facebook
Sarah Wynn-Williams, a senior Facebook executive for most of the last decade, talks to Emily Maitlis on The Newsagents podcast about her ‘explosive’ new memoir Careless People, about how the leaders of Meta shape our world. Her allegations include Facebook’s direct impact on elections, its decision-making on content moderation, its targeting of vulnerable children, and its partnership with the Chinese government on surveillance.
Give it a try
Shore Space
Thanks to Cliff Manning for sharing Shore Space, a safe online space for teenagers worried about sexual behaviour.
Connected Learning is by Sarah Horrocks and Michelle Pauli