Get the full story
Understand what we’re risking and what we’re gaining when AI is a cornerstone of our classrooms.
With almost 80% of Australian students using generative AI tools like ChatGPT and Microsoft Copilot, the role of AI in education is no longer a question for the future – it’s a present-day reality.1
To unpack the effects of AI on children’s education, Alex Jenkins joined The Future Of for a wide-ranging conversation on how AI could reshape learning over the next decade – and how students and educators can respond to it right now.
Alex Jenkins is a leading voice on AI adoption in education in Western Australia. He also advises universities, government and research institutions on how AI can be used responsibly at scale.
Below are highlights from the discussion. You can listen to the full episode, The Future Of AI in Education, on Apple Podcasts, Spotify and more.
Classrooms are built around shared timelines, but students don’t all learn at the same pace. Alex explains how AI could change that.
“It’s helpful to start by thinking about how a normal classroom works. Say we’re studying maths – we go into class, learn about fractions for three weeks, and then we move on to number properties.
“But the reality is that students don’t learn at the same rate – especially in areas like mathematics – yet they’re expected to follow the same learning schedule.
“What we see is that students don’t gain complete mastery over topics, and over time, they lose confidence in learning new things. A lot of students end up saying, ‘I can’t do maths’ and they internalise that.
“This is almost a universal experience, and it’s a failure of our education systems.
“On the other hand, we know that when students are tutored one-on-one, their performance improves significantly.
“Before AI, we never had the capacity or resources to give every child a tutor. But now, we have technology that can understand each student’s strengths and how they learn, and support mastery of each topic before moving on.”
“I hope that we can deploy generative AI like this at scale. By 2035, teaching could be supported by technology that acts as a personal tutor, guiding students through their educational journey.”
If AI tutors are deployed at scale in our classrooms, how will they be regulated, who will own them, and what happens to the data they collect? Find out in the full episode.

Traditional university assessments assume students work, think and write independently – but generative AI has changed that almost overnight.
“There was a time when I could ask my students to write an essay on the history of World War II over two weeks and then mark that essay. Now we live in a world where an AI can generate a high-quality essay in 30 seconds.
“Universities around the world – Curtin included – began using tools like Turnitin to detect AI-generated content. But these tools are never 100% reliable – you end up with students accused of cheating when they haven’t, and students who have cheated slipping through the net.
“The proliferation of AI platforms requires a proper rethinking of how we assess learning – because the technology isn’t going away – and students will be using it in the workforce.”
“Curtin is running a program called Assessment 2030, which asks: what do we want students to leave university with, and how do we assess those skills meaningfully?
“I’d love to see exams that look very different. We’re already seeing things like the return of oral examinations. But imagine coming into an exam, sitting down at a university-provided computer, and being examined with AI as part of the process.”
So, which skills still matter in an AI-enabled world, and how are universities redesigning assessment to reflect that? Discover how learning and assessment are changing in the full episode.
AI is already part of how many students study – the challenge is learning how to use it without it doing the thinking for you.
“I have two top tips for students who use AI.
“My first piece of advice is to learn how to use AI in a way that supports your thinking, rather than replaces it.
“If you’re working on a problem, say to the AI: I need to understand how to do this. Don’t give me the answer – guide me towards the correct method.”
“Essentially, treat AI more like a tutor than an answer machine. That relationship matters, and it’s as simple as telling the AI how you want it to behave. It will respond differently – and respect those boundaries.
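In practice, this tip amounts to setting a behaviour rule before asking your question. As a minimal sketch of what that looks like – using the common chat-message format shared by most AI APIs, with no real model called and the wording of the prompts purely illustrative:

```python
# Sketch of the "tutor, not answer machine" tip: a system-level
# instruction tells the AI how to behave before the question is asked.
# This only builds the message list in the widely used chat format;
# wiring it to an actual AI service is left out.

def tutor_messages(problem: str) -> list[dict]:
    """Build a chat history that asks the AI to guide, not answer."""
    behaviour_rule = (
        "You are a tutor. Do not give me the final answer. "
        "Guide me towards the correct method with hints and questions, "
        "and provide links to sources for any factual claims."
    )
    return [
        {"role": "system", "content": behaviour_rule},
        {"role": "user", "content": f"I need to understand how to do this: {problem}"},
    ]

msgs = tutor_messages("add the fractions 1/3 and 1/4")
print(msgs[0]["role"])  # prints "system"
```

The same instruction can simply be typed at the start of a conversation in ChatGPT or Copilot – the point is to state the boundary up front, not the code itself.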
“I’d also strongly advise students always to ask the AI for links, for sources, and for where its claims are coming from – then dig into the primary source yourself.
“These systems still hallucinate – they make things up, just like people do. And particularly in an academic setting, that’s not acceptable.
“Verifying sources is an important skill in education and learning. Fact-checking helps ensure AI is not misleading you.”
For educators, using AI well starts with curiosity – and a willingness to test its strengths and limitations in real classroom contexts.
“It really comes down to experimenting with AI. See what it does well and what it doesn’t do well – and don’t be afraid of it.
“There are all kinds of classroom activities you can do – like asking the same question to AI models from different regions, say one from the US and one from China, and comparing how they respond.
“Educators can also use AI to support more inclusive and accessible learning experiences.
“Say you have a lesson plan, but you also have a student with additional needs – maybe someone on the autism spectrum with a special interest. You can take that lesson plan and make it about trains, or soccer players, or whatever their interests are.
“The Autism Academy recently won a large grant from the Department of Communities to help build generative AI activities for neurodiverse students. I was a partner in that project, and it’s fantastic to see educators really benefitting from AI in practical ways.
“It’s also important that educators treat AI output as if it’s coming from an untrusted individual. If it makes a claim you’re unsure about, always ask where it got that information from.”
1 Australian Digital Inclusion Index (2025) Case study: The AI Divide in Australia, Australian Digital Inclusion Index, accessed 19 Jan 2026.