How we used to learn
There were acronyms I used to invent in school. They weren’t clever, but they helped ideas stick in my mind. A nonsensical phrase that made sense only to me was often enough to trigger recall when it mattered. Around me, other students filled up assessment books, working through pages like a ritual. Back in Singapore, where I grew up, completing the 10-year series wasn’t just about revision. It was a kind of quiet contract: students putting on a front, showing what serious learning looks like. There was pride in the repetition. This effort had a shape, and it was easy to recognise.
We rarely questioned what counted as learning. Memorising definitions, mimicking model answers, repeating strategies that seemed to work. All of it looked and felt like effort. But now, I look back and realise how much of that was about performing readiness rather than building understanding. We were always relying on tools and structures outside ourselves. Today’s tools just operate faster and feel less familiar.
Stretching knowledge with AI
AI entered my learning life as an experiment. While some people use it to search for new knowledge, I use it to challenge what I think I already know. I prompt it to test assumptions, to mirror back contradictions, to ask questions that hadn’t occurred to me. Sometimes it affirms a hunch. Other times, it destabilises a framework I thought was solid. I use AI to stretch my learning by shaking the knowledge I hold with certainty.
Most of this happens in private. Not because I’m hiding it intentionally, but because I don’t yet know how to talk about it clearly. I don’t have a replicable method. I only know it works for me. And maybe that’s why I hesitate to teach it as a process. It is neither neat nor scalable. It’s messy, iterative, and full of false starts. But it leads me somewhere real on my learning journey. I come out of those sessions thinking differently, or at least seeing more of what I didn’t see before.
Learning with AI also requires more from me. It pulls at my attention in unexpected ways. I feel myself learning, unlearning, and relearning in loops. It isn’t the gradual build-up of knowledge I was familiar with growing up. Learning with AI is a kind of cognitive agility training. Harder, yes. Richer, too. In many ways, this mirrors how I described the need for mental gyms, deliberate spaces to sharpen our minds in the age of intellectual outsourcing.
What students expect, and what we see
In the classroom, things get more complicated. Some students expect content to be delivered clearly, confidently, and without gaps. They see the lecturer as someone who should pass down knowledge with precision. So when I introduce AI as a partner in learning, or suggest strategies that might not work the same way for everyone, it doesn’t always land. Some students feel thrown, and the uncertainty in learning feels like disorganisation. To them, the freedom to discover personal ways of learning looks like confusion.
But I also see students who take AI into their own hands. They shape prompts, remix ideas, explore alternatives. They don’t wait to be told how to use the tool. They try things. And when they do, I feel a quiet sense of reassurance. These are learners who can navigate change. They don’t just absorb content; they engage with it. They question it. They build something from it. That’s what stays with me.
There’s pride in witnessing students who use AI beyond convenience. A kind of pride I wrote about before, when I tried to draw the line between caring and carrying student learning. That reflection helped me realise how easily emotional labour can slip into unsustainable responsibility when students don’t show up for themselves.
The cost of detachment
Of course, there are boundaries I still hold. If a student uses AI to generate work and then claims ignorance about its origins, something in that exchange breaks down. It’s not the tool that troubles me. It’s the disconnect between the work and the awareness of how it came to be. Engagement matters. The outcome alone doesn’t say enough. And yet, even these moments point to something deeper: the frameworks we use to define ownership and effort in learning are outdated. In a post-plagiarism world, the focus has to move past detecting dishonesty and towards cultivating integrity through awareness, care, and transparency. When a student fails to understand or acknowledge their use of AI, the issue probably isn’t the tool. It’s the disconnect between themselves, their learning, and the skills they demonstrate.
Learning has always been supported
I’ve written before about how slow scholarship is fading in a system obsessed with performance. That same system pressures students too. We assess them based on submissions we expect to capture applied knowledge, which does not necessarily reflect the process of their learning. And so students learn that an outcome evidencing applied knowledge matters more than the learning itself, or the attempt to apply it.
And learning thoroughly has always been scaffolded by people, texts, routines, and now, tools. AI simply makes those supports more visible. It accelerates and externalises what used to be tacit. And if we can name that honestly, without fear or shame, we might find better ways to support each other through this learning shift. The more we resist the image of “pure” learning, the more space we can create for real growth. After all, learning alongside AI is still learning. It invites effort, reflection, and adaptation. It asks us to be more deliberate. And while it may not follow traditional patterns, it offers its own kind of depth. A depth that’s worth exploring.
What’s something you’ve learned recently, not from AI, but alongside it?
