AI, Stories, and the Distributed Mind: Rethinking Language Learning
Michael Rabbidge
When I first started teaching, learning looked a lot like a solo sprint. Students read the textbook, completed the exercises, and maybe asked a question or two. But today, learning feels more like a relay—ideas passed between people, platforms, and tools. If you’re using Chasing Time English, you’re already in that space: storytelling, collaboration, and digital tools working together to bring language learning to life. Add AI to the mix, and you’re stepping straight into what educational theorists call distributed learning.
At its heart, distributed learning challenges the old-school idea that learning happens only inside a student’s head. Instead, it sees thinking and learning as something that’s shared—across people, tools, time, and environments. A student doesn’t just learn from a teacher or a worksheet—they learn from their classmates, their devices, the videos they watch, and the tools they interact with. Think Google Docs, AI chatbots, subtitles, episode recaps, or peer feedback. All of these form part of a broader learning network.
Chasing Time English is built for this. Its story-driven episodes, complete with authentic dialogue and engaging characters, already serve as context-rich anchors for language learning. When learners engage with the story—predicting plot twists, rephrasing dialogue, or debating character motives—they’re not just thinking linguistically. They’re also learning with their whole selves—drawing on memory, emotion, movement, and social connection. This kind of active, embodied engagement makes language stick.
Now imagine pairing this with AI.
After watching an episode, students can use AI to summarise the plot, explore new vocabulary, or generate questions to spark discussion.
They might analyse a character’s speech, then use a chatbot to rephrase lines using different grammar structures.
They could ask AI to write an alternative scene, then compare it to the original and evaluate tone, voice, and register.
Or they could record their own summaries, feed them into speech-to-text tools, and reflect on pronunciation and accuracy.
In each of these moments, the student, the story, the AI, and the task are all working together. This is learning enacted through action—language comes alive when it’s used to do something meaningful, not just remembered for a test. It’s not about replacing thinking but expanding where and how it happens.
One way to use CTE is to set a group task after an episode—something like rewriting the ending or scripting a sequel. With AI, students can brainstorm ideas, check grammar in real time, or even explore how a phrase might be interpreted differently in another context. The AI becomes a kind of thinking partner—an extension of the learner’s mind that helps them test out ideas, rework sentences, and reflect on choices.
This is especially powerful for multilingual learners who may need more time to process or more scaffolding to participate fully. The key isn’t to avoid AI—it’s to help students use it meaningfully, as part of a supportive learning ecosystem. Every sentence they type, rework, or speak aloud adds to their internal map of how English works. It’s language learned by doing, not just by studying.
So, where do educators fit into this AI-enhanced, story-rich, tool-saturated world?
We design the environment. We guide students not just in what to learn, but in how to move between modes—talking, reading, watching, creating—and how to connect their thoughts across tools, peers, and platforms. With CTE, we can build lessons that flow from video to group discussion to chatbot dialogue to reflective writing—where every step reinforces the language, not just rehearses it.
And just as CTE stories invite learners to imagine new futures, we can help students imagine themselves as part of a bigger learning network—curious, connected, and confident.
In this new learning landscape, the smartest classroom might not be the one with the most content, but the one that’s most connected. With Chasing Time English and AI tools working side by side, we’re not just teaching English—we’re helping learners thrive in a world where language is social, embodied, and always in motion.
Note for Educators:
This blog draws inspiration from Nick C. Ellis’s (2019) work on language cognition as a distributed, usage-based, and socially embedded process. His account draws on the broader framework often referred to as “4E cognition,” which holds that learning is embodied (rooted in sensory and motor experience), embedded (shaped by environments and tools), enacted (emerging through action), and extended (distributed across people and technologies). These ideas support the use of multimedia storytelling and AI as meaningful components in language learning environments. For more, see: Ellis, N. C. (2019). Essentials of a theory of language cognition. The Modern Language Journal, 103(S1), 39–60.