A NEW DIGITAL DIVIDE
What once was a gap in access to technology may become a gap of human agency
A chasm is opening between those who direct AI and those who are directed by it. The result is a state of teaching and learning that pits big tech's vision of AI as an intermediary between everyone and everything against educators and students who are already struggling to know when they can assert self-determination and when to accept the convenience of AI assistance.
Stanford researchers surveyed workers across 104 occupations to reveal the truth about automation
Groundbreaking MIT research reveals the neurological price of cognitive convenience
"When it's so easy to get the answers we want, it might become harder to question those answers we get" - Eric Hawkinson
The Ethics of Automating Education - 2019
Let’s now transfer this idea to education, and specifically to AI-driven assessment. It may not surprise you to learn that the majority of venture investment and funding for AI in education comes from testing companies and from those who wish to disrupt them. Integrating AI into the assessment process is attractive to testing companies for many reasons.
TEDx and Augmented Reality - 2018
Eric, Martin, and Erin implemented their AR project at TEDxKyoto. Aiming to approach augmented reality on several fronts and link them together, they put together a series of activities never before seen at a TEDx event. The result was an engaging mix that drew a strong reaction from participants.
"These infrastructural changes are happening whether or not educational institutions participate in designing them. Students will have access to increasingly sophisticated agentic systems regardless of institutional policies."
"We shape our tools, and thereafter our tools shape us." This Marshall McLuhan insight frames an essential discussion I lead in my graduate course on educational technology in distance education—specifically the tension between technological determinism and the social construction of technology (SCOT). To help make these concepts relatable, I often share a personal example: the em dash. I used to overuse ellipses… not always correctly. But over time, I noticed something—AI-generated text often leans heavily on em dashes. After reading and working with language models regularly, I found myself doing the same.
This micro-example reveals the very relationship we explore in class: how tools influence behavior, and how user habits feed back into tool design. It's not simply that AI determines my punctuation, nor that I consciously decided to write more like machines. Instead, it's a co-evolution—efficiency pressures in AI training data shape my writing style, which then influences how I teach about writing, which affects how students write, which potentially feeds back into future training data. This seemingly trivial punctuation shift illustrates something much larger: the Automation Abyss opening between those who direct AI and those directed by it. When I unconsciously adopt em dashes, I'm being directed by algorithmic patterns embedded in my tools. When I recognize this pattern and analyze it critically—as I'm doing now—I'm maintaining agency over the technology.
This is the essence of maintaining human agency in an age of agentic AI: recognizing that we shape our tools and our tools shape us, and taking conscious responsibility for that relationship. The stakes are higher than punctuation—they're about preserving what makes us essentially human learners and teachers in an increasingly automated world.
One punctuation mark at a time.
Eric is a learning futurist, tinkering with and designing technologies that may better inform the future of teaching and learning. Eric's projects have included augmented tourism rallies, AR community art exhibitions, mixed reality escape rooms, and other experiments in immersive technology.
With over two decades of experience teaching with technology, Eric has witnessed firsthand the evolution of technology from a supportive tool to a substitute for essential learning processes. A pioneer in augmented reality education, he is the creator of Reality Labo, My Hometown Project, and Corefol.io, platforms that help students take ownership of their learning journeys.
Roles
Professor - Kyoto University of Foreign Studies
Research Coordinator - MAVR Research Group
Founder - Together Learning
Developer - Reality Labo / Corefol.io / My Hometown
Chair - World Immersive Learning Labs
A series of thought experiments that pit possible outcomes against one another and analyze their common characteristics.
Scenario 1: The Automation Dependency Future
Students become passive consumers of AI-generated content, losing the capacity for independent thought and creative problem-solving.
Scenario 2: The Resistance Backlash Future
Educational institutions ban AI entirely, leaving students unprepared for an AI-saturated world and increasingly irrelevant educational experiences.
Scenario 3: The Human-AI Co-learning Future
Thoughtfully designed educational experiences that preserve human agency while leveraging AI capabilities for enhanced learning.
Scenario 4: The Educational Obsolescence Future
AI systems provide more efficient educational services than traditional institutions, leading to the complete replacement of human-centered education.
"Maintaining meaningful human agency when artificial intelligence can autonomously plan, discover, and even fail on behalf of learners represents one of the most critical educational imperatives of our time."