EdTech Measures Everything. Except What Makes Us Human.
It's time to measure what truly matters.
The Illusion of Personalization
I remember the first time I watched a student navigate an “adaptive learning” platform. The screen lit up with encouraging animations. A cheerful voice praised every click. The pathway adjusted in real time based on performance.
It looked like personalization. It felt like agency.
But then I asked the student to solve the same type of problem with paper and pencil. She couldn’t do it.
The tool hadn’t taught her to think. It had taught her to click. I realized we’d spent thousands of dollars and hundreds of hours training students to be good at something that doesn’t transfer to real life. We called it personalized learning. But nothing about what makes each student uniquely human was ever seen.
We’d personalized the delivery. We’d standardized the child.
The Quiet Scandal
We’re teaching kids to be excellent at tasks AI will do for free.
And we keep doing it. Year after year. Tool after tool.
This week on Unscripted Intelligence, we had a conversation with Dr. Shannon Terry and Dr. Kecia Ray, two women who’ve spent decades inside the machinery of educational technology. When asked what she would eliminate from edtech if human intelligence were truly at the center, Shannon said something that’s stayed with me long after we finished recording:
“Right now, when I’m engaged with technology, I’m a consumer. And I would love to be the author of my learning.”
Isn’t that the whole problem?
We’ve built tools that position learners as consumers. We’ve designed systems that require students to answer, not ask. To follow adaptive pathways that feel personalized but are really just efficiency at scale. To click through content instead of creating meaning from it.
Kecia reminded us that edtech didn’t start this way. It started with intention—teachers needing tools that could genuinely help them serve students better. But somewhere along the way, the system’s hunger for accountability, standardization, and data turned those tools into surveillance.
Every “personalized learning” platform I’ve seen personalizes the path but standardizes the destination.
And here’s the part that keeps me up at night… we’re about to do it all over again with AI-driven tools.
The Human Counterforce
Shannon introduced a concept during our conversation that pushed my thinking: the idea of the human counterforce.
For every function AI performs—prediction, pattern recognition, data ingestion—there must be a corresponding human capacity we’re intentionally cultivating.
AI predicts. Humans project with wisdom and foresight.
AI ingests data. Humans perceive, reflect, and synthesize.
AI optimizes pathways. Humans create meaning from the journey.
She described what this could look like in practice: dashboards that don’t just show trends, but reveal consequences. Tools that help students visualize their own learning journeys. Interfaces that slow you down to observe, notice, and connect—not just consume.
That sparked a question for me: What if every dollar we spend on AI efficiency required an equal investment in human depth?
What if every dashboard that tracks speed had to show growth in curiosity?
What if we refused to buy any tool that couldn’t answer: “How does this make the student more human?”
We’ve been building technology to make learning faster. What if we built it to make learning deeper?
The Opportunity We’re Afraid to Take
Here’s what struck me most: we keep pouring millions into tools that track mastery, completion rates, and time on task—while the things that actually matter most remain invisible. Intuition. Empathy. Adaptability. Thinking.
We have the technology right now to measure self-awareness, track curiosity, and visualize creative growth. Imagine a dashboard that shows a student’s growth in asking questions over time. Or tracks how their empathy deepens through collaborative problem-solving. Or reveals the moment curiosity ignited and they went deeper than the assignment required.
The tools exist. But we’re still buying software that measures “time on task” and “mastery speed” because that’s what the accountability system rewards.
We’re not measuring what matters because we’re too afraid to stop measuring what doesn’t.
This is the redesign opportunity in front of us.
Build tools that make human intelligence visible, measurable, and central—not an afterthought to academic performance, but the foundation of it.
The Unavoidable Truth
We finally have the technology to measure what makes us human. The question is whether we have the courage to use it.
If students became the authors of their own learning—if we measured curiosity, adaptability, and empathy with the same rigor we measure test scores—everything would change.
The tools would change. The classrooms would change. The entire system would have to reorganize itself around human intelligence instead of efficiency.
That’s not a future possibility. That’s a choice we can make right now. The technology exists. The framework exists. What’s missing is the courage to measure what actually matters.
💬 I’d love to hear from you: If we started measuring the human skills that matter most, what would change in the education system tomorrow?
🎧 Listen to the full conversation: “Redesigning EdTech for the Intelligence Age” on Unscripted Intelligence
🌐 Join us: humanintelligencemovement.org