2026 EdTech Predictions
Insights from 20 leading voices in edtech, venture capital, and education policy
Thanks to our Presenting Sponsors Hire Education, Starbridge, and Cooley for making the Edtech Insiders newsletter, podcast, and community possible.
Trust, Integration, and Transformation: 2026 EdTech Predictions
As always, we’re expecting big changes for EdTech in 2026. This year, we gathered perspectives from 20 leaders across the education ecosystem—from Google and YouTube to frontier EdTech companies, from veteran investors to nonprofit innovators—to understand what’s coming next. What emerged wasn’t just a list of predictions, but a portrait of an industry at an inflection point.
For our full interviews on this topic, check out our year-end episodes on The Edtech Insiders Podcast!
Contributors
Hosts
Alex Sarlin – Co-Founder of Edtech Insiders, Senior Advisor at Cambiar Education
Ben Kornell – Co-Founder of Edtech Insiders, CEO at Art of Problem Solving
Venture & Investors
Amit Patel: Co-Founder & Managing Director, Owl Ventures
Jennifer Carolan: General Partner, Reach Capital
Jomayra Herrera: Partner, Reach Capital
Dan Carroll: Co-Founder, Clever
Company & Platform Leaders
Shantanu Sinha: VP, Google for Education
Katie Kurtz: Managing Director & Global Head of Youth & Learning, YouTube
Andrew Goldman: EVP, HMH Labs
Arman Jaffer: Founder & CEO, Brisk Teaching
Juliette Reid: Director of Market Research, Reading Horizons
Soren Rosier: Founder, PeerTeach
Sofia Fenichell: Founder & CEO, Study Hall.AI
Bjorn Billhardt: CEO, Abilitie
Non-Profit Leaders
Tom Vander Ark: Founder & CEO, Getting Smart
Jean-Claude Brizard: President & CEO, Digital Promise
Amanda Bickerstaff: CEO, AI for Education
Core Edtech Insiders Community Leaders
Sunil Gunderia: Co-Founder & CEO, Stealth
Jacob Kantor: Founder & Chief DODO, JK K12 EDU
Ian McCullough: Founder & Principal, McCullough Marketing and Management
From Promise to Practice: AI Crossed the Adoption Threshold in 2025
The biggest shift in 2025 wasn’t technological—it was behavioral. AI stopped being something educators talked about and became something they actually used.
“This was the year AI went from promise to practice,” says Shantanu Sinha, VP of Google for Education. “The conversation shifted to what educators are actually doing with it.”
The numbers are striking. Andrew Goldman at Houghton Mifflin Harcourt reports six times as many educators using AI as in 2023, with nearly four in five feeling confident using it in their classroom—”astonishing given how slow adoption has historically been in education.”
Teachers moved beyond productivity hacks. Arman Jaffer, CEO of Brisk Teaching, saw them “asking how to introduce it to students in ways that preserve productive struggle, especially around writing and feedback.”
Dan Carroll, co-founder of Clever, puts it simply: “What changed for me this year is that AI stopped feeling aspirational. It became something people were actually using in the flow of learning, not as a demo or a side experiment.”
For Dr. Soren Rosier, CEO of PeerTeach, the shift was psychological: “Learners stopped waiting for permission. AI became something people used independently to move faster, reskill, or get unstuck.”
The Center of Gravity Shifts: From Workflow to Instruction
Early AI adoption followed a predictable pattern—administrative tasks, lesson planning, email drafts. But 2025 marked a shift inward, toward the instructional core itself.
“We’re moving from generic AI assistance to instructional intelligence,” explains Sunil Gunderia. “That means grounding AI in curriculum, pedagogy, and learning science—not just the wisdom of the internet.”
Jean-Claude Brizard, president of Digital Promise, saw the conversation evolve: “It moved beyond workflow efficiency to curriculum, instruction, and assessment—what teaching and learning actually look like in an AI-enabled world.”
Once AI enters the instructional core, the stakes change. As Ian McCullough puts it: “You can’t hide behind activity anymore—you have to be explicit about what learning actually is.”
For Arman Jaffer, that clarity comes down to feedback: “What teachers really want is help giving better feedback at scale. AI started to earn its place when it made formative feedback more frequent and more actionable.”
Assessment Is Breaking—And 2026 Forces a Redesign
AI didn’t break assessment—it exposed how fragile our measurement systems already were.
Jacob Kantor frames it clearly: “Assessment has always been the constraint in the system. AI just made it obvious that our tools for measuring rigor, understanding, and growth don’t match the kinds of learning experiences we say we value.”
Ben Kornell is just as direct: “The five-paragraph essay is unsustainable as a default assessment. AI didn’t create that problem, but it made it impossible to ignore.”
Globally, alternatives are emerging. Jennifer Carolan notes movement toward oral exams and dynamic demonstrations of critical thinking, especially outside the U.S. Tom Vander Ark sees portfolios and client-connected projects gaining traction “because a single test score can’t tell the full story of a learner anymore.”
The solution, according to Arman Jaffer: “When students can generate drafts instantly, assessment has to shift toward revision, reasoning, and process. Otherwise, we’re grading the wrong thing.”
Jacob Kantor drives it home: “We have to assess process, reasoning, and growth—or we’re measuring the wrong thing.”
Trust and Reliability Became the Bottleneck
Innovation is easy. Trust is hard. And in 2025, trust became the gating factor for AI adoption at scale.
Katie Kurtz, global head of youth and learning at YouTube, identifies the gap: “Moving from pilots to system-level adoption requires confidence that tools are reliable, safe, and aligned with real classroom conditions.”
The bar is unforgiving. Shantanu Sinha: “We would never accept a textbook with errors on ten percent of the pages. AI has to meet the same bar.”
Amit Patel, managing director at Owl Ventures, explains why consistency matters: “If two teachers ask the same question and get different answers, trust breaks immediately.”
The issue extends beyond technical reliability. Ben Kornell notes that “the backlash isn’t really about the technology. It’s about whether people feel AI is being done to them rather than with them.”
Jomayra Herrera captures the institutional challenge: “Institutions move at the speed of trust. If AI doesn’t earn legitimacy with educators and families, it won’t matter how powerful it is.”
The AI Tutor Finally Becomes Viable
For years, AI tutoring has been more promise than product. 2026 may change that.
“I think 2026 is the year the AI tutor becomes the hero story,” says Sofia Fenichell, CEO of Study Hall.AI. “Multimodal interaction, lower cost, and better data finally make high-quality tutoring accessible at scale.”
The impact potential is significant because tutoring has always been limited by access. Dan Carroll emphasizes that the goal is reach, not replacement: “It finally gives motivated learners consistent support when humans aren’t available.”
Recent evidence suggests the technology is maturing. Sunil Gunderia points to research showing AI-supported tutors matching human effectiveness, proving “that quality tutoring can finally scale without replacing people.”
Big Tech Is Building the Learning Layer
While EdTech companies competed for market share, something more fundamental was happening: the world’s largest technology companies began building learning directly into their platforms.
Bjorn Billhardt, CEO of Abilitie, describes the shift: “What’s changing isn’t just competition—it’s the collapse of layers. Creation, distribution, and intelligence are converging, and education sits right in the middle.”
The implications are profound. Ben Kornell warns: “If Google Classroom distorted the LMS market, what’s coming now is on a completely different scale.”
When AI becomes infrastructure rather than a feature, the competitive landscape transforms. For EdTech companies, survival may depend on specialization. Ben Kornell: “EdTech companies will need to focus on niches that are too small or too specific for Big Tech to care about.”
Alex Sarlin captures the challenge: “The scary part isn’t that Big Tech will buy EdTech companies—it’s that they won’t need to. Many frontier labs will sign campus-wide deals with universities in 2026.”
Learning, Labor, and Capital Markets Are Colliding
Education has always been connected to work, but traditionally as separate life stages. AI is collapsing that separation.
“We’re seeing learning, credentials, and labor markets collapse into each other,” says Jomayra Herrera. “Capital is flowing toward models that connect learning directly to economic mobility.”
The pressure is visible in public sentiment. Jennifer Carolan cites a stunning statistic: “Two-thirds of registered voters no longer think a college education is worth the cost—a dramatic reversal from a decade ago.”
Dr. Soren Rosier sees this globally: “We’re seeing learning and earning collapse together, especially outside the U.S. Credentials matter less when skills can be demonstrated continuously.”
The result, according to Alex Sarlin: “Higher education is entering a defensive posture, responding to pressure from students, parents, and the labor market all at once.”
Human Connection Becomes the Scarce Resource
As AI scales cognitive work, a counterintuitive truth emerges: human connection becomes more valuable, not less.
Ian McCullough frames the paradox: “The work that remains distinctly human—coaching, judgment, relationships—becomes more valuable as everything else scales. As systems get smarter, the human work becomes more visible. Coaching, care, and judgment don’t disappear—they become the point.”
Sunil Gunderia sees this as the defining question: “The fundamental question is whether AI strengthens human connection or erodes it. That’s the line education has to hold.”
For practitioners, the best uses of AI create space for better human interaction. Juliette Reid observes: “The most powerful uses of AI are the ones that create space for better teaching, not less of it.”
Choice and ESAs Are Fragmenting the System
While these technological transformations unfold, a parallel structural shift is accelerating: public education dollars are moving directly into families’ hands.
Jennifer Carolan identifies the trend: “One of the biggest structural shifts we’re seeing is public dollars moving directly to families through ESAs. The fastest-growing use is small schools under 30 students—micro-schools—and that’s happening much faster than most people realize.”
When families control funding, markets reorganize. Ben Kornell: “We’re watching the system fragment in real time. What used to be a single pipeline is breaking into many parallel paths.”
AI accelerates this by making small-scale alternatives economically viable. Alex Sarlin: “What’s different now is that AI makes small, distributed learning models viable at scale. Choice plus technology changes the economics of what a ‘school’ can be.”
For EdTech companies, this creates opportunities. “EdTech companies may find very nimble small schools actively looking for the coolest, most motivating, most parent-friendly EdTech they can put in their brochures,” notes Alex Sarlin.
What This Means for 2026
These themes don’t exist in isolation—they reinforce and accelerate each other. AI moves to the instructional core, which breaks traditional assessment, which demands new credentialing, which fragments delivery models, which shifts market power toward families, which creates opportunities for new providers, which attracts Big Tech investment, which changes the competitive landscape for everyone. (try saying that 10 times fast!)
The throughline is transformation. Education in 2026 won’t look like education in 2024, and the changes won’t be confined to classroom technology. They’ll reshape what we measure, how we organize learning, who provides it, and ultimately what we believe education is for.
As Bjorn Billhardt puts it: “What AI really changes is speed—how fast ideas turn into products and products turn into behavior. In education, that compression of cycles is going to matter more than any single tool.”
Ready or not, 2026 is here.
Listen to the Full Conversation
While this article captures a few key insights from our discussion with these exceptional EdTech industry leaders, it only scratches the surface.
Tune in to the full podcast episodes to experience the complete conversation!
This edition of the Edtech Insiders Newsletter is sponsored by Tuck Advisors.
Since 2017, Tuck Advisors has been a trusted name in Education M&A. Run by serial entrepreneurs with over 30 years of experience founding, investing in, and selling companies, Tuck Advisors believes you deserve M&A advisors who work as hard as you do.
Not all LOIs are the same. Use our free Deal Value Calculator to estimate the relative expected value of your LOIs.
Have questions or are ready to discuss M&A? Reach out to us.
Turning Trusted Books into Personal Learning Agents with Aibrary
We recently interviewed Frank and Susan of Aibrary on The Edtech Insiders Podcast!
Frank Wu is the Co-Founder of Aibrary and a Harvard Kennedy School MPP graduate. He led 20+ EdTech and AI investments at TAL, helped build Think Academy in the U.S., and previously taught 3M+ students.
Susan Wang is the Chief Growth Officer at Aibrary and a Yale and Harvard Business School alum. She led creator and product operations at TikTok and worked in strategy at TAL, with deep experience scaling EdTech products.
5 Things You’ll Learn in This Episode
How AI is reshaping lifelong learning and adult skill development.
Why Aibrary turns trusted books into personalized, bite-sized audio.
How Idea Twin customizes learning using your goals and context.
How Aibrary blends learning with action to drive real-world application.
Aibrary’s long-term vision to build an AI-native university.
We love to collaborate. To learn more about partnership and sponsorship opportunities, please email info@edtechinsiders.com. Thanks for reading!