AI Strategy · 5 min read · 10 February 2026

Intent Engineering in Higher Education

Klarna's AI agent resolved customer tickets in 2 minutes instead of 11. Then customers started leaving. Universities are walking into this exact trap.


Klarna's AI agent resolved customer tickets in 2 minutes instead of 11. It handled 2.3 million conversations a month. It saved $40 million a year.

Then customers started leaving.

The AI was brilliant at resolving tickets fast. That was the wrong goal. Klarna's real purpose was building lasting customer relationships. A human agent with five years at the company knew when to bend a policy, when to spend three extra minutes, when efficiency mattered and when generosity did. The AI knew none of it.

This is the difference between context engineering (what does AI need to know?) and intent engineering (what does AI need to want?).

Universities are walking into this exact trap.


Every major institution is deploying AI: automated marking, admissions chatbots, AI tutors, student support bots. The tools work. The alignment does not.

What universities say they value: deep learning, critical thinking, student transformation.

What their AI agents are optimised for: grading speed, enrolment conversion, queries resolved per hour.

An AI tutor optimised for throughput will give a confused student a link to the textbook. A human tutor with ten years of experience knows that confusion is sometimes a breakthrough moment that needs twenty extra minutes.

An AI admissions bot optimised for conversion will never tell a prospective student "this program might not be right for you." A human admissions officer sometimes does, because matching matters more than numbers.

The financial pressure makes it worse. Universities Australia claims international education is a "$52 billion export engine." That figure is wildly inflated. The ABS counts all spending by international students as exports without subtracting money earned domestically, remittances, or agent commissions. The real net figure is much lower.

Overstated revenue creates financial fragility. Fragility creates cost pressure. Cost pressure drives AI adoption aimed at efficiency. And efficiency-optimised AI agents encode the institution's real priority (cost reduction) rather than its stated priority (student outcomes).


Here is what most people miss about this story.

Teachers already do intent engineering. They call it professional judgment.

When a teacher spends extra time with a struggling student instead of moving to the next lesson, that is intent alignment. When a teacher bends a late submission policy because they understand a student's home situation, that is an encoded trade-off hierarchy. When a teacher notices a behaviour change and calls the counsellor, that is an escalation boundary.

No rubric told them to do any of it. They read the room.

The emerging role of "AI Workflow Architect" (someone who ensures AI systems align with organisational purpose) describes what teachers do every day, applied to a different domain. The people who understand educational intent better than anyone are the people being displaced by AI agents that lack it.


Five things universities need to do

  1. Define success in educational terms before deploying any AI agent. If your marking tool's KPI is "assignments graded per hour," you have already failed.
  2. Create the AI Pedagogy Architect role. Someone between IT, learning design, and academic leadership who understands all three.
  3. Make educational intent machine-readable. Curriculum design always assumed a human interpreter. AI agents need explicit logic: when to scaffold, when to instruct, when to escalate.
  4. Involve the people you are replacing. Their institutional knowledge is exactly what the intent layer requires. Klarna had to rehire the agents it fired. Universities do not have to learn it that way.
  5. Measure alignment, not just efficiency. Build feedback loops that detect when AI drifts from educational purpose.
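What "machine-readable intent" might look like in practice: the sketch below encodes a tutor's trade-off hierarchy (point 3) as explicit decision logic, plus a crude drift metric (point 5). Everything here, from the signal names to the thresholds, is an illustrative assumption, not an established framework or any vendor's API.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Action(Enum):
    SCAFFOLD = auto()   # guide with hints; don't hand over the answer
    INSTRUCT = auto()   # explain directly
    ESCALATE = auto()   # hand off to a human tutor or counsellor

@dataclass
class StudentSignal:
    repeated_errors: int    # same misconception across attempts (assumed signal)
    minutes_stuck: float
    distress_flag: bool     # e.g. abrupt tone or behaviour change

def decide(signal: StudentSignal) -> Action:
    """Encode the trade-off hierarchy a human tutor applies implicitly.
    Ordering matters: wellbeing outranks pedagogy, pedagogy outranks speed."""
    if signal.distress_flag:
        return Action.ESCALATE        # escalation boundary comes first
    if signal.repeated_errors >= 3:
        return Action.SCAFFOLD        # confusion may be a breakthrough moment
    if signal.minutes_stuck > 20:
        return Action.INSTRUCT        # unblock before frustration sets in
    return Action.SCAFFOLD

def alignment_rate(decisions: list[Action]) -> float:
    """Crude drift metric: the share of interactions that scaffold or escalate
    rather than just resolve quickly. A falling rate over time is one signal
    the agent is drifting toward efficiency and away from educational purpose."""
    if not decisions:
        return 0.0
    aligned = sum(d in (Action.SCAFFOLD, Action.ESCALATE) for d in decisions)
    return aligned / len(decisions)
```

The point is not these particular thresholds, which a real deployment would set with the educators being consulted in point 4, but that the hierarchy exists in writing at all, where it can be reviewed, audited, and measured against.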

Klarna lost customers. It is recovering.

Universities will lose learners. A generation processed efficiently but never actually educated. That damage does not appear in quarterly earnings. It appears in graduates who received technically accurate feedback that never challenged their thinking.

The tools work. The models are extraordinary. What is missing is the intent layer.

The good news: the people who understand educational intent are still here. Many are looking for what comes next. The AI era needs exactly what they know.

The question is whether institutions will ask them before deploying the agents, or learn the hard way.


What do you think? Are universities thinking about intent alignment, or just deployment speed?


Go Deeper

If this resonates, you might find value in these resources:

  • The Capable Organisation — My book on building internal AI capability instead of outsourcing it. Covers the strategic thinking behind intent engineering and why most AI deployments fail.
  • AI for Parents and Teachers — A practical guide for educators navigating the AI transition, including how to preserve what matters while embracing what helps.
  • Join the Community — Connect with other educators and leaders working through these questions. Weekly discussions, shared resources, and a network of people building what comes next.
Tags: AI Strategy, Education, Leadership

Written by

Jason La Greca

Founder of Teachnology. Building AI that empowers humans, not replaces them.

Connect on LinkedIn

Is your organisation building capability or just buying it?

Take the free 12-minute Capability Assessment and find out where you stand. Get a personalised report with actionable recommendations.
