Higher Education · 11 min read · December 2024

Universities Won't Be Replaced by AI. They'll Become Its Foundation. (Or They'll Disappear.)

The choice isn't whether to change. It's whether to lead or be bypassed entirely.

Let me tell you what's coming for universities.

A student sits down with a question about quantum mechanics. Instead of attending a lecture, waiting for office hours, or hoping a tutor is available, they open an AI assistant. Within seconds, they get a PhD-level explanation tailored to exactly what they don't understand. The AI asks follow-up questions, identifies misconceptions, adjusts its approach based on the student's responses. It's patient, available 24/7, and costs nothing.

The next day, the same student has a question about constitutional law. Same process. Then organic chemistry. Then economic theory. Then Renaissance art history.

Every subject. Every level. Every time. Free.

Now explain to me why that student should pay $50,000 a year to sit in a lecture theatre.

This isn't a hypothetical. This is happening now. The technology exists. The only question is how fast it spreads, and what's left of universities when it does.


The Existential Maths

Let's be brutally honest about the numbers.

0.924: effect size of AI tutoring on learning outcomes

89%: students already using AI tools for academics

47%: faculty using AI (a 42-point gap)

Recent meta-analyses show AI tutoring produces effect sizes of 0.924 on student learning outcomes. That's not marginally better than traditional teaching. That's dramatically better. Those are among the largest effects documented in educational research.
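To make the 0.924 figure concrete: assuming it's a Cohen's d on a roughly normally distributed outcome (an assumption, since the article doesn't specify the metric), a few lines of Python translate it into a percentile using Cohen's U3:

```python
from math import erf, sqrt

def normal_cdf(z: float) -> float:
    """Standard normal CDF, computed via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

d = 0.924  # reported effect size for AI tutoring (Cohen's d, assumed)
u3 = normal_cdf(d)  # Cohen's U3: fraction of the control group scoring
                    # below the average student in the treatment group
print(f"d = {d}: the average AI-tutored student outperforms "
      f"~{u3:.0%} of the comparison group.")
```

Under that assumption, d = 0.924 means the average AI-tutored student scores higher than roughly 82% of students taught conventionally, which is what "dramatically better" looks like in distributional terms.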

Students have already figured out that AI can help them learn. Many of them have figured out it can help them learn better than their lectures do. The only thing keeping them enrolled is the credential, the piece of paper that says they completed the degree.

How long before employers figure out that the credential doesn't mean what it used to?


The Content Business Is Over

For a century, universities have operated on a simple value proposition: we have knowledge you can't get elsewhere, and we'll give you access to it.

That value proposition is dead.

There is nothing in a standard undergraduate curriculum that a motivated person can't learn through AI-assisted self-study. Nothing. The lectures, the readings, the tutorials, all of it is reproducible by anyone with an internet connection and curiosity.

"But what about the nuance?" you say. "What about the expertise? What about the Socratic dialogue with brilliant academics?"

Be honest. How many students actually experience that? How many courses are taught by tenured experts versus overworked sessional staff? How many "tutorials" are actually transformative intellectual experiences versus box-ticking exercises?

The idealised version of university education was always rare. AI doesn't have to beat the best university experience. It just has to beat the average one. And the average one isn't hard to beat.

Universities that think they're in the content delivery business are about to discover they're in a business that no longer exists.


But Here's What AI Can't Do

Before you think I'm writing universities' obituary, let me be clear: there are things AI cannot do. Critical things. Things that could form the foundation of what universities become.

AI can't validate knowledge.

AI can generate plausible-sounding information on any topic. It can also hallucinate complete nonsense with absolute confidence. Peer review, academic rigour, institutional reputation: these are trust mechanisms that took centuries to build. When someone needs to know whether information is reliable, they look for credentialed sources. That credentialing function doesn't disappear because AI exists. It becomes more important.

AI can't provide ethical judgment.

AI systems can analyse ethical frameworks. They can't make ethical decisions. They can't weigh competing values, navigate cultural contexts, or take responsibility for consequences. As AI becomes more powerful, the questions of how to use it responsibly become more urgent. Someone needs to develop the frameworks, train the practitioners, and hold the line on what's acceptable.

AI can't develop human capability.

Learning isn't just about acquiring information. It's about developing judgment, resilience, creativity, and the ability to work with others. These capabilities emerge through struggle, through challenge, through human interaction. AI can deliver information more efficiently than any lecture. But efficiency isn't always the point. Sometimes the struggle is the point.

AI can't create meaning.

Universities aren't just credential factories. At their best, they're communities where people discover what matters to them, encounter ideas that change their worldview, and form relationships that shape their lives. AI can simulate conversation. It can't provide belonging. It can't create the serendipity of meeting someone who becomes a lifelong collaborator.

AI can't preserve culture.

Universities are repositories of human knowledge and culture: not just current knowledge, but historical understanding, indigenous wisdom, artistic traditions, and ways of thinking that might otherwise be lost. This preservation function doesn't automate. Someone has to decide what's worth keeping, how to interpret it, and how to pass it on.


The Intellectual Substrate

Here's the opportunity, if universities are smart enough to take it:

Stop being content providers. Become the intellectual substrate that AI builds upon.

The foundation of validated expertise, ethical frameworks, and human capability that AI systems require but cannot generate themselves.

Think of it this way: AI is incredibly powerful, but it's only as good as what it's built on. It needs trusted sources of knowledge. It needs ethical guardrails. It needs humans who can judge its outputs, correct its errors, and apply its capabilities wisely.

Universities can be that foundation. They have the expertise, the institutional trust, and the human development capabilities that AI lacks. But only if they choose to.


Five Roles Universities Must Own

Based on extensive research into university-AI integration, I've identified five critical roles universities can play, each of which creates genuine value in an AI-mediated world:

1. Knowledge Validation

Become the trusted source that AI and humans rely on for verified information. This means doubling down on peer review, research integrity, and institutional credibility. When anyone can generate plausible-sounding content, being a reliable source of truth is enormously valuable.

2. Ethical Stewardship

Develop and enforce the frameworks for responsible AI use. Train practitioners in AI ethics. Lead the public conversation about what's acceptable and what isn't. This isn't a peripheral function; it's central to what society will need as AI becomes more powerful.

3. Human Capability Development

Focus on the capabilities AI can't replicate: critical thinking, ethical judgment, creativity, collaboration, resilience. Redesign education around developing humans, not delivering content. Make the struggle intentional (what I call "productive struggle") rather than trying to optimise it away.

4. Innovation Facilitation

Become hubs where human expertise combines with AI capability to produce breakthroughs neither could achieve alone. This means creating spaces for experimentation, connecting researchers across disciplines, and translating discoveries into practical applications.

5. Cultural Preservation

Take seriously the role of maintaining human knowledge, artistic traditions, and diverse ways of thinking. Archive, interpret, and transmit the cultural heritage that defines who we are. This is work that can't be automated and shouldn't be neglected.

Universities that embrace these roles have a future. They'll be essential partners in an AI-driven world: the foundation that makes everything else possible.

Universities that don't embrace these roles have no future. They'll be bypassed by students who can learn faster and cheaper elsewhere, by employers who stop valuing their credentials, and by a society that no longer sees the point.


The Failure Mode

Let me paint the picture of what happens if universities don't change:

Years 1-2

AI tutoring continues improving. Students use it more. Lecture attendance drops. Universities respond by... banning AI tools and threatening students with academic-integrity sanctions. It doesn't work. Students just get better at hiding their usage.

Years 3-4

Alternative credentials start gaining traction. Employers begin accepting AI-verified skills assessments alongside traditional degrees. Some industries drop degree requirements entirely. Enrolments begin declining, especially in programs seen as "content-heavy" with low practical application.

Years 5-7

A few universities pivot hard: redesigning curriculum around human capability development, embracing AI as a teaching partner. These institutions thrive. Most universities don't pivot. They double down on the old model, cut costs, increase class sizes, and accelerate their irrelevance.

Years 8-10

The shakeout. Institutions that failed to adapt face existential crises. Mergers, closures, dramatic downsizing. The sector that emerges is much smaller but more focused: institutions that found a genuine value proposition in an AI-mediated world.

This isn't speculation. The patterns are already visible. The only uncertainty is timing, and the timing is faster than most university leaders believe.


The 15-25% Decision

Here's a concrete recommendation from my research:

Universities should allocate 15-25% of their institutional budget to AI transformation initiatives.

Not 2% for a "digital innovation lab" that produces reports no one reads. Genuine, substantial investment in becoming something different.

That sounds like a lot. It is a lot. It's also what survival requires.

The alternative is spending the same money (more, actually) on a slow decline. Redundancy packages. Campus closures. Managing the wind-down of an institution that didn't adapt.

The money gets spent either way. The question is whether it gets spent on transformation or on managing failure.


What This Means If You Work In Higher Education

If you're a university leader...

You have a choice to make. Not eventually, now. The window for proactive transformation is closing. International competitors are moving faster. The technology is advancing faster. Every year you delay makes the transition harder.

If you're an academic...

You can resist AI, pretend it's not happening, focus on detection and prohibition. Or you can figure out how to use it: how to combine your expertise with these new capabilities to do things neither could do alone. The academics who figure this out will thrive. The ones who don't will become increasingly marginal.

If you're a student...

You should be asking hard questions about what you're paying for. Is this institution preparing you for an AI-mediated world? Are you developing capabilities AI can't replicate? Or are you just collecting a credential that might not mean what it used to by the time you graduate?

If you're an employer...

You should be thinking about what you actually need from graduates. Is it the credential, or is it the capability? Because those two things are about to diverge in ways they never have before.


The Honest Truth

I've spent twenty years in education technology. I've worked at Microsoft building education products. I've worked in government on national skills policy. I've taught at universities in Australia and Japan. I'm completing a PhD on AI in education.

I love universities. I believe in what they represent at their best: communities of inquiry, engines of discovery, places where people become more than they were.

And I'm telling you: the current model is not going to survive.

That's not pessimism. It's clarity. The institutions I love need to change fundamentally, or they'll disappear. Not because AI is malevolent, but because it's simply better at parts of what universities have traditionally done.

The good news is that there's plenty AI can't do. Plenty that universities are uniquely positioned to provide. Plenty of value to create in an AI-mediated world.

But only if universities choose to create it. Only if they stop defending the old model and start building the new one. Only if they become the intellectual substrate (the foundation that AI and society need) rather than a content delivery mechanism that's been made obsolete.

The choice isn't whether to change. It's whether to lead or be bypassed entirely.

Higher Education · AI · Universities · Transformation · Intellectual Substrate · Future of Education

Written by

Jason La Greca

PhD candidate researching AI transformation in higher education. He's spent 20 years in education technology and is tired of watching institutions he cares about make choices that guarantee their irrelevance.


Is your university ready for AI transformation?

Take the AI Readiness Assessment to see where you actually stand.


Ready for an honest conversation about transformation?

Teachnology Advisory helps institutions build the future, not manage decline.
