AI Strategy · 12 min read · 10 February 2026

What I'd Tell Steven Bartlett About AI Employees

On December 30, I cried during Nessun Dorma at the Opera House. Two days later, my AI employee scanned 14 research papers while I slept. Both things are true. Understanding how they fit together is the most important question facing anyone building a company right now.


On December 30, I sat in the Sydney Opera House and cried during Nessun Dorma.

Full disclosure: I'm a classically trained guitarist. Flamenco, the lot. I got accepted into the Sydney Conservatorium of Music, where my son now plans to study. But my family preferred I find a "more stable career," so I became a teacher instead. Spent the next twenty years listening to Stone Temple Pilots, Soundgarden, and Living Colour while pretending the classical training didn't matter. My son plays drums. I own more retro gaming consoles than I'm willing to admit in writing. The last person you'd expect to be undone by an Italian aria in a concert hall.

But there I was. Holding back tears while a human voice did something that no algorithm, no model, no compute cluster on earth can do. It carried a lifetime of lived experience from one human body across a room and planted it directly in my chest.

Full house. Families everywhere, which made me unreasonably happy. My wife's whole family had come over from China and we'd brought everyone along. The subtle imperfections of the Opera House acoustics. The footsteps. A tenor's breath between phrases. The tiniest things became the biggest things. We were all blown away. My son and I spent most of the drive home trying to explain what we'd just experienced. We couldn't. We failed to find the words. And that failure to describe it was the proof that it was real.

I'm telling you this because two days later, at 1am, while I slept, my AI employee scanned 14 research papers, drafted prioritised points for a newsletter, flagged a security vulnerability on my website, and left me a strategic briefing to read over coffee.

It cost about twelve cents.

Both of these things are true at the same time. And understanding how they fit together is, I think, the most important question facing anyone building a company right now.

The employee that never clocks off

Let me be specific about what "AI employee" means because most people hear it and think chatbot.

This is not a chatbot.

It's an AI agent with defined responsibilities, persistent memory across sessions, specialist skills, scheduled overnight work, and output I review each morning like any manager would. It has a workspace. It commits code. It pushes to GitHub. It writes newsletters, processes research, monitors websites, builds tools, and produces strategic briefings. It learns from its mistakes and keeps notes for future sessions. I started building it well before OpenClaw, on a stack that is far more secure and scalable (just FYI).
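
If you want a feel for the mechanics, the shape is simple: a scheduled job wakes the agent overnight, it reads its notes from previous sessions, works through its assigned tasks, then writes a briefing and updated notes back to its workspace. Here is a minimal Python sketch of that loop. Every path, file name, and the call_model placeholder is invented for illustration; this is not my actual stack.

```python
# Minimal sketch of an overnight "AI employee" run.
# All paths and the call_model placeholder are illustrative, not the real setup.
import json
from datetime import datetime, timezone
from pathlib import Path

WORKSPACE = Path("employee_workspace")   # persistent workspace reused every night
MEMORY = WORKSPACE / "memory.json"       # notes carried across sessions
BRIEFING = WORKSPACE / "briefings"       # where morning briefings land

def call_model(prompt: str) -> str:
    """Placeholder for whatever model/agent API you use (hypothetical)."""
    raise NotImplementedError("wire up your own model call here")

def nightly_run(tasks: list[str]) -> None:
    WORKSPACE.mkdir(exist_ok=True)
    BRIEFING.mkdir(exist_ok=True)
    memory = json.loads(MEMORY.read_text()) if MEMORY.exists() else {"notes": []}

    prompt = (
        "You are my AI employee. Notes from previous sessions:\n"
        f"{json.dumps(memory['notes'][-20:], indent=2)}\n\n"
        "Tonight's tasks:\n" + "\n".join(f"- {t}" for t in tasks) +
        "\n\nProduce a prioritised briefing and a short note for your future self."
    )
    output = call_model(prompt)

    stamp = datetime.now(timezone.utc).strftime("%Y-%m-%d")
    (BRIEFING / f"{stamp}.md").write_text(output)      # read over coffee
    memory["notes"].append({"date": stamp, "summary": output[:500]})
    MEMORY.write_text(json.dumps(memory, indent=2))    # memory persists to next session

if __name__ == "__main__":
    # A cron entry like "0 2 * * * python nightly_run.py" would fire this at 2am.
    nightly_run(["Scan education news and forums", "Draft newsletter points"])
```

The point is the shape, not the specifics: a schedule plus persistent memory is what turns a chatbot into an employee.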

Here's what it actually did last week:

Monday 2am: Intelligence scan across education news, teacher forums, competitor products, Reddit, YouTube videos I feed it, articles I read or don't have time to read. By the time I woke up: prioritised briefing with strategic implications mapped.

Tuesday: Drafted points for five newsletter editions. I spent 15 minutes reviewing each. Research and drafting that used to take two hours per issue now happens while I sleep. That same night, it created a Kanban app for me. Two-way, as in I can assign tasks, or hand them back with feedback, and my employee reprioritises the work.

Wednesday: Built a complete video processing pipeline. Drop in a long-form video, out come five social-ready shorts with subtitles and metadata. One session. It has since expanded this into a full production line where my ideas and writings are transformed into beautifully formatted PDFs. Pretty cool. I love to write, and I write all my own words, but I hate formatting and typesetting. Problem solved, and I didn't even ask for it. My employee figured that bit out on its own.

Thursday 4am: Found a broken form on my landing page that had been silently losing leads for three days. Fixed it, left me a step-by-step note on what to deploy, and wrote a skill file so the same issue never happens again (a check along the lines of the sketch below). Earlier that same night, I had given my employee secure access to my Google Analytics data (via a middleware data layer it helped me create in AWS), my token spend, and a range of other metrics. It now tracks everything and gives me daily feedback on what seems to be trending and why.

Friday: My agent fact-checked a 25,000-word higher education report I had written the hard way, using data from SemiAnalysis, Challenger Gray, Forbes, and primary research. The kind of work a consulting firm charges $50K for (well, probably more, to be honest). It called me out on some exaggerations and gave me more data to add in, without being asked!
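
To make Thursday's item concrete: the monitoring piece is not exotic. A scheduled check exercises the form endpoint overnight and, if anything looks broken, leaves a note for me to find over coffee. A sketch, with an invented URL and file paths standing in for my real setup:

```python
# Sketch of an overnight landing-page form check.
# The URL, payload, and note path are invented for illustration.
import urllib.request
import urllib.parse
from datetime import datetime, timezone
from pathlib import Path

FORM_URL = "https://example.com/api/lead-form"   # hypothetical endpoint
NOTES = Path("employee_workspace/notes")

def check_lead_form() -> None:
    NOTES.mkdir(parents=True, exist_ok=True)
    data = urllib.parse.urlencode({"email": "healthcheck@example.com"}).encode()
    try:
        with urllib.request.urlopen(FORM_URL, data=data, timeout=10) as resp:
            ok = 200 <= resp.status < 300
    except Exception as exc:            # network error, 4xx/5xx, timeout, etc.
        ok, detail = False, str(exc)
    else:
        detail = ""

    if not ok:
        stamp = datetime.now(timezone.utc).isoformat()
        note = (
            f"{stamp}: lead form at {FORM_URL} failed its overnight check.\n"
            f"Detail: {detail or 'non-2xx response'}.\n"
            "Suggested next step: redeploy the form handler and re-run this check.\n"
        )
        (NOTES / "lead-form-alert.md").write_text(note)   # the morning note

if __name__ == "__main__":
    check_lead_form()
```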

I'm a former high school teacher from the Blue Mountains. I run a small education technology company. One human. One AI employee. Output that would normally require a team of five or more.

But here's what the AI didn't do

It didn't sit in the Opera House and feel Nessun Dorma rearrange something inside it. It didn't watch my son's face during the second movement and understand what was happening behind his eyes. It didn't spend the drive home reaching for words that don't exist.

It didn't play a video game called Clair Obscur: Expedition 33 and get so haunted by the soundtrack that it listened on loop for weeks. It didn't fall down rabbit holes of art history, impressionism, Belle Époque culture, and come out the other side questioning things about itself it had never questioned before.

And it never will.

I've been a gamer my whole life. I develop in Unreal Engine and write C++ by hand for joy. I've played thousands of games. But Clair Obscur did something to me that I wasn't prepared for. The game is beautiful, the story compelling, but it was the music that broke me open. Composed by real musicians drawing on centuries of Western art tradition. Every note carrying the weight of human hands on instruments, human breath through brass, human fingers on strings. The same tradition that connects Puccini to Pearl Jam, Bach to Black Sabbath, Beethoven to Soundgarden.

People don't realise this, but rock is classical music's grandchild. Bach's harmonic structures flow through the blues, which flow through rock and roll, which flow through punk and grunge and prog. When Karnivool build a wall of shifting time signatures in "Themata," they're channelling the same mathematical beauty as Bach's counterpoint. When Living Colour's Vernon Reid shreds through a jazz-metal fusion, he's carrying a through-line from Paganini through Hendrix to something entirely new. When Scott Weiland held a note on "Plush," he was doing what a tenor does with Nessun Dorma. Different language. Same human truth. The melody rises, the voice breaks slightly under the weight of what it's carrying, and something in the listener's chest responds to something in the singer's chest across whatever distance separates them.

That response is tens of thousands of years old. My son plays drums. When he's in flow state behind the kit, when his body disappears and the rhythm takes over, he's connecting to something that predates civilisation. Drums are humanity's oldest instrument. Archaeological evidence puts percussion at 30,000+ years. Before agriculture. Before writing. Before cities. Humans were keeping rhythm together.

That's not a skill. That's DNA.

The question nobody's asking

Steven, you talk a lot about culture. About how the right people in the right environment create magic. About how mindset determines outcome. About how passion can't be faked and conviction can't be hired.

I agree with all of it. But I'd push it further.

Here's the question I think matters most in 2026: When AI handles the production, what are humans actually for?

Not theoretically. Practically. Every day. In your company, in your portfolio, in the teams you're building.

Because right now the conversation about AI is stuck on tools and productivity. Four percent of all code on GitHub is written by AI. Nvidia gave 30,000 engineers AI coding tools and tripled their output. 108,000 US jobs were cut in January. Hiring plans down 58%. The "AI will help workers" narrative is shifting fast toward "AI will replace workers" and most business leaders are frozen between the two.

But both narratives miss the point. The question isn't whether AI replaces humans. It's what humans do when the replacement parts are handled.

And the answer (the one I found in the Opera House, in a video game soundtrack, watching my kid play drums) is this:

Humans feel.

Not "process emotion." Feel. The real thing. The tears during Nessun Dorma. The shivers when a game soundtrack hits the exact frequency of something you didn't know you were carrying. The ache in your chest when your son plays something beautiful and you realise he's growing up and these moments are finite.

AI generates. Humans feel. And the feeling is what makes the generation worth anything.

Three capabilities that AI will never have

I've spent a year working side by side with my AI employee. Testing the boundaries daily. Pushing it further than most people push their human teams. And it always, always comes back to three things:

Accountability. My AI employee can produce a strategic briefing. It can't own the outcome. It can't stand in front of a customer and say "I made this call and here's why." Accountability requires skin in the game. The willingness to be wrong and face the consequences. AI doesn't have skin. It doesn't have consequences. It doesn't lie awake at 2am replaying a decision. That lying awake is the price of accountability, and it's what makes the decision real.

Judgement. When five newsletter angles surface, I choose the one that lands. Not because I'm smarter. Because I understand the audience's emotional state, the timing, the cultural moment, the dozen unspoken signals that determine whether something resonates or disappears. Judgement is pattern recognition grounded in lived experience. The emphasis being on lived. AI can approximate patterns. It can't live through the experiences that make the patterns meaningful.

Taste. The hardest one to explain. The most important one to defend.

Taste is knowing that a technically correct output is still wrong. It's reading a draft and feeling the sentence that breaks the rhythm. It's hearing a vocal performance that's pitch-perfect and knowing it's lifeless. It's watching someone present a strategy that ticks every box and sensing it won't work.

Taste is what I experienced in the Opera House. Those performers weren't just accurate. They were transcendent. The difference between accurate and transcendent is taste, applied through a lifetime of practice, failure, recovery, and accumulated human experience. No model has that. No model will.

The tenor singing Nessun Dorma that night had spent decades training his instrument, which is to say his body. Every performance he'd ever given, every failure, every standing ovation, every night he'd doubted whether he was good enough (all of it was present in every note). That's not data. That's a life. And a full house of families, including my wife's family who'd flown from China, could feel it.

The culture question

You've built your career on the insight that culture eats strategy for breakfast. That the right people with the right mindset create outcomes that no playbook can produce.

Here's where AI employees make that insight more true, not less.

When my AI employee handles the research, the drafting, the monitoring, the code, the scheduling, the formatting, and the analysis, what's left for the human is culture work. Judgement calls. Taste decisions. Accountability moments. The stuff that actually determines whether a company matters or just exists.

Most organisations are doing this backwards. They're deploying AI to augment the easy parts (draft this email, summarise this document) while leaving humans trapped in the hard, soul-destroying administrative work that kills creativity and passion. The result: humans do busywork and AI does creative work. Exactly wrong.

The model that works is the opposite. AI handles everything that doesn't require a pulse. Humans do the work that requires feeling, conviction, lived experience, and the willingness to be accountable for outcomes.

That's a culture decision, not a technology decision.

The companies cutting 108,000 jobs treated AI as a cost reduction tool. Nvidia, tripling output with the same headcount, treated AI as a capability amplifier that freed humans for higher work. Same technology. Opposite cultures. Opposite outcomes.

The wisdom gap

Here's the part that should keep every business leader awake.

Mrinank Sharma just resigned as head of safeguards research at Anthropic, the company that built the AI I use as an employee. His resignation letter said: "We appear to be approaching a threshold where our wisdom must grow in equal measure to our capacity to affect the world."

He didn't leave to join another AI company. He left to study poetry.

Let that land. The person responsible for keeping one of the world's most powerful AI systems safe concluded that the missing piece isn't more compute, more data, or more parameters. It's wisdom. And he went looking for it in poetry. In the humanities. In the arts.

Meanwhile, Australia is defunding arts education. $75.6 million for STEM. Zero for arts. 48 creative arts degrees axed. Year 12 arts enrolments collapsed 21%.

We're cutting the exact capabilities that the people building AI are telling us we need most. The wisdom. The taste. The emotional intelligence. The ability to sit in an Opera House and understand what's happening between the singer and the audience in a way that no technical analysis can capture.

This is where culture (real culture) becomes existential. Not company culture. Human culture. The accumulated creative, artistic, musical, literary, philosophical tradition that gives us the judgement and taste to use our tools wisely.

My son playing drums isn't a hobby. It's the development of a capability that AI will never replicate and that the future economy will value above almost everything else. The parents who understand this are giving their children an advantage that no coding bootcamp can match.

The collaboration, not the competition

I'm not anti-AI. I employ one. It's the best hire I've ever made.

But I'm building something specific with it: a system where AI handles production and humans handle meaning. Where the overnight cron jobs produce the raw material and the human applies the taste, judgement, and accountability that turn raw material into something that matters.

This is what I call The Capable Organisation. Not a company that uses AI. A company that has built the internal capability to direct AI while preserving and developing the irreducibly human skills that make the direction worth following.

The capable organisation trains its people in judgement, not just prompting. In taste, not just efficiency. In accountability, not just productivity metrics. It invests in the humanities alongside the technology because it understands that wisdom must match capability or the capability becomes dangerous.

It sends its people to the Opera House, not just to the AI conference. It values the drummer and the developer equally, because it knows they're both building capability that the other needs.

The drive home

My son and I drove home from the Opera House in near silence for the first ten minutes. Not awkward silence. Full silence. The kind where you're both processing something too big for words.

Then he said: "Dad, how do they DO that?"

And I didn't have an answer. Not a real one. I could talk about technique, breath control, years of training. But that wasn't what he was asking. He was asking how a human voice carries something invisible across a room and changes the people who receive it.

I still don't know. I don't think anyone does. I don't think anyone should.

Because the inability to explain it is the proof that it's real. If we could reduce it to a formula, AI could do it. The fact that we can't (the fact that every attempt to describe what happened in that concert hall falls short of what we actually experienced) is the evidence that something irreducibly human is happening.

And that something is what every company, every product, every system, every AI employee should be built to protect, amplify, and serve.

Build the machines. Deploy the agents. Automate the production.

But never forget what the production is for.

It's for the moment when a voice rises in a concert hall and a teenager turns to his father and asks a question that neither of them can answer.

That's what humans are for.


If this piece meant something to you, I'd love to hear about it. And if you're building a company that treats AI as an employee and humanity as the point, we should talk.

AI Strategy · Leadership · Culture · Business

Written by

Jason La Greca

Founder of Teachnology. Building AI that empowers humans, not replaces them.

Connect on LinkedIn

Is your organisation building capability or just buying it?

Take the free 12-minute Capability Assessment and find out where you stand. Get a personalised report with actionable recommendations.
