Building AI assistants on Microsoft's no-code platform should be easier than traditional development. That's the promise. The reality is different.
After 40+ hours building CohortAI (an AI course assistant for universities), I can tell you that Copilot Studio hides code-level complexity behind a point-and-click interface.
These aren't edge cases. These are common issues that will stop your project dead. I'm documenting them because nobody else has.
Gotcha #1: Power Fx Formula Mode Doesn't Work the Way You Think
What happened: I spent two hours debugging a Function App that wasn't broken. Every request to my Azure Functions returned error responses, but the same HTTP calls worked perfectly from Postman. Students were getting “Sorry, something went wrong” instead of tutoring responses.
Why it's confusing: The Copilot Studio HTTP Request node has two body modes: JSON and Formula. The JSON editor looks like every REST client you've used — you write JSON, insert variables, everything works as expected. That's not what you should use.
When you select JSON mode and write something like:
```json
{
  "email": "{{System.User.PrincipalName}}",
  "message": "{{Topic.UserMessage}}"
}
```

Copilot Studio sends those curly braces as literal strings. Your Function App receives:
```json
{
  "email": "{{System.User.PrincipalName}}",
  "message": "{{Topic.UserMessage}}"
}
```

How to fix it: Use Formula mode, not JSON mode. Click the “Edit JSON” button, then switch to the Formula tab. Write Power Fx expressions without any curly braces:
```
{email: System.User.PrincipalName, message: Topic.UserMessage}
```

This sends actual values, not variable names wrapped in meaningless braces.
The documentation doesn't make this clear. Every REST API tutorial teaches you to use curly braces for variables. Copilot Studio quietly ignores that convention and breaks your requests.
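You can also catch this from the server side. The sketch below is a hypothetical guard for the Function App (the `find_unsubstituted` helper and the sample bodies are illustrative, not CohortAI's actual code): it flags any incoming field that still contains a literal `{{...}}` placeholder, which is exactly what JSON mode sends.

```python
import re

# Matches unsubstituted template placeholders like {{Topic.UserMessage}}
PLACEHOLDER = re.compile(r"\{\{\s*[\w.]+\s*\}\}")

def find_unsubstituted(body: dict) -> list:
    """Return the keys whose string values still contain raw placeholders."""
    return [k for k, v in body.items()
            if isinstance(v, str) and PLACEHOLDER.search(v)]

# What JSON mode actually delivers to the function:
bad = {"email": "{{System.User.PrincipalName}}",
       "message": "{{Topic.UserMessage}}"}
# What Formula mode delivers:
good = {"email": "student@uni.edu", "message": "help me with physics"}

print(find_unsubstituted(bad))   # ['email', 'message']
print(find_unsubstituted(good))  # []
```

Returning a 400 with the offending keys makes this failure mode obvious in minutes instead of hours.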
Gotcha #2: Reserved Words Kill Variables Silently
What happened: I created a variable called topic to store the course topic a student was asking about. It worked in testing but failed completely when students used it. The variable was always empty, even when students clearly specified a topic.
Why it's confusing: Power Fx has reserved words, but Copilot Studio doesn't warn you when you use them as variable names. The system accepts topic as a variable name in the interface. You can save it, test it, and deploy it. It fails silently in production because topic is a reserved word in the underlying Power Fx engine.
How to fix it: Wrap reserved words in single quotes: 'topic'. Or better, use different variable names entirely. I changed mine to course_topic and the problem disappeared.
The reserved words list isn't obvious. Common ones that will bite you:
- topic (the killer for education bots)
- user
- system
- action
- response
Test every variable name in a simple Topic before using it in complex workflows. If a variable mysteriously returns empty values, suspect a reserved word collision.
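A pre-flight lint over your variable names catches these collisions before deployment. This Python sketch checks names against the reserved words listed above only, not the full Power Fx reserved-word set; `check_names` is a hypothetical helper.

```python
# Reserved words from the list above; extend as you discover more.
RESERVED = {"topic", "user", "system", "action", "response"}

def check_names(names):
    """Return the names that risk a silent reserved-word collision."""
    return sorted(n for n in names if n.lower() in RESERVED)

print(check_names(["topic", "course_topic", "User", "quiz_score"]))
# ['User', 'topic']
```

Note the case-insensitive comparison: renaming `topic` to `Topic` would not save you.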
Gotcha #3: Old Actions Intercept Everything
What happened: I imported an OpenAPI spec into Copilot Studio to generate Actions automatically. Later, I decided to use Topics instead for better control. I created 10 Topics, each handling a different capability (tutoring, quizzes, assessments). None of them triggered. Students typed “help me with physics” and got blank responses.
Why it's confusing: When you import an OpenAPI spec, Copilot Studio creates Actions in the background. These Actions have their own trigger phrases and routing logic. When you later create Topics, the old Actions fire first and intercept requests before your Topics get a chance to respond.
There's no obvious visual indicator that this is happening. Your Topics look correctly configured. The test chat works. But in production, students trigger the wrong code path.
How to fix it: Delete all old Actions before creating Topics. Go to the Actions section in Copilot Studio, find anything that was auto-generated from your OpenAPI import, and delete it completely. Then test to confirm your Topics are responding.
This isn't documented as something to watch for. Most tutorials assume you're building from scratch, not migrating from an API-first approach.
Gotcha #4: Timeout Optimisation for Constraints That Don't Exist
What happened: My GPT-4o-powered tutoring function takes 8–15 seconds to generate responses with RAG (retrieval-augmented generation). I spent an hour optimising the function, caching results, reducing context length, and parallelising API calls to get under what I assumed was a 10-second timeout limit in Copilot Studio.
Why it's confusing: Every enterprise platform has timeouts. API gateways default to 10–30 seconds. Azure Functions have timeouts. Copilot Studio must have timeouts too, right? When my tutoring responses occasionally failed, I assumed timeout was the cause.
How to fix it: Check if the constraint is actually fixed before optimising for it. Copilot Studio timeouts are configurable per Topic. You can set them as high as 60 seconds. I could have just changed a setting instead of spending an hour micro-optimising code.
The real issue wasn't timeout — it was intermittent GPT-4o API throttling during high usage periods. A simple retry loop would have fixed it.
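That retry loop takes only a few lines. Here is a minimal Python sketch, assuming a generic callable and using `RuntimeError` as a stand-in for your client's throttling exception; swap in whatever error your SDK actually raises.

```python
import time

def call_with_retry(call_model, max_attempts=3, base_delay=1.0):
    """Call call_model(), retrying with exponential backoff on throttling."""
    for attempt in range(1, max_attempts + 1):
        try:
            return call_model()
        except RuntimeError:  # substitute your client's throttling error type
            if attempt == max_attempts:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))  # 1s, 2s, ...

# Demo: a call that fails twice, then succeeds on the third attempt.
attempts = {"n": 0}
def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RuntimeError("429: throttled")
    return "tutoring response"

print(call_with_retry(flaky, base_delay=0.01))  # tutoring response
```

Exponential backoff matters here: throttling usually clears within seconds, so immediate retries just burn attempts against the same rate limit.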
Don't optimise for constraints that might not exist. Check the platform defaults first.
Gotcha #5: Response Variable Naming Breaks Everything
What happened: I configured my HTTP Request nodes to save responses in variables named things like TutorResponse, QuizResponse, and AssessmentResponse. The system showed successful HTTP calls in the testing interface, but the Message nodes that displayed results to students were empty.
Why it's confusing: The variable naming in Copilot Studio follows a pattern that isn't immediately obvious. When you save an HTTP response, the response data gets stored in a structured object, not a flat variable. If you save the response as TutorResponse, the actual message content is in TutorResponse.response or TutorResponse.display_text — not in TutorResponse itself.
How to fix it: Use a consistent naming pattern and understand the response structure. I settled on saving all HTTP responses as Topic.http_response, then accessing the content via Topic.http_response.display_text in Message nodes.
The structure depends on what your Function App returns. If your function returns:
```json
{
  "response": "The answer is...",
  "display_text": "Formatted response with sources"
}
```

Then you access it via Topic.http_response.display_text, not Topic.http_response.
Test the response structure in a simple Topic first. Use the {x} button in Message nodes to insert variables correctly — don't type them as text.
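On the Function App side, returning that structure is simple to sketch. The field names below mirror the example above; `build_response` is a hypothetical helper, not CohortAI's actual code.

```python
import json

def build_response(answer, sources):
    """Return a JSON body with both a raw answer and a formatted display string."""
    body = {
        "response": answer,
        "display_text": answer + "\n\nSources: " + ", ".join(sources),
    }
    return json.dumps(body)

raw = build_response("The answer is 42.", ["Lecture 3", "Textbook ch. 2"])
parsed = json.loads(raw)
print(parsed["display_text"])
```

Keeping a stable top-level schema like this means every Message node can bind to Topic.http_response.display_text regardless of which function produced the response.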
The Pattern Behind the Gotchas
These aren't random bugs. They're symptoms of a deeper issue: Copilot Studio is a no-code tool built on top of code-level technologies (Power Fx, HTTP APIs, JSON schemas) without fully abstracting away the complexity.
You get the worst of both worlds — the limitations of a visual interface plus the debugging complexity of writing code. When things break, you need to understand Power Fx variable scope, HTTP request formatting, JSON structure, and API response handling. But you debug through a point-and-click interface that doesn't give you direct access to the underlying code.
The solution isn't to avoid Copilot Studio. It's a genuinely useful prototyping tool. The solution is to understand its limitations and plan your migration path upfront.
For CohortAI, we built v1 in Copilot Studio with 10 manually configured Topics. It worked for demos. For production, we migrated to Azure AI Foundry Agent Service, where one system prompt replaced all 10 Topics and intelligent routing replaced manual configuration.
The hard work — the Azure Functions, the data models, the content indexing — stayed the same. We just swapped out the orchestration layer.
If you're building on Copilot Studio, expect these gotchas. Budget time for them. And have a plan for what comes next when you outgrow the platform.
Want to Pilot This at Your University?
CohortAI is an AI-powered course assistant built on the Microsoft stack. If you're exploring AI tutoring, early warning systems, or curriculum analytics for your institution, let's talk.
Book an Advisory Conversation

Written by
Jason La Greca
Founder of Teachnology. Building AI that empowers humans, not replaces them.
Connect on LinkedIn

Is your organisation building capability or just buying it?
Take the free 12-minute Capability Assessment and find out where you stand. Get a personalised report with actionable recommendations.