Issue with Copilot Agent Behavior - Need Feedback

Swami Nawale 20 Reputation points
2025-03-06T09:58:31.6666667+00:00

I’m building an autonomous agent for a helpdesk system, and I'm running into a bit of an issue with how it’s responding.

When I say "I'm facing 1 issue with my laptop", the agent, instead of asking for more details or clarifying the problem, immediately asks if I want to create a helpdesk ticket. I was expecting the agent to ask follow-up questions like "What issue are you facing?" to better understand the problem before jumping to ticket creation (the autonomous action).
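To make the expected behavior concrete, here is a small Python sketch of the dialog logic I had in mind. This is not Copilot Studio code and the phrase list is just a hypothetical heuristic for illustration: clarify first when the message is vague, and only offer a ticket once details are known.

```python
# Hypothetical sketch of the follow-up behavior I expected the agent to show.
# Not Copilot Studio code; just the desired logic expressed in Python.

# Assumed markers of a vague problem report (illustrative only).
VAGUE_PHRASES = ("an issue", "1 issue", "a problem", "something wrong")

def respond(user_message: str) -> str:
    """Ask a clarifying question when the issue is vague;
    only offer a ticket once the problem is described."""
    text = user_message.lower()
    if any(phrase in text for phrase in VAGUE_PHRASES):
        return "What issue are you facing with your laptop?"
    return "Would you like me to create a helpdesk ticket for this?"

print(respond("I'm facing 1 issue with my laptop"))
# asks a clarifying question instead of jumping to ticket creation
```

In my case the agent skips the clarifying branch entirely and goes straight to the ticket offer.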

I've enabled generative AI for better context understanding, but the agent isn't adapting or asking the right follow-up questions, even though I specified in the instructions that it should ask follow-up questions in cases like this. It seems like the agent isn't learning from the feedback or adjusting its responses as expected.

Has anyone else worked on building or improving similar autonomous agents? Is this how Copilot agents are typically designed to behave, with no cross-questioning or follow-up?

I found a related discussion, but it has no response:

https://learn.microsoft.com/en-us/answers/questions/2147374/how-do-copilot-agents-learn-and-respond-to-feedbac

Microsoft Copilot
Microsoft terminology for a universal copilot interface.
