Hi guys, we created a RAG chatbot in Teams. We used the Teams template and plugged in our custom vector database as the data source. Our problem is that we don't want the bot to generate a general answer from the LLM when the data source returns no chunks. How can we control that?
Hello,
I suggest modifying the query logic. In the function that processes the user input, check whether the data source returned any chunks before invoking the LLM. If the retrieval step comes back empty, return a fixed fallback message (or skip the model call entirely) instead of letting the LLM produce a general answer.
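Here is a minimal TypeScript sketch of that guard, assuming one helper that queries your vector database and one that calls the model. The names (queryVectorStore, generateLlmAnswer, Chunk) are placeholders, not part of the Teams template; wire them up to whatever your project actually uses.

```typescript
// Hypothetical retrieved-chunk shape -- adapt to your vector database's results.
interface Chunk {
  text: string;
  score: number;
}

// Placeholder: query your custom vector database (assumption, not a real API).
async function queryVectorStore(query: string): Promise<Chunk[]> {
  // ... call your vector database here and map the hits to Chunk objects ...
  return [];
}

// Placeholder: invoke the LLM with the retrieved chunks as grounding context.
async function generateLlmAnswer(query: string, context: Chunk[]): Promise<string> {
  // ... build the prompt from `context` and call your model here ...
  return "";
}

const NO_RESULTS_REPLY =
  "Sorry, I couldn't find anything about that in our knowledge base.";

export async function answerUserMessage(userInput: string): Promise<string> {
  const chunks = await queryVectorStore(userInput);

  // Guard clause: if retrieval returns no chunks, skip the LLM entirely so the
  // bot never falls back to a general-knowledge answer.
  if (chunks.length === 0) {
    return NO_RESULTS_REPLY;
  }

  // Only reach the model when grounded context exists.
  return generateLlmAnswer(userInput, chunks);
}
```

If your vector database tends to return low-relevance hits rather than an empty list, the same guard can be extended with a minimum similarity score threshold before the LLM call.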
BR,