Hello Choi, Jaehyeok,
Welcome to the Microsoft Q&A and thank you for posting your questions here.
I understand that you would like to fix the error that occurred when calling Azure OpenAI: "Server responded with status 400", with the error message as stated in your post.
This is a model compatibility issue in Azure AI Foundry: the error message indicates that the model name "chatgpt" is not recognized, which is likely what causes the failure when you try to add your data index.
You will need to verify model availability and check your model configuration, especially the data configuration, to confirm the exact deployment name and mode of your Llama model. If everything looks correct, or if you are using a custom model or a specific version, make sure that every reference to the model is updated and consistent across your configuration, because the data index setup appears to be referencing an invalid model name (chatgpt).
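For instance, a quick way to confirm the deployment name is to make a minimal test call against it. The sketch below is only an illustration using the Python openai SDK, assuming you reach the deployment through an Azure OpenAI-compatible chat completions endpoint; the endpoint, API key, and deployment name are placeholders you would replace with your own values.

```python
# Minimal sketch, assuming an Azure OpenAI-compatible endpoint.
# All endpoint, key, and deployment values are placeholders.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com/",  # placeholder
    api_key="<your-api-key>",                                    # placeholder
    api_version="2024-06-01",
)

response = client.chat.completions.create(
    # Must be the exact deployment name shown in the portal,
    # not the model family and not "chatgpt".
    model="<your-llama-deployment-name>",
    messages=[{"role": "user", "content": "Test call to confirm the deployment name."}],
)
print(response.choices[0].message.content)
```

If this call succeeds but the "Add your data" flow still fails, the mismatch is most likely in the data index configuration rather than the deployment itself.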
By explicitly specifying the correct Llama deployment name and ensuring compatibility with the search type, you bypass the incorrect default reference. If keyword search is unsupported for Llama, switch to vector search. Also, refer to the Azure AI Foundry documentation for any specific requirements or limitations related to the Llama-3.3-70B-Instruct-2 model and data indexing: https://learn.microsoft.com/en-us/azure/ai-studio/how-to/deploy-models-llama
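If you pass your Azure AI Search index through the request itself, the sketch below shows where the deployment name and the search type are set explicitly. It reuses the client from the earlier sketch and follows the Azure OpenAI "On Your Data" data_sources payload; the search endpoint, key, index name, and embedding deployment name are placeholders, and you should confirm in the documentation above which query types your Llama deployment supports.

```python
# Hedged sketch: "On Your Data" request with an explicit deployment name
# and an explicit search type. All names and keys are placeholders.
response = client.chat.completions.create(
    model="<your-llama-deployment-name>",  # exact deployment name, not "chatgpt"
    messages=[{"role": "user", "content": "A question grounded on my index."}],
    extra_body={
        "data_sources": [
            {
                "type": "azure_search",
                "parameters": {
                    "endpoint": "https://<your-search>.search.windows.net",  # placeholder
                    "index_name": "<your-index-name>",                       # placeholder
                    "authentication": {"type": "api_key", "key": "<search-key>"},
                    # If keyword ("simple") search is rejected for this model,
                    # vector search is the usual alternative; it requires an
                    # embedding deployment to vectorize the query.
                    "query_type": "vector",
                    "embedding_dependency": {
                        "type": "deployment_name",
                        "deployment_name": "<your-embedding-deployment>",    # placeholder
                    },
                },
            }
        ]
    },
)
print(response.choices[0].message.content)
```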
I hope this is helpful! Do not hesitate to let me know if you have any other questions.
Please don't forget to close the thread here by upvoting and accepting this as an answer if it is helpful.