Greetings & Welcome to the Microsoft Q&A forum! Thank you for sharing your query.
I understand that you're encountering issues deploying endpoints for your custom Hugging Face model and the preset Azure Mistral model in Azure ML.
When deploying your custom Hugging Face model, the container crash could stem from several causes. First, make sure all required libraries, including azureml-inference-server-http, are included in your environment. If the logs indicate that this library is missing, double-check your environment configuration.
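As a minimal sketch of what that environment definition could look like with the SDK v2 (the base image tag and conda file path are assumptions, so adjust them to your setup):

```python
from azure.ai.ml.entities import Environment

# environment/conda.yaml is assumed to contain at least:
#   dependencies:
#     - python=3.10
#     - pip:
#         - azureml-inference-server-http   # needed by the managed online endpoint server
#         - transformers
#         - torch
hf_env = Environment(
    name="hf-inference-env",
    image="mcr.microsoft.com/azureml/openmpi4.1.0-ubuntu20.04",
    conda_file="environment/conda.yaml",
)
```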
For loading your model from AZUREML_MODEL_DIR, make sure your scoring script reads this environment variable and that the model is registered in the same workspace where you are deploying the endpoint. You can verify this by checking the model registration and confirming that the correct version is referenced in your deployment.
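Here is a minimal score.py sketch, assuming the model was registered as a Hugging Face folder named "model" (the folder name, model class, and request schema are assumptions you would adapt to your model):

```python
import os
import json

from transformers import AutoModelForCausalLM, AutoTokenizer

model = None
tokenizer = None

def init():
    global model, tokenizer
    # AZUREML_MODEL_DIR points at the root of the registered model artifacts;
    # "model" is the assumed subfolder name used at registration time.
    model_dir = os.path.join(os.environ["AZUREML_MODEL_DIR"], "model")
    tokenizer = AutoTokenizer.from_pretrained(model_dir)
    model = AutoModelForCausalLM.from_pretrained(model_dir)

def run(raw_data):
    data = json.loads(raw_data)
    inputs = tokenizer(data["prompt"], return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=data.get("max_new_tokens", 64))
    return {"completion": tokenizer.decode(outputs[0], skip_special_tokens=True)}
```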
Regarding the Azure Mistral 3B Finetuned model, since it is a preset asset and not a fully downloadable artifact, you may face mounting/loading issues. In this case, ensure that you are following the correct procedures for accessing preset models in Azure ML, as they may require specific configurations or permissions that differ from custom models.
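If the preset model is exposed as a registry (model catalog) asset, one option is to reference the registry asset ID directly in the deployment instead of a workspace model. A hedged SDK v2 sketch is below; the registry path and instance SKU are hypothetical placeholders, so copy the exact asset ID from the model card in the Azure ML model catalog:

```python
from azure.ai.ml import MLClient
from azure.ai.ml.entities import ManagedOnlineEndpoint, ManagedOnlineDeployment
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace>",
)

# Create the endpoint first, then attach a deployment that points at the catalog asset.
endpoint = ManagedOnlineEndpoint(name="mistral-finetuned-ep")
ml_client.begin_create_or_update(endpoint).result()

deployment = ManagedOnlineDeployment(
    name="blue",
    endpoint_name=endpoint.name,
    # Hypothetical registry asset ID for a preset/catalog model:
    model="azureml://registries/<registry-name>/models/<model-name>/versions/<version>",
    instance_type="Standard_NC24ads_A100_v4",
    instance_count=1,
)
ml_client.begin_create_or_update(deployment).result()
```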
Please refer to these resources:
- https://github.com/agademic/custom-models-on-azure/blob/master/deploy_model_sdk_v2.ipynb
- Debug scoring scripts with Azure Machine Learning inference HTTP server
I hope this helps you. Thank you!