Supported programming languages for models in Azure AI Model Inference

Models deployed in Azure AI Model Inference can be used with different SDKs and programming languages. This article describes which one to use.

All models

All models deployed to Azure AI model inference support the Azure AI model inference API and its associated family of SDKs.

To use these SDKs, connect them to the Azure AI model inference URI (usually in the form https://<resource-name>.services.ai.azure.com/models).
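
For example, a minimal sketch in Python with the azure-ai-inference package (described in the next section), assuming key-based authentication; the resource name, API key, and deployment name are placeholders:

```python
# Minimal sketch: point the Azure AI Inference SDK at the Azure AI model
# inference endpoint. <resource-name>, <api-key>, and <deployment-name> are
# placeholders for your own resource and deployment.
from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import UserMessage
from azure.core.credentials import AzureKeyCredential

client = ChatCompletionsClient(
    endpoint="https://<resource-name>.services.ai.azure.com/models",
    credential=AzureKeyCredential("<api-key>"),
)

response = client.complete(
    model="<deployment-name>",  # name of a model deployment in the resource
    messages=[UserMessage(content="What is the Azure AI model inference API?")],
)
print(response.choices[0].message.content)
```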

Azure AI Inference package

The Azure AI Inference package lets you consume all models deployed to the Azure AI model inference service and easily switch between them. The Azure AI Inference package is part of the Azure AI Foundry SDK.

| Language | Documentation | Package | Examples |
| --- | --- | --- | --- |
| C# | Reference | azure-ai-inference (NuGet) | C# examples |
| Java | Reference | azure-ai-inference (Maven) | Java examples |
| JavaScript | Reference | @azure/ai-inference (npm) | JavaScript examples |
| Python | Reference | azure-ai-inference (PyPi) | Python examples |
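
Because all deployments in the resource sit behind the same endpoint, a single client can target different models by changing only the model parameter. A minimal sketch, with placeholder resource, key, and deployment names:

```python
# Sketch: one client, several models. Only the `model` parameter changes per
# call; the endpoint, key, and deployment names are placeholders.
from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import UserMessage
from azure.core.credentials import AzureKeyCredential

client = ChatCompletionsClient(
    endpoint="https://<resource-name>.services.ai.azure.com/models",
    credential=AzureKeyCredential("<api-key>"),
)

for deployment in ("<deployment-a>", "<deployment-b>"):
    response = client.complete(
        model=deployment,
        messages=[UserMessage(content="Say hello in one word.")],
    )
    print(deployment, "->", response.choices[0].message.content)
```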

Azure AI Projects package

The Azure AI Projects package allows customers to access a comprehensive set of capabilities from an Azure AI project. Those capabilities include Azure AI model inference, but also advanced features such as tracing, evaluation, and data storage. The Azure AI Projects package is part of the Azure AI Foundry SDK and uses the Azure AI Inference package and the Azure OpenAI package to perform inference, depending on the user's needs.

| Language | Documentation | Package | Examples |
| --- | --- | --- | --- |
| C# | Reference | Azure.AI.Projects (NuGet) | C# examples |
| Python | Reference | azure-ai-projects (PyPi) | Python examples |
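
A hedged sketch of that flow in Python, assuming a preview release of azure-ai-projects in which AIProjectClient is created from a project connection string and exposes an inference client; the connection string and deployment name are placeholders, and the exact surface may vary between package versions:

```python
# Sketch (assumes a preview azure-ai-projects release): open a project and ask
# it for an inference client. The connection string and deployment name are
# placeholders; the exact client surface may differ between package versions.
from azure.ai.projects import AIProjectClient
from azure.identity import DefaultAzureCredential

project = AIProjectClient.from_connection_string(
    conn_str="<project-connection-string>",
    credential=DefaultAzureCredential(),
)

# The returned client is an azure-ai-inference ChatCompletionsClient already
# wired to the project's Azure AI model inference endpoint.
chat = project.inference.get_chat_completions_client()
response = chat.complete(
    model="<deployment-name>",
    messages=[{"role": "user", "content": "Hello from an Azure AI project."}],
)
print(response.choices[0].message.content)
```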

Integrations

| Framework | Language | Documentation | Package | Examples |
| --- | --- | --- | --- | --- |
| LangChain | Python | Reference | langchain-azure-ai (PyPi) | Python examples |
| Llama-Index | Python | Reference | llama-index-llms-azure-inference (PyPi), llama-index-embeddings-azure-inference (PyPi) | Python examples |
| Semantic Kernel | Python | Reference | semantic-kernel[azure] (PyPi) | Python examples |
| AutoGen | Python | Reference | autogen-ext[azure] (PyPi) | Quickstart |
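
As an illustration of one of these integrations, a hedged sketch with LangChain, assuming langchain-azure-ai exposes AzureAIChatCompletionsModel with endpoint, credential, and model_name parameters; all values are placeholders:

```python
# Sketch of the LangChain integration (assumes AzureAIChatCompletionsModel from
# langchain-azure-ai). Endpoint, key, and model name are placeholders.
from langchain_azure_ai.chat_models import AzureAIChatCompletionsModel

llm = AzureAIChatCompletionsModel(
    endpoint="https://<resource-name>.services.ai.azure.com/models",
    credential="<api-key>",
    model_name="<deployment-name>",
)
print(llm.invoke("What is the Azure AI model inference API?").content)
```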

Azure OpenAI models

Azure OpenAI models can be consumed using the following SDKs and programming languages.

To use these SDKs, connect them to the Azure OpenAI service URI (usually in the form https://<resource-name>.openai.azure.com).

| Language | Source code | Package | Examples |
| --- | --- | --- | --- |
| C# | Source code | Azure.AI.OpenAI (NuGet) | C# examples |
| Go | Source code | azopenai (Go) | Go examples |
| Java | Source code | azure-ai-openai (Maven) | Java examples |
| JavaScript | Source code | @azure/openai (npm) | JavaScript examples |
| Python | Source code | openai (PyPi) | Python examples |
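
For example, a minimal sketch in Python with the openai package; the resource name, API key, API version, and deployment name are placeholders:

```python
# Sketch: point the OpenAI Python SDK at an Azure OpenAI endpoint. Resource
# name, key, and deployment name are placeholders; use an API version that
# your resource supports.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<resource-name>.openai.azure.com",
    api_key="<api-key>",
    api_version="2024-10-21",
)

response = client.chat.completions.create(
    model="<deployment-name>",  # the Azure OpenAI deployment name
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```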

Integrations

| Framework | Language | Documentation | Package | Examples |
| --- | --- | --- | --- | --- |
| LangChain | Python | Reference | langchain-openai (PyPi) | Python examples |
| Llama-Index | Python | Reference | llama-index-llms-openai (PyPi), llama-index-embeddings-openai (PyPi) | Python examples |
| AutoGen | Python | Reference | autogen-ext[openai] (PyPi) | Quickstart |
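
As an illustration, a hedged sketch of the LangChain integration for Azure OpenAI, assuming langchain-openai exposes AzureChatOpenAI with azure_endpoint, api_version, and azure_deployment parameters; all values are placeholders:

```python
# Sketch of the LangChain integration for Azure OpenAI (AzureChatOpenAI from
# langchain-openai). Endpoint, key, API version, and deployment are placeholders.
from langchain_openai import AzureChatOpenAI

llm = AzureChatOpenAI(
    azure_endpoint="https://<resource-name>.openai.azure.com",
    api_key="<api-key>",
    api_version="2024-10-21",
    azure_deployment="<deployment-name>",
)
print(llm.invoke("Hello!").content)
```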

Limitations

Warning

The Cohere SDK and the Mistral SDK aren't supported in Azure AI Model Inference in Azure AI Foundry.

Next steps

  • To see what models are currently supported, check out the Models section