

Azure AI Inference client library for Java - version 1.0.0-beta.2


This package contains the Azure AI Inference client library.

Documentation

Various documentation is available to help you get started.

Getting started

Prerequisites

Adding the package to your product

<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-ai-inference</artifactId>
    <version>1.0.0-beta.2</version>
</dependency>

Authentication

To interact with the Azure AI Inference service, you'll need to create an instance of a client class, ChatCompletionsAsyncClient or ChatCompletionsClient, built with ChatCompletionsClientBuilder. To configure a client, provide a valid endpoint URI for an Azure AI model resource along with a key credential, token credential, or Azure Identity credential that's authorized to use that resource.

Create a Chat Completions client with key credential

Get the key credential for your Azure AI model resource from the Azure Portal.

ChatCompletionsClient client = new ChatCompletionsClientBuilder()
    .credential(new AzureKeyCredential("{key}"))
    .endpoint("{endpoint}")
    .buildClient();

or

ChatCompletionsAsyncClient client = new ChatCompletionsClientBuilder()
    .credential(new AzureKeyCredential("{key}"))
    .endpoint("{endpoint}")
    .buildAsyncClient();

Create a client with Azure Active Directory credential

The Azure SDK for Java supports the Azure Identity package, which makes it easy to get credentials from the Microsoft identity platform.

Authentication with AAD requires some initial setup:

  • Add the Azure Identity package
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-identity</artifactId>
    <version>1.13.3</version>
</dependency>

Authorization is easiest using DefaultAzureCredential, which finds the best credential to use in its running environment. For more information about using Azure Active Directory authorization with the Azure AI Inference service, please refer to the associated documentation.

TokenCredential defaultCredential = new DefaultAzureCredentialBuilder().build();
ChatCompletionsClient client = new ChatCompletionsClientBuilder()
    .credential(defaultCredential)
    .endpoint("{endpoint}")
    .buildClient();

Key concepts

Examples

The following sections provide several code snippets covering some of the most common Azure AI Inference service tasks, including chat completions, streaming chat completions, chat with images, text embeddings, function calls, and retrieving model information.

Chat completions

List<ChatRequestMessage> chatMessages = new ArrayList<>();
chatMessages.add(new ChatRequestSystemMessage("You are a helpful assistant. You will talk like a pirate."));
chatMessages.add(new ChatRequestUserMessage("Can you help me?"));
chatMessages.add(new ChatRequestAssistantMessage("Of course, me hearty! What can I do for ye?"));
chatMessages.add(new ChatRequestUserMessage("What's the best way to train a parrot?"));

ChatCompletions chatCompletions = client.complete(new ChatCompletionsOptions(chatMessages));

System.out.printf("Model ID=%s is created at %s.%n", chatCompletions.getId(), chatCompletions.getCreated());
for (ChatChoice choice : chatCompletions.getChoices()) {
    ChatResponseMessage message = choice.getMessage();
    System.out.printf("Index: %d, Chat Role: %s.%n", choice.getIndex(), message.getRole());
    System.out.println("Message:");
    System.out.println(message.getContent());
}

For a complete example, see the Chat Completions sample.

Streaming chat completions

List<ChatRequestMessage> chatMessages = new ArrayList<>();
chatMessages.add(new ChatRequestSystemMessage("You are a helpful assistant. You will talk like a pirate."));
chatMessages.add(new ChatRequestUserMessage("Can you help me?"));
chatMessages.add(new ChatRequestAssistantMessage("Of course, me hearty! What can I do for ye?"));
chatMessages.add(new ChatRequestUserMessage("What's the best way to train a parrot?"));

client.completeStream(new ChatCompletionsOptions(chatMessages))
    .forEach(chatCompletions -> {
        // Some streamed updates carry no choices; skip those.
        if (CoreUtils.isNullOrEmpty(chatCompletions.getChoices())) {
            return;
        }
        // Each update delivers a delta containing the next fragment of the role or content.
        StreamingChatResponseMessageUpdate delta = chatCompletions.getChoice().getDelta();
        if (delta.getRole() != null) {
            System.out.println("Role = " + delta.getRole());
        }
        if (delta.getContent() != null) {
            String content = delta.getContent();
            System.out.print(content);
        }
    });

To see how to compute token usage in streaming chat completions, see the Streaming Chat Completions sample.

Chat with image URL

List<ChatMessageContentItem> contentItems = new ArrayList<>();
contentItems.add(new ChatMessageTextContentItem("Describe the image."));
contentItems.add(new ChatMessageImageContentItem(
    new ChatMessageImageUrl("<URL>")));

List<ChatRequestMessage> chatMessages = new ArrayList<>();
chatMessages.add(new ChatRequestSystemMessage("You are a helpful assistant."));
chatMessages.add(ChatRequestUserMessage.fromContentItems(contentItems));

ChatCompletions completions = client.complete(new ChatCompletionsOptions(chatMessages));
System.out.printf("%s.%n", completions.getChoice().getMessage().getContent());

For a complete example, see the Image URL sample.

Chat with image file

Path testFilePath = Paths.get("<path-to-image-file>");
List<ChatMessageContentItem> contentItems = new ArrayList<>();
contentItems.add(new ChatMessageTextContentItem("Describe the image."));
contentItems.add(new ChatMessageImageContentItem(testFilePath, "<image-format>"));

List<ChatRequestMessage> chatMessages = new ArrayList<>();
chatMessages.add(new ChatRequestSystemMessage("You are a helpful assistant."));
chatMessages.add(ChatRequestUserMessage.fromContentItems(contentItems));

ChatCompletions completions = client.complete(new ChatCompletionsOptions(chatMessages));

System.out.printf("%s.%n", completions.getChoice().getMessage().getContent());

For a complete example, see the Image File sample.

Text embeddings

EmbeddingsClient client = new EmbeddingsClientBuilder()
    .endpoint("{endpoint}")
    .credential(new AzureKeyCredential("{key}"))
    .buildClient();

List<String> promptList = new ArrayList<>();
String prompt = "Tell me 3 jokes about trains";
promptList.add(prompt);

EmbeddingsResult embeddings = client.embed(promptList);

for (EmbeddingItem item : embeddings.getData()) {
    System.out.printf("Index: %d.%n", item.getIndex());
    for (Float embedding : item.getEmbeddingList()) {
        System.out.printf("%f;", embedding);
    }
}

For a complete example, see the Embedding sample.

Function calls

For a complete example, see the Function Calls sample.

Streaming function calls

For a complete example, see the Streaming Function Calls sample.

Get Model information

ModelInfo modelInfo = client.getModelInfo();

System.out.printf("modelName: %s, modelNameProvider: %s, modelType: %s%n",
    modelInfo.getModelName(), modelInfo.getModelProviderName(), modelInfo.getModelType().toString());

Service API versions

The client library targets the latest service API version by default. The service client builder accepts an optional service API version parameter to specify which API version to communicate with.

Select a service API version

You have the flexibility to explicitly select a supported service API version when initializing a service client via the service client builder. This ensures that the client can communicate with services using the specified API version.

When selecting an API version, it is important to verify that there are no breaking changes compared to the latest API version. If there are significant differences, API calls may fail due to incompatibility.

Always ensure that the chosen API version is fully supported and operational for your specific use case and that it aligns with the service's versioning policy.
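
As a sketch of pinning a specific service version on the client builder: the service-version enum name (ModelServiceVersion) and the version value shown below are assumptions and should be checked against the service version enum shipped with this package.

ChatCompletionsClient client = new ChatCompletionsClientBuilder()
    .credential(new AzureKeyCredential("{key}"))
    .endpoint("{endpoint}")
    // Assumed enum name and value; verify against the package's service version enum.
    .serviceVersion(ModelServiceVersion.V2024_05_01_PREVIEW)
    .buildClient();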

Troubleshooting

Enable client logging

You can set the AZURE_LOG_LEVEL environment variable to view logging statements made in the client library. For example, setting AZURE_LOG_LEVEL=2 would show all informational, warning, and error log messages. The log levels can be found here: log levels.
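
Logging can also be enabled per client through the builder, assuming it follows the standard azure-core HttpTrait pattern. A minimal sketch using HttpLogOptions and HttpLogDetailLevel from com.azure.core.http.policy:

ChatCompletionsClient client = new ChatCompletionsClientBuilder()
    .credential(new AzureKeyCredential("{key}"))
    .endpoint("{endpoint}")
    // BASIC logs request URLs and status codes; BODY_AND_HEADERS also logs payloads.
    .httpLogOptions(new HttpLogOptions().setLogLevel(HttpLogDetailLevel.BODY_AND_HEADERS))
    .buildClient();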

Default HTTP Client

All client libraries use the Netty HTTP client by default. Adding the library dependency shown above automatically configures the client library to use the Netty HTTP client. Configuring or changing the HTTP client is detailed in the HTTP clients wiki.
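
A custom HttpClient can also be supplied directly to the builder. The sketch below assumes the OkHttp-based implementation from the com.azure:azure-core-http-okhttp package is on the classpath:

// Requires the com.azure:azure-core-http-okhttp dependency.
HttpClient httpClient = new OkHttpAsyncHttpClientBuilder().build();

ChatCompletionsClient client = new ChatCompletionsClientBuilder()
    .credential(new AzureKeyCredential("{key}"))
    .endpoint("{endpoint}")
    .httpClient(httpClient)
    .buildClient();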

Default SSL library

All client libraries, by default, use the Tomcat-native Boring SSL library to enable native-level performance for SSL operations. The Boring SSL library is an uber jar containing native libraries for Linux / macOS / Windows, and provides better performance compared to the default SSL implementation within the JDK. For more information, including how to reduce the dependency size, refer to the performance tuning section of the wiki.

For more details, see the TROUBLESHOOTING guide.

Next steps

Contributing

For details on contributing to this repository, see the contributing guide.

  1. Fork it
  2. Create your feature branch (git checkout -b my-new-feature)
  3. Commit your changes (git commit -am 'Add some feature')
  4. Push to the branch (git push origin my-new-feature)
  5. Create a new Pull Request
