Azure OpenAI
If you are using Quarkus, please refer to the Quarkus LangChain4j documentation.
Azure OpenAI provides a few language models (gpt-35-turbo, gpt-4, gpt-4o, etc.) that can be used for various natural language processing tasks.
Azure OpenAI Documentation
Maven Dependency
Plain Java
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j-azure-open-ai</artifactId>
<version>0.36.2</version>
</dependency>
Spring Boot
<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j-azure-open-ai-spring-boot-starter</artifactId>
<version>0.36.2</version>
</dependency>
Before using any of the Azure OpenAI models, you need to deploy them.
Creating AzureOpenAiChatModel with an API Key
Plain Java
ChatLanguageModel model = AzureOpenAiChatModel.builder()
.apiKey(System.getenv("AZURE_OPENAI_KEY"))
.deploymentName("gpt-4o")
.endpoint("https://langchain4j.openai.azure.com/")
...
.build();
This will create an instance of AzureOpenAiChatModel with default model parameters (e.g. a temperature of 0.7) and an API key stored in the AZURE_OPENAI_KEY environment variable.
Default model parameters can be customized by providing values in the builder.
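For example, a lower temperature and a completion-size limit can be set directly on the builder. A minimal sketch (the values shown are illustrative; the builder methods mirror the Spring Boot properties listed in the next section):

ChatLanguageModel model = AzureOpenAiChatModel.builder()
        .apiKey(System.getenv("AZURE_OPENAI_KEY"))
        .deploymentName("gpt-4o")
        .endpoint("https://langchain4j.openai.azure.com/")
        .temperature(0.3)               // more deterministic answers than the default 0.7
        .maxTokens(500)                 // cap the size of the completion
        .logRequestsAndResponses(true)  // helpful while debugging
        .build();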
Spring Boot
Add to the application.properties:
langchain4j.azure-open-ai.chat-model.endpoint=https://langchain4j.openai.azure.com/
langchain4j.azure-open-ai.chat-model.service-version=...
langchain4j.azure-open-ai.chat-model.api-key=${AZURE_OPENAI_KEY}
langchain4j.azure-open-ai.chat-model.non-azure-api-key=${OPENAI_API_KEY}
langchain4j.azure-open-ai.chat-model.deployment-name=gpt-4o
langchain4j.azure-open-ai.chat-model.max-tokens=...
langchain4j.azure-open-ai.chat-model.temperature=...
langchain4j.azure-open-ai.chat-model.top-p=...
langchain4j.azure-open-ai.chat-model.logit-bias=...
langchain4j.azure-open-ai.chat-model.user=...
langchain4j.azure-open-ai.chat-model.stop=...
langchain4j.azure-open-ai.chat-model.presence-penalty=...
langchain4j.azure-open-ai.chat-model.frequency-penalty=...
langchain4j.azure-open-ai.chat-model.seed=...
langchain4j.azure-open-ai.chat-model.timeout=...
langchain4j.azure-open-ai.chat-model.max-retries=...
langchain4j.azure-open-ai.chat-model.log-requests-and-responses=...
langchain4j.azure-open-ai.chat-model.user-agent-suffix=...
langchain4j.azure-open-ai.chat-model.custom-headers=...
See here for a description of some of the parameters above.
This configuration will create an AzureOpenAiChatModel bean (with default model parameters), which can either be used by an AI Service or autowired where needed, for example:
import dev.langchain4j.model.chat.ChatLanguageModel;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
class ChatLanguageModelController {

    private final ChatLanguageModel chatLanguageModel;

    ChatLanguageModelController(ChatLanguageModel chatLanguageModel) {
        this.chatLanguageModel = chatLanguageModel;
    }

    @GetMapping("/model")
    public String model(@RequestParam(value = "message", defaultValue = "Hello") String message) {
        return chatLanguageModel.generate(message);
    }
}
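Alternatively, the same bean can back an AI Service. A minimal sketch, assuming the core langchain4j module (which provides AiServices) is also on the classpath:

interface Assistant {

    String chat(String userMessage);
}

Assistant assistant = AiServices.create(Assistant.class, chatLanguageModel);
String answer = assistant.chat("Hello");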
Creating AzureOpenAiChatModel with Azure Credentials
An API key comes with a few security issues (it can be committed to source control, passed around, etc.).
If you want to improve security, it is recommended to use Azure Credentials instead.
For that, add the azure-identity dependency to the project.
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-identity</artifactId>
<scope>compile</scope>
</dependency>
Then, you can create an AzureOpenAiChatModel using the DefaultAzureCredentialBuilder API:
ChatLanguageModel model = AzureOpenAiChatModel.builder()
.deploymentName("gpt-4o")
.endpoint("https://langchain4j.openai.azure.com/")
.tokenCredential(new DefaultAzureCredentialBuilder().build())
.build();
Notice that you need to deploy your model using Managed Identities. Check the Azure CLI deployment script for more information.
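Once built, the model is used in exactly the same way as the API-key variant, for example:

String answer = model.generate("Hello");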
Creating AzureOpenAiStreamingChatModel
Plain Java
StreamingChatLanguageModel model = AzureOpenAiStreamingChatModel.builder()
.apiKey(System.getenv("AZURE_OPENAI_KEY"))
.deploymentName("gpt-4o")
.endpoint("https://langchain4j.openai.azure.com/")
...
.build();
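The streaming model pushes the response to a handler token by token instead of returning it in one piece. A minimal sketch using StreamingResponseHandler from the core API:

model.generate("Tell me a joke", new StreamingResponseHandler<AiMessage>() {

    @Override
    public void onNext(String token) {
        System.out.print(token); // invoked for every partial token as it arrives
    }

    @Override
    public void onComplete(Response<AiMessage> response) {
        System.out.println(); // invoked once the complete response has been received
    }

    @Override
    public void onError(Throwable error) {
        error.printStackTrace();
    }
});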
Spring Boot
Add to the application.properties:
langchain4j.azure-open-ai.streaming-chat-model.endpoint=https://langchain4j.openai.azure.com/
langchain4j.azure-open-ai.streaming-chat-model.service-version=...
langchain4j.azure-open-ai.streaming-chat-model.api-key=${AZURE_OPENAI_KEY}
langchain4j.azure-open-ai.streaming-chat-model.deployment-name=gpt-4o
langchain4j.azure-open-ai.streaming-chat-model.max-tokens=...
langchain4j.azure-open-ai.streaming-chat-model.temperature=...
langchain4j.azure-open-ai.streaming-chat-model.top-p=...
langchain4j.azure-open-ai.streaming-chat-model.logit-bias=...
langchain4j.azure-open-ai.streaming-chat-model.user=...
langchain4j.azure-open-ai.streaming-chat-model.stop=...
langchain4j.azure-open-ai.streaming-chat-model.presence-penalty=...
langchain4j.azure-open-ai.streaming-chat-model.frequency-penalty=...
langchain4j.azure-open-ai.streaming-chat-model.seed=...
langchain4j.azure-open-ai.streaming-chat-model.timeout=...
langchain4j.azure-open-ai.streaming-chat-model.max-retries=...
langchain4j.azure-open-ai.streaming-chat-model.log-requests-and-responses=...
langchain4j.azure-open-ai.streaming-chat-model.user-agent-suffix=...
langchain4j.azure-open-ai.streaming-chat-model.custom-headers=...
Creating AzureOpenAiTokenizer
Plain Java
Tokenizer tokenizer = new AzureOpenAiTokenizer();
// or
Tokenizer tokenizer = new AzureOpenAiTokenizer("gpt-4o");
Spring Boot
The AzureOpenAiTokenizer bean is created automatically by the Spring Boot starter.
APIs
AzureOpenAiChatModel
AzureOpenAiStreamingChatModel
DefaultAzureCredentialBuilder
AzureOpenAiTokenizer