Amazon Bedrock

Maven Dependency

<dependency>
<groupId>dev.langchain4j</groupId>
<artifactId>langchain4j-bedrock</artifactId>
<version>1.0.0-beta3</version>
</dependency>

AWS credentials

To use Amazon Bedrock models, you need to configure AWS credentials. One option is to set the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables. More information can be found here.
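
A minimal sketch showing that the AWS SDK's default credential chain picks these environment variables up automatically; DefaultCredentialsProvider comes from the AWS SDK v2 and the class name CredentialsCheck is illustrative, not part of LangChain4j:

import software.amazon.awssdk.auth.credentials.AwsCredentials;
import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;

public class CredentialsCheck {

    public static void main(String[] args) {
        // Resolves credentials from the environment variables (or any other
        // source in the default AWS credential chain) and prints the key id
        AwsCredentials credentials = DefaultCredentialsProvider.create().resolveCredentials();
        System.out.println("Resolved access key id: " + credentials.accessKeyId());
    }
}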

Difference Between InvokeAPI and ConverseAPI

Amazon Bedrock offers two primary model invocation API operations for inference:

  • Converse – Amazon recommends using the Converse API, as it provides a consistent API that works with all Amazon Bedrock models that support messages.
  • InvokeModel – Originally aimed at single calls to obtain a response to a single prompt.

ChatLanguageModel using ConverseAPI

Guardrails are not supported by the current implementation.

Supported models and their features can be found here.

Model ids can be found here.

Configuration

ChatLanguageModel model = BedrockChatModel.builder()
        .modelId("us.amazon.nova-lite-v1:0")
        .region(...)
        .maxRetries(...)
        .timeout(...)
        .logRequests(...)
        .logResponses(...)
        .listeners(...)
        .defaultRequestParameters(BedrockChatRequestParameters.builder()
                .topP(...)
                .temperature(...)
                .maxOutputTokens(...)
                .stopSequences(...)
                .toolSpecifications(...)
                .additionalModelRequestFields(...)
                .build())
        .build();
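
A minimal usage sketch, assuming the chat(String) convenience method exposed by ChatLanguageModel in recent LangChain4j releases:

// Sends a single user message and returns the model's text answer
String answer = model.chat("What is the capital of France?");
System.out.println(answer);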

The field additionalModelRequestFields is a Map<String, Object>. As explained here, it allows adding inference parameters for a specific model that are not covered by the common inferenceConfig. BedrockChatRequestParameters provides a convenience method to enable the Claude 3.7 thinking process by adding the corresponding inference parameters to additionalModelRequestFields.
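
For illustration, a hedged sketch of filling additionalModelRequestFields by hand; the reasoning_config keys below follow the Bedrock Converse API documentation for Claude extended thinking and are an assumption here, not something defined by LangChain4j:

import java.util.Map;

// Model-specific inference parameters that are not covered by the common
// inferenceConfig; the keys "reasoning_config", "type" and "budget_tokens"
// are taken from the Bedrock Converse API docs for Claude extended thinking
// and are assumptions for illustration purposes
Map<String, Object> additionalFields = Map.of(
        "reasoning_config", Map.of(
                "type", "enabled",
                "budget_tokens", 1024));

BedrockChatRequestParameters parameters = BedrockChatRequestParameters.builder()
        .temperature(1.0)
        .additionalModelRequestFields(additionalFields)
        .build();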

Maven Dependency

The synchronous BedrockChatModel is built on the AWS SDK synchronous runtime client, which needs an HTTP client implementation on the classpath, for example the Apache HTTP client:

<dependency>
<groupId>software.amazon.awssdk</groupId>
<artifactId>apache-client</artifactId>
<version>2.31.6</version>
</dependency>

Examples

StreamingChatLanguageModel using ConverseAPI

Guardrails are not supported by the current implementation.

Supported models and their features can be found here.

Model ids can be found here.

Configuration

StreamingChatLanguageModel model = BedrockStreamingChatModel.builder()
        .modelId("us.amazon.nova-lite-v1:0")
        .region(...)
        .maxRetries(...)
        .timeout(...)
        .logRequests(...)
        .logResponses(...)
        .listeners(...)
        .defaultRequestParameters(BedrockChatRequestParameters.builder()
                .topP(...)
                .temperature(...)
                .maxOutputTokens(...)
                .stopSequences(...)
                .toolSpecifications(...)
                .additionalModelRequestFields(...)
                .build())
        .build();
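
A minimal usage sketch, assuming the chat(String, StreamingChatResponseHandler) convenience method exposed by StreamingChatLanguageModel in recent LangChain4j releases:

model.chat("Tell me a joke about Java", new StreamingChatResponseHandler() {

    @Override
    public void onPartialResponse(String partialResponse) {
        // Called for each streamed chunk as it arrives
        System.out.print(partialResponse);
    }

    @Override
    public void onCompleteResponse(ChatResponse completeResponse) {
        // Called once the full response has been received
        System.out.println("\nFinal response: " + completeResponse.aiMessage().text());
    }

    @Override
    public void onError(Throwable error) {
        error.printStackTrace();
    }
});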

The field additionalModelRequestFields is a Map<String, Object>. As explained here, it allows adding inference parameters for a specific model that are not covered by the common inferenceConfig. BedrockChatRequestParameters provides a convenience method to enable the Claude 3.7 thinking process by adding the corresponding inference parameters to additionalModelRequestFields.

Maven Dependency

The streaming BedrockStreamingChatModel is built on the AWS SDK asynchronous runtime client, which needs an async HTTP client implementation on the classpath, for example the Netty NIO client:

<dependency>
<groupId>software.amazon.awssdk</groupId>
<artifactId>netty-nio-client</artifactId>
<version>2.31.6</version>
</dependency>

Examples

ChatLanguageModel using InvokeAPI

AI21 Models

  • BedrockAI21LabsChatModel (deprecated, please use BedrockChatModel)

Anthropic Models

  • BedrockAnthropicMessageChatModel: (deprecated, please use BedrockChatModel) supports new Messages API
  • BedrockAnthropicCompletionChatModel: (deprecated, please use BedrockChatModel) supports old Text Completions API
  • BedrockAnthropicStreamingChatModel

Example:

ChatLanguageModel model = BedrockAnthropicMessageChatModel.builder()
        .model("anthropic.claude-3-sonnet-20240229-v1:0")
        .build();

Cohere Models

  • BedrockCohereChatModel (deprecated, please use BedrockChatModel)

Meta Llama Models

  • BedrockLlamaChatModel (deprecated, please use BedrockChatModel)

Mistral Models

  • BedrockMistralAiChatModel (deprecated, please use BedrockChatModel)

Titan Models

  • BedrockTitanChatModel (deprecated, please use BedrockChatModel)
  • BedrockTitanEmbeddingModel

Examples