Package dev.langchain4j.model.anthropic
Class AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder
java.lang.Object
dev.langchain4j.model.anthropic.AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder
- Enclosing class:
AnthropicStreamingChatModel
-
Constructor Summary
Constructors
AnthropicStreamingChatModelBuilder()
-
Method Summary
All methods return AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder (this builder), except build(), which returns the configured AnthropicStreamingChatModel.
build()
cacheSystemMessages(Boolean cacheSystemMessages)
cacheTools(Boolean cacheTools)
disableParallelToolUse(Boolean disableParallelToolUse)
httpClientBuilder(HttpClientBuilder httpClientBuilder)
listeners(List<ChatModelListener> listeners)
logger(org.slf4j.Logger logger)
logRequests(Boolean logRequests)
logResponses(Boolean logResponses)
modelName(AnthropicChatModelName modelName)
returnThinking(Boolean returnThinking)
    Controls whether to return thinking/reasoning text (if available) inside AiMessage.thinking() and whether to invoke the StreamingChatResponseHandler.onPartialThinking(PartialThinking) callback.
sendThinking(Boolean sendThinking)
    Controls whether to send thinking/reasoning text to the LLM in follow-up requests.
stopSequences(List<String> stopSequences)
temperature(Double temperature)
thinkingBudgetTokens(Integer thinkingBudgetTokens)
    Configures thinking.
thinkingType(String thinkingType)
    Enables thinking.
toolChoice(ToolChoice toolChoice)
toolChoiceName(String toolChoiceName)
toolSpecifications(ToolSpecification... toolSpecifications)
toolSpecifications(List<ToolSpecification> toolSpecifications)
userId(String userId)
    Sets the user ID for the requests.
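A minimal usage sketch, assuming the API key is available in the ANTHROPIC_API_KEY environment variable; the model name string is a placeholder, and any Claude model identifier accepted by the Anthropic API can be used instead:

import dev.langchain4j.model.anthropic.AnthropicStreamingChatModel;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;

AnthropicStreamingChatModel model = AnthropicStreamingChatModel.builder()
        .apiKey(System.getenv("ANTHROPIC_API_KEY"))
        .modelName("claude-3-5-sonnet-20241022") // placeholder model name
        .temperature(0.7)
        .maxTokens(1024)
        .build();

// Streamed tokens arrive through the handler callbacks.
model.chat("Tell me a joke about Java", new StreamingChatResponseHandler() {

    @Override
    public void onPartialResponse(String partialResponse) {
        System.out.print(partialResponse); // print tokens as they arrive
    }

    @Override
    public void onCompleteResponse(ChatResponse completeResponse) {
        System.out.println("\n\nFinal message: " + completeResponse.aiMessage().text());
    }

    @Override
    public void onError(Throwable error) {
        error.printStackTrace();
    }
});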
-
Constructor Details
-
AnthropicStreamingChatModelBuilder
public AnthropicStreamingChatModelBuilder()
-
-
Method Details
-
httpClientBuilder
public AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder httpClientBuilder(HttpClientBuilder httpClientBuilder) -
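A sketch of supplying a custom HTTP client, assuming the JDK-based implementation from the langchain4j-http-client-jdk module; JdkHttpClient.builder() and the connectTimeout/readTimeout options are assumptions of this example, and any HttpClientBuilder implementation can be passed instead:

import dev.langchain4j.http.client.jdk.JdkHttpClient;
import java.time.Duration;

AnthropicStreamingChatModel model = AnthropicStreamingChatModel.builder()
        .httpClientBuilder(JdkHttpClient.builder()      // any HttpClientBuilder implementation works here
                .connectTimeout(Duration.ofSeconds(15))
                .readTimeout(Duration.ofSeconds(60)))
        .apiKey(System.getenv("ANTHROPIC_API_KEY"))
        .modelName("claude-3-5-sonnet-20241022")
        .build();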
baseUrl
public AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder baseUrl(String baseUrl) -
apiKey
public AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder apiKey(String apiKey) -
version
public AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder version(String version) -
beta
public AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder beta(String beta) -
modelName
public AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder modelName(String modelName) -
modelName
public AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder modelName(AnthropicChatModelName modelName) -
temperature
public AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder temperature(Double temperature) -
topP
public AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder topP(Double topP) -
topK
public AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder topK(Integer topK) -
maxTokens
public AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder maxTokens(Integer maxTokens) -
stopSequences
public AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder stopSequences(List<String> stopSequences) -
toolSpecifications
public AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder toolSpecifications(List<ToolSpecification> toolSpecifications) -
toolSpecifications
public AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder toolSpecifications(ToolSpecification... toolSpecifications) -
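A sketch of registering a tool with either overload; the get_weather tool, its JsonObjectSchema parameter schema, and the schema builder calls shown here are illustrative assumptions:

import dev.langchain4j.agent.tool.ToolSpecification;
import dev.langchain4j.model.chat.request.json.JsonObjectSchema;

ToolSpecification getWeather = ToolSpecification.builder()
        .name("get_weather")
        .description("Returns the current weather for a city")
        .parameters(JsonObjectSchema.builder()
                .addStringProperty("city", "The city to look up")
                .required("city")
                .build())
        .build();

AnthropicStreamingChatModel model = AnthropicStreamingChatModel.builder()
        .apiKey(System.getenv("ANTHROPIC_API_KEY"))
        .modelName("claude-3-5-sonnet-20241022")
        .toolSpecifications(getWeather) // varargs overload; a List<ToolSpecification> overload also exists
        .build();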
cacheSystemMessages
public AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder cacheSystemMessages(Boolean cacheSystemMessages) -
cacheTools
public AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder cacheTools(Boolean cacheTools) -
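A sketch of enabling Anthropic prompt caching for system messages and tool definitions; whether a beta header is also required depends on the Anthropic API version in use, so that line is left commented out:

AnthropicStreamingChatModel model = AnthropicStreamingChatModel.builder()
        .apiKey(System.getenv("ANTHROPIC_API_KEY"))
        .modelName("claude-3-5-sonnet-20241022")
        // .beta("prompt-caching-2024-07-31") // possibly required on older API versions
        .cacheSystemMessages(true)            // mark system messages as cacheable
        .cacheTools(true)                     // mark tool definitions as cacheable
        .build();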
thinkingType
public AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder thinkingType(String thinkingType) Enables thinking. -
thinkingBudgetTokens
public AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder thinkingBudgetTokens(Integer thinkingBudgetTokens) Configures thinking. -
returnThinking
public AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder returnThinking(Boolean returnThinking)
Controls whether to return thinking/reasoning text (if available) inside AiMessage.thinking() and whether to invoke the StreamingChatResponseHandler.onPartialThinking(PartialThinking) callback. Please note that this does not enable thinking/reasoning for the LLM; it only controls whether to parse the thinking field from the API response and return it inside the AiMessage.
Disabled by default. If enabled, the thinking text will be stored within the AiMessage and may be persisted. If enabled, thinking signatures will also be stored and returned inside AiMessage.attributes().
-
sendThinking
public AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder sendThinking(Boolean sendThinking)
Controls whether to send thinking/reasoning text to the LLM in follow-up requests.
Enabled by default. If enabled, the contents of AiMessage.thinking() will be sent in the API request. If enabled, thinking signatures (inside AiMessage.attributes()) will also be sent.
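A sketch combining thinkingType, thinkingBudgetTokens, returnThinking and sendThinking; the model name, the "enabled" thinking type value, and the PartialThinking.text() accessor are assumptions of this example:

import dev.langchain4j.model.anthropic.AnthropicStreamingChatModel;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.PartialThinking;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;

AnthropicStreamingChatModel model = AnthropicStreamingChatModel.builder()
        .apiKey(System.getenv("ANTHROPIC_API_KEY"))
        .modelName("claude-3-7-sonnet-20250219") // placeholder; use a thinking-capable model
        .thinkingType("enabled")                 // turn extended thinking on
        .thinkingBudgetTokens(2048)              // token budget reserved for thinking
        .maxTokens(4096)                         // should exceed the thinking budget
        .returnThinking(true)                    // surface thinking via the callbacks below
        .sendThinking(true)                      // resend thinking (and signatures) on follow-ups
        .build();

model.chat("Why is the sky blue?", new StreamingChatResponseHandler() {

    @Override
    public void onPartialThinking(PartialThinking partialThinking) {
        System.out.print(partialThinking.text()); // streamed reasoning text
    }

    @Override
    public void onPartialResponse(String partialResponse) {
        System.out.print(partialResponse); // streamed answer text
    }

    @Override
    public void onCompleteResponse(ChatResponse completeResponse) {
        System.out.println("\nThinking: " + completeResponse.aiMessage().thinking());
    }

    @Override
    public void onError(Throwable error) {
        error.printStackTrace();
    }
});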
-
timeout
public AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder timeout(Duration timeout) -
logRequests
public AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder logRequests(Boolean logRequests) -
logResponses
public AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder logResponses(Boolean logResponses) -
logger
public AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder logger(org.slf4j.Logger logger)
Parameters:
logger - an alternate Logger to be used instead of the default one provided by LangChain4j for logging requests and responses.
Returns:
this.
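A sketch of enabling request/response logging and routing it to a custom SLF4J logger; the logger name is arbitrary:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

Logger anthropicLogger = LoggerFactory.getLogger("anthropic.traffic"); // arbitrary logger name

AnthropicStreamingChatModel model = AnthropicStreamingChatModel.builder()
        .apiKey(System.getenv("ANTHROPIC_API_KEY"))
        .modelName("claude-3-5-sonnet-20241022")
        .logRequests(true)       // log outgoing request bodies
        .logResponses(true)      // log streamed response events
        .logger(anthropicLogger) // use this logger instead of the default one
        .build();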
-
listeners
public AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder listeners(List<ChatModelListener> listeners) -
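A sketch of attaching a ChatModelListener that measures call latency; the shared attributes() map used to pass the start time from the request context to the response context is an assumption of this example:

import dev.langchain4j.model.chat.listener.ChatModelListener;
import dev.langchain4j.model.chat.listener.ChatModelRequestContext;
import dev.langchain4j.model.chat.listener.ChatModelResponseContext;
import java.util.List;

ChatModelListener timingListener = new ChatModelListener() {

    @Override
    public void onRequest(ChatModelRequestContext requestContext) {
        requestContext.attributes().put("startNanos", System.nanoTime());
    }

    @Override
    public void onResponse(ChatModelResponseContext responseContext) {
        Object start = responseContext.attributes().get("startNanos");
        if (start instanceof Long startNanos) {
            System.out.println("Call took " + (System.nanoTime() - startNanos) / 1_000_000 + " ms");
        }
    }
};

AnthropicStreamingChatModel model = AnthropicStreamingChatModel.builder()
        .apiKey(System.getenv("ANTHROPIC_API_KEY"))
        .modelName("claude-3-5-sonnet-20241022")
        .listeners(List.of(timingListener))
        .build();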
toolChoice
public AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder toolChoice(ToolChoice toolChoice) -
toolChoiceName
public AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder toolChoiceName(String toolChoiceName) -
disableParallelToolUse
public AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder disableParallelToolUse(Boolean disableParallelToolUse) -
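A sketch of constraining tool use; getWeather is a ToolSpecification defined as in the toolSpecifications example above:

import dev.langchain4j.model.chat.request.ToolChoice;

AnthropicStreamingChatModel model = AnthropicStreamingChatModel.builder()
        .apiKey(System.getenv("ANTHROPIC_API_KEY"))
        .modelName("claude-3-5-sonnet-20241022")
        .toolSpecifications(getWeather)
        .toolChoice(ToolChoice.REQUIRED)  // the model must call one of the provided tools
        // .toolChoiceName("get_weather") // alternatively, force this specific tool by name
        .disableParallelToolUse(true)     // at most one tool call per response
        .build();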
userId
public AnthropicStreamingChatModel.AnthropicStreamingChatModelBuilder userId(String userId)
Sets the user ID for the requests. This should be a UUID, hash value, or other opaque identifier. Anthropic may use this ID to help detect abuse. Do not include any identifying information such as name, email address, or phone number.
Parameters:
userId - the user identifier
Returns:
this builder
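A sketch of passing an opaque user identifier; a random UUID is used here purely for illustration, whereas a real application would use a stable, anonymized per-user value:

import java.util.UUID;

AnthropicStreamingChatModel model = AnthropicStreamingChatModel.builder()
        .apiKey(System.getenv("ANTHROPIC_API_KEY"))
        .modelName("claude-3-5-sonnet-20241022")
        .userId(UUID.randomUUID().toString()) // opaque identifier; never a name or email address
        .build();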
-
build
public AnthropicStreamingChatModel build()