Package dev.langchain4j.model.anthropic
Class AnthropicStreamingChatModel
java.lang.Object
dev.langchain4j.model.anthropic.AnthropicStreamingChatModel
- All Implemented Interfaces:
StreamingChatModel
Represents an Anthropic language model with a Messages (chat) API. The model's response is streamed token by token and should be handled with StreamingResponseHandler.
More details are available here and here.
It supports Images as inputs. UserMessages can contain one or multiple ImageContents. Images must not be represented as URLs; they should be Base64-encoded strings and include a mimeType.
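Since URLs are not accepted, callers must Base64-encode image bytes themselves. A minimal sketch using only the JDK; the commented-out ImageContent/UserMessage calls are assumptions about the langchain4j API, not confirmed by this page:

```java
import java.util.Base64;

public class EncodeImage {
    public static void main(String[] args) {
        // In practice these bytes would come from e.g. Files.readAllBytes(Path.of("cat.png")).
        // Here we use the first four bytes of the PNG signature as a stand-in.
        byte[] imageBytes = {(byte) 0x89, 'P', 'N', 'G'};

        String base64 = Base64.getEncoder().encodeToString(imageBytes);
        System.out.println(base64);  // prints iVBORw==

        // Hypothetical usage (assumed API, verify against the actual classes):
        // ImageContent image = ImageContent.from(base64, "image/png");
        // UserMessage message = UserMessage.from(image, TextContent.from("Describe this image"));
    }
}
```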
The content of SystemMessages is sent using the "system" parameter.
Sanitization is performed on the ChatMessages provided to ensure conformity with Anthropic API requirements. This includes ensuring the first message is a UserMessage and that there are no consecutive UserMessages. Any messages removed during sanitization are logged as warnings and not submitted to the API.
Supports caching SystemMessages and ToolSpecifications.
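A construction sketch tying the above together. The builder option names below (apiKey, modelName, cacheSystemMessages, cacheTools) are assumptions about this class's builder and should be verified against the actual API:

```java
// Sketch only: builder option names are assumptions, not confirmed by this page.
AnthropicStreamingChatModel model = AnthropicStreamingChatModel.builder()
        .apiKey(System.getenv("ANTHROPIC_API_KEY"))
        .modelName("claude-3-5-sonnet-20241022")  // hypothetical model name
        .cacheSystemMessages(true)  // assumed option enabling SystemMessage caching
        .cacheTools(true)           // assumed option enabling ToolSpecification caching
        .build();
```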
Nested Class Summary
Modifier and Type	Class	Description
static class
Method Summary
Modifier and Type	Method	Description
	builder()
void	chat(ChatRequest chatRequest, StreamingChatResponseHandler handler)	This is the main API to interact with the chat model.
	provider()
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface dev.langchain4j.model.chat.StreamingChatModel
chat, chat, defaultRequestParameters, doChat, supportedCapabilities
Method Details
builder
chat
Description copied from interface: StreamingChatModel
This is the main API to interact with the chat model.
Specified by:
chat in interface StreamingChatModel
Parameters:
chatRequest - a ChatRequest, containing all the inputs to the LLM
handler - a StreamingChatResponseHandler that will handle the streaming response from the LLM
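A usage sketch for chat. The handler callback names (onPartialResponse, onCompleteResponse, onError) and the ChatRequest builder shown are assumptions about the langchain4j API and should be verified:

```java
// Sketch only: assumes a model instance built elsewhere.
ChatRequest request = ChatRequest.builder()
        .messages(UserMessage.from("Tell me a joke"))
        .build();

model.chat(request, new StreamingChatResponseHandler() {
    @Override
    public void onPartialResponse(String partialResponse) {
        System.out.print(partialResponse);  // each streamed token/chunk as it arrives
    }

    @Override
    public void onCompleteResponse(ChatResponse completeResponse) {
        System.out.println();  // full response assembled once streaming finishes
    }

    @Override
    public void onError(Throwable error) {
        error.printStackTrace();
    }
});
```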
listeners
Specified by:
listeners in interface StreamingChatModel
provider
Specified by:
provider in interface StreamingChatModel