Package dev.langchain4j.model.chat
Interface StreamingChatLanguageModel
- All Known Implementing Classes:
AbstractBedrockStreamingChatModel, AnthropicStreamingChatModel, AzureOpenAiStreamingChatModel, BedrockAnthropicStreamingChatModel, DisabledStreamingChatLanguageModel, GitHubModelsStreamingChatModel, GoogleAiGeminiStreamingChatModel, JlamaStreamingChatModel, LocalAiStreamingChatModel, MistralAiStreamingChatModel, OllamaStreamingChatModel, OpenAiStreamingChatModel, VertexAiGeminiStreamingChatModel
public interface StreamingChatLanguageModel
Represents a language model that has a chat API and can stream a response one token at a time.
Method Summary
Modifier and TypeMethodDescriptiondefault void
chat
(ChatRequest chatRequest, StreamingChatResponseHandler handler) This is the main API to interact with the chat model.default void
chat
(String userMessage, StreamingChatResponseHandler handler) default void
chat
(List<ChatMessage> messages, StreamingChatResponseHandler handler) default ChatRequestParameters
default void
doChat
(ChatRequest chatRequest, StreamingChatResponseHandler handler) default void
generate
(UserMessage userMessage, StreamingResponseHandler<AiMessage> handler) Deprecated, for removal: This API element is subject to removal in a future version.default void
generate
(String userMessage, StreamingResponseHandler<AiMessage> handler) Deprecated, for removal: This API element is subject to removal in a future version.please usechat(String, StreamingChatResponseHandler)
insteaddefault void
generate
(List<ChatMessage> messages, ToolSpecification toolSpecification, StreamingResponseHandler<AiMessage> handler) Deprecated, for removal: This API element is subject to removal in a future version.please usechat(ChatRequest, StreamingChatResponseHandler)
instead.void
generate
(List<ChatMessage> messages, StreamingResponseHandler<AiMessage> handler) Deprecated, for removal: This API element is subject to removal in a future version.please usechat(List, StreamingChatResponseHandler)
insteaddefault void
generate
(List<ChatMessage> messages, List<ToolSpecification> toolSpecifications, StreamingResponseHandler<AiMessage> handler) Deprecated, for removal: This API element is subject to removal in a future version.please usechat(ChatRequest, StreamingChatResponseHandler)
instead.default List
<ChatModelListener> default Set
<Capability>
-
Method Details
-
chat
default void chat(ChatRequest chatRequest, StreamingChatResponseHandler handler)
This is the main API to interact with the chat model.
A temporary default implementation of this method is necessary until all StreamingChatLanguageModel implementations adopt it. It should be removed once that occurs.
- Parameters:
chatRequest - a ChatRequest, containing all the inputs to the LLM
handler - a StreamingChatResponseHandler that will handle the streaming response from the LLM
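A minimal usage sketch of this main API, using OpenAiStreamingChatModel (one of the implementing classes listed above). The model name, builder options, and the OPENAI_API_KEY environment variable are illustrative assumptions, not part of this interface; verify them against your provider module.

```java
import java.util.concurrent.CountDownLatch;

import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.chat.StreamingChatLanguageModel;
import dev.langchain4j.model.chat.request.ChatRequest;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;
import dev.langchain4j.model.openai.OpenAiStreamingChatModel;

public class StreamingChatExample {

    public static void main(String[] args) throws InterruptedException {
        // Any implementing class works here; OpenAiStreamingChatModel is used as an example.
        StreamingChatLanguageModel model = OpenAiStreamingChatModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY")) // assumed to be set in the environment
                .modelName("gpt-4o-mini")                // illustrative model name
                .build();

        ChatRequest request = ChatRequest.builder()
                .messages(UserMessage.from("Tell me a joke"))
                .build();

        // chat() returns immediately; block until the stream completes
        CountDownLatch done = new CountDownLatch(1);

        model.chat(request, new StreamingChatResponseHandler() {

            @Override
            public void onPartialResponse(String partialResponse) {
                System.out.print(partialResponse); // one token (or chunk) at a time
            }

            @Override
            public void onCompleteResponse(ChatResponse completeResponse) {
                System.out.println("\nDone: " + completeResponse.aiMessage().text());
                done.countDown();
            }

            @Override
            public void onError(Throwable error) {
                error.printStackTrace();
                done.countDown();
            }
        });

        done.await();
    }
}
```

Note that the handler's callbacks are invoked on a background thread, which is why the example waits on a latch before letting main exit.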
-
chat
default void chat(String userMessage, StreamingChatResponseHandler handler)
-
chat
default void chat(List<ChatMessage> messages, StreamingChatResponseHandler handler)
-
listeners
default List<ChatModelListener> listeners()
-
doChat
default void doChat(ChatRequest chatRequest, StreamingChatResponseHandler handler)
-
defaultRequestParameters
default ChatRequestParameters defaultRequestParameters()
-
supportedCapabilities
default Set<Capability> supportedCapabilities()
-
generate
@Deprecated(forRemoval=true) default void generate(String userMessage, StreamingResponseHandler<AiMessage> handler)
Deprecated, for removal: This API element is subject to removal in a future version. Please use chat(String, StreamingChatResponseHandler) instead.
Generates a response from the model based on a message from a user.
- Parameters:
userMessage - The message from the user.
handler - The handler for streaming the response.
-
generate
@Deprecated(forRemoval=true) default void generate(UserMessage userMessage, StreamingResponseHandler<AiMessage> handler)
Deprecated, for removal: This API element is subject to removal in a future version. Please use chat(List, StreamingChatResponseHandler) instead.
Generates a response from the model based on a message from a user.
- Parameters:
userMessage - The message from the user.
handler - The handler for streaming the response.
-
generate
@Deprecated(forRemoval=true) void generate(List<ChatMessage> messages, StreamingResponseHandler<AiMessage> handler)
Deprecated, for removal: This API element is subject to removal in a future version. Please use chat(List, StreamingChatResponseHandler) instead.
Generates a response from the model based on a sequence of messages. Typically, the sequence contains messages in the following order: System (optional) - User - AI - User - AI - User ...
- Parameters:
messages - A list of messages.
handler - The handler for streaming the response.
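A migration sketch from this deprecated method to its chat(List, StreamingChatResponseHandler) replacement. The message contents and class name are illustrative; the key change is that the old StreamingResponseHandler callbacks (onNext, onComplete) are replaced by the StreamingChatResponseHandler callbacks shown here.

```java
import java.util.List;

import dev.langchain4j.data.message.ChatMessage;
import dev.langchain4j.data.message.SystemMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.chat.StreamingChatLanguageModel;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;

public class GenerateToChatMigration {

    static void stream(StreamingChatLanguageModel model) {
        // The usual ordering: System (optional) - User - AI - User - ...
        List<ChatMessage> messages = List.of(
                SystemMessage.from("You are a helpful assistant."),
                UserMessage.from("What is streaming?"));

        // Before (deprecated): model.generate(messages, someStreamingResponseHandler);
        // After: the chat(List, StreamingChatResponseHandler) default method takes the same messages.
        model.chat(messages, new StreamingChatResponseHandler() {

            @Override
            public void onPartialResponse(String partialResponse) {
                System.out.print(partialResponse);
            }

            @Override
            public void onCompleteResponse(ChatResponse completeResponse) {
                System.out.println();
            }

            @Override
            public void onError(Throwable error) {
                error.printStackTrace();
            }
        });
    }
}
```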
-
generate
@Deprecated(forRemoval=true) default void generate(List<ChatMessage> messages, List<ToolSpecification> toolSpecifications, StreamingResponseHandler<AiMessage> handler)
Deprecated, for removal: This API element is subject to removal in a future version. Please use chat(ChatRequest, StreamingChatResponseHandler) instead. See ChatRequestParameters.toolSpecifications().
Generates a response from the model based on a list of messages and a list of tool specifications. The response may either be a text message or a request to execute one of the specified tools. Typically, the list contains messages in the following order: System (optional) - User - AI - User - AI - User ...
- Parameters:
messages - A list of messages.
toolSpecifications - A list of tools that the model is allowed to execute. The model autonomously decides whether to use any of these tools.
handler - The handler for streaming the response. AiMessage can contain either a textual response or a request to execute one of the tools.
- Throws:
UnsupportedFeatureException - if tools are not supported by the underlying LLM API
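A sketch of the recommended replacement, passing the tool specifications through the ChatRequest. The getWeather tool is illustrative, and the builder's toolSpecifications(...) convenience method is assumed from recent LangChain4j versions.

```java
import dev.langchain4j.agent.tool.ToolSpecification;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.chat.StreamingChatLanguageModel;
import dev.langchain4j.model.chat.request.ChatRequest;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;

public class ToolStreamingExample {

    static void streamWithTools(StreamingChatLanguageModel model) {
        // Illustrative tool: the model may decide to request its execution
        ToolSpecification getWeather = ToolSpecification.builder()
                .name("getWeather")
                .description("Returns the weather for a city")
                .build();

        ChatRequest request = ChatRequest.builder()
                .messages(UserMessage.from("What is the weather in Munich?"))
                .toolSpecifications(getWeather) // replaces the deprecated toolSpecifications argument
                .build();

        model.chat(request, new StreamingChatResponseHandler() {

            @Override
            public void onPartialResponse(String partialResponse) {
                System.out.print(partialResponse);
            }

            @Override
            public void onCompleteResponse(ChatResponse response) {
                // The AiMessage may contain text or tool execution requests
                if (response.aiMessage().hasToolExecutionRequests()) {
                    response.aiMessage().toolExecutionRequests()
                            .forEach(r -> System.out.println("Tool call: " + r.name()));
                }
            }

            @Override
            public void onError(Throwable error) {
                error.printStackTrace();
            }
        });
    }
}
```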
-
generate
@Deprecated(forRemoval=true) default void generate(List<ChatMessage> messages, ToolSpecification toolSpecification, StreamingResponseHandler<AiMessage> handler)
Deprecated, for removal: This API element is subject to removal in a future version. Please use chat(ChatRequest, StreamingChatResponseHandler) instead. See ChatRequestParameters.toolSpecifications() and ChatRequestParameters.toolChoice().
Generates a response from the model based on a list of messages and a single tool specification. The model is forced to execute the specified tool. This is usually achieved by setting `tool_choice=ANY` in the LLM provider API.
- Parameters:
messages - A list of messages.
toolSpecification - The specification of a tool that must be executed. The model is forced to execute this tool.
handler - The handler for streaming the response.
- Throws:
UnsupportedFeatureException - if tools are not supported by the underlying LLM API
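For the forced-tool case, the tool choice moves into the request parameters. A sketch, assuming the DefaultChatRequestParameters builder and the ToolChoice.REQUIRED value from recent LangChain4j versions; verify both names against the version you use.

```java
import dev.langchain4j.agent.tool.ToolSpecification;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.chat.request.ChatRequest;
import dev.langchain4j.model.chat.request.ChatRequestParameters;
import dev.langchain4j.model.chat.request.DefaultChatRequestParameters;
import dev.langchain4j.model.chat.request.ToolChoice;

public class ForcedToolRequest {

    static ChatRequest buildRequest(ToolSpecification tool) {
        // toolChoice = REQUIRED forces the model to execute a tool, which is what
        // the deprecated single-ToolSpecification generate(...) overload did.
        ChatRequestParameters parameters = DefaultChatRequestParameters.builder()
                .toolSpecifications(tool)
                .toolChoice(ToolChoice.REQUIRED)
                .build();

        return ChatRequest.builder()
                .messages(UserMessage.from("What is the weather in Munich?"))
                .parameters(parameters)
                .build();
    }
}
```

The resulting ChatRequest is then passed to chat(ChatRequest, StreamingChatResponseHandler) as usual.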