Package dev.langchain4j.model.chat
Interface StreamingChatLanguageModel
- All Known Implementing Classes:
AbstractBedrockStreamingChatModel, AnthropicStreamingChatModel, AzureOpenAiStreamingChatModel, BedrockAnthropicStreamingChatModel, DisabledStreamingChatLanguageModel, GitHubModelsStreamingChatModel, GoogleAiGeminiStreamingChatModel, JlamaStreamingChatModel, LocalAiStreamingChatModel, MistralAiStreamingChatModel, OllamaStreamingChatModel, OpenAiOfficialStreamingChatModel, OpenAiStreamingChatModel, VertexAiGeminiStreamingChatModel
public interface StreamingChatLanguageModel
Represents a language model that has a chat API and can stream a response one token at a time.
Method Summary
- default void chat(ChatRequest chatRequest, StreamingChatResponseHandler handler)
  This is the main API to interact with the chat model.
- default void chat(String userMessage, StreamingChatResponseHandler handler)
- default void chat(List<ChatMessage> messages, StreamingChatResponseHandler handler)
- default ChatRequestParameters defaultRequestParameters()
- default void doChat(ChatRequest chatRequest, StreamingChatResponseHandler handler)
- default List<ChatModelListener> listeners()
- default ModelProvider provider()
- default Set<Capability> supportedCapabilities()
Method Details
- chat
  default void chat(ChatRequest chatRequest, StreamingChatResponseHandler handler)
  This is the main API to interact with the chat model. A temporary default implementation of this method is necessary until all StreamingChatLanguageModel implementations adopt it; it should be removed once that occurs.
  Parameters:
  chatRequest - a ChatRequest containing all the inputs to the LLM
  handler - a StreamingChatResponseHandler that will handle the streaming response from the LLM
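The callback contract behind this method can be sketched in plain Java. The `Handler` interface and the fake `chat` method below are simplified, self-contained stand-ins (they are not the real langchain4j types): the model pushes each partial token to the handler as it arrives, then signals either completion with the full response or an error.

```java
import java.util.ArrayList;
import java.util.List;

public class StreamingSketch {

    // Simplified stand-in for StreamingChatResponseHandler (illustrative only).
    interface Handler {
        void onPartialResponse(String token);  // called once per streamed token
        void onCompleteResponse(String full);  // called once, after the last token
        void onError(Throwable error);         // called instead of completion on failure
    }

    // Fake "model" that streams a canned reply token by token.
    // A real implementation would forward tokens from the provider's streaming API.
    static void chat(String userMessage, Handler handler) {
        try {
            String[] tokens = {"Hello", ", ", "world", "!"};
            StringBuilder full = new StringBuilder();
            for (String token : tokens) {
                full.append(token);
                handler.onPartialResponse(token);
            }
            handler.onCompleteResponse(full.toString());
        } catch (RuntimeException e) {
            handler.onError(e);
        }
    }

    public static void main(String[] args) {
        List<String> events = new ArrayList<>();
        chat("Say hello", new Handler() {
            public void onPartialResponse(String token) { events.add("partial:" + token); }
            public void onCompleteResponse(String full) { events.add("complete:" + full); }
            public void onError(Throwable error)        { events.add("error:" + error); }
        });
        events.forEach(System.out::println);
    }
}
```

The key design point is that `chat` returns `void`: results are delivered asynchronously through the handler rather than as a return value, which is what allows tokens to be rendered as they arrive.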
- defaultRequestParameters()
- listeners()
- provider()
- doChat(ChatRequest chatRequest, StreamingChatResponseHandler handler)
- chat(String userMessage, StreamingChatResponseHandler handler)
- chat(List<ChatMessage> messages, StreamingChatResponseHandler handler)
- supportedCapabilities()