Interface StreamingChatModel
- All Known Implementing Classes:
AnthropicStreamingChatModel, AzureOpenAiStreamingChatModel, BedrockStreamingChatModel, DisabledStreamingChatModel, GitHubModelsStreamingChatModel, GoogleAiGeminiStreamingChatModel, GPULlama3StreamingChatModel, JlamaStreamingChatModel, LocalAiStreamingChatModel, MistralAiStreamingChatModel, OllamaStreamingChatModel, OpenAiOfficialResponsesStreamingChatModel, OpenAiOfficialStreamingChatModel, OpenAiResponsesStreamingChatModel, OpenAiStreamingChatModel, VertexAiAnthropicStreamingChatModel, VertexAiGeminiStreamingChatModel, WatsonxStreamingChatModel
public interface StreamingChatModel
Represents a language model that has a chat API and can stream a response one token at a time.
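The callback flow can be sketched with a self-contained toy. The handler interface below mirrors the callback names of LangChain4j's StreamingChatResponseHandler (onPartialResponse, onCompleteResponse, onError), but both types here are simplified stand-ins for illustration, not the real classes:

```java
import java.util.List;

// Simplified stand-in mirroring StreamingChatResponseHandler's callback names.
interface TokenHandler {
    void onPartialResponse(String partialResponse);   // one token (or chunk) at a time
    void onCompleteResponse(String completeResponse); // fired once, after the last token
    void onError(Throwable error);
}

// Toy "model" that streams a canned answer token by token.
class FakeStreamingModel {
    void chat(String userMessage, TokenHandler handler) {
        StringBuilder full = new StringBuilder();
        for (String token : List.of("Hello", ", ", "world", "!")) {
            handler.onPartialResponse(token); // caller sees each token as it arrives
            full.append(token);
        }
        handler.onCompleteResponse(full.toString()); // terminal callback
    }
}
```

The key point of the contract: the caller never polls; the model pushes each partial token into the handler and signals completion (or failure) exactly once.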
Method Summary
default void chat(ChatRequest request, ChatRequestOptions options, StreamingChatResponseHandler handler)
    Sends a streaming chat request with additional invocation options.
default void chat(ChatRequest request, StreamingChatResponseHandler handler)
    This is the main API to interact with the chat model.
default void chat(String userMessage, StreamingChatResponseHandler handler)
default void chat(List<ChatMessage> messages, StreamingChatResponseHandler handler)
default ChatRequestParameters defaultRequestParameters()
default void doChat(ChatRequest chatRequest, StreamingChatResponseHandler handler)
default List<ChatModelListener> listeners()
default ModelProvider provider()
default Set<Capability> supportedCapabilities()
-
Method Details
-
chat
default void chat(ChatRequest request, StreamingChatResponseHandler handler)
This is the main API to interact with the chat model.
- Parameters:
- request - a ChatRequest, containing all the inputs to the LLM
- handler - a StreamingChatResponseHandler that will handle the streaming response from the LLM
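Because chat() returns void and delivers results only through the handler's callbacks, a common pattern is to adapt the streaming call into a blocking one. A minimal sketch, using a simplified stand-in for the handler type (the real onCompleteResponse receives a ChatResponse, not a String):

```java
import java.util.concurrent.CompletableFuture;
import java.util.function.BiConsumer;

// Simplified stand-in for StreamingChatResponseHandler.
interface ResponseHandler {
    void onPartialResponse(String partialResponse);
    void onCompleteResponse(String completeResponse);
    void onError(Throwable error);
}

class BlockingAdapter {
    // Completes a future from the terminal callbacks, then blocks on it.
    static String chatBlocking(BiConsumer<String, ResponseHandler> chat, String userMessage) {
        CompletableFuture<String> future = new CompletableFuture<>();
        chat.accept(userMessage, new ResponseHandler() {
            public void onPartialResponse(String p) { /* forward partials elsewhere if needed */ }
            public void onCompleteResponse(String c) { future.complete(c); }
            public void onError(Throwable e) { future.completeExceptionally(e); }
        });
        return future.join(); // blocks until onCompleteResponse or onError fires
    }
}
```

This trades streaming latency benefits for a simpler call site; use it only where a synchronous result is genuinely required.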
-
chat
default void chat(ChatRequest request, ChatRequestOptions options, StreamingChatResponseHandler handler)
Sends a streaming chat request with additional invocation options.
- Parameters:
- request - a ChatRequest, containing all the inputs to the LLM
- options - a ChatRequestOptions carrying listener attributes and other per-call metadata
- handler - a StreamingChatResponseHandler that will handle the streaming response from the LLM
- Since:
- 1.13.0
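The description above says the options object carries listener attributes and other per-call metadata. Purely as an illustration of that idea (the CallOptions class below is a hypothetical stand-in, not the real ChatRequestOptions API), such metadata might be modeled as an immutable key-value bag that listeners can read:

```java
import java.util.Map;

// Hypothetical stand-in for per-call metadata; NOT the real ChatRequestOptions API.
final class CallOptions {
    private final Map<String, Object> attributes;

    private CallOptions(Map<String, Object> attributes) {
        this.attributes = Map.copyOf(attributes); // defensive, immutable copy
    }

    static CallOptions of(Map<String, Object> attributes) {
        return new CallOptions(attributes);
    }

    // Listeners could read per-call attributes such as a trace id or tenant id.
    Object attribute(String key) {
        return attributes.get(key);
    }
}
```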
-
doChat
default void doChat(ChatRequest chatRequest, StreamingChatResponseHandler handler)
-
defaultRequestParameters
default ChatRequestParameters defaultRequestParameters()
-
listeners
default List<ChatModelListener> listeners()
-
provider
default ModelProvider provider()
-
chat
default void chat(String userMessage, StreamingChatResponseHandler handler)
-
chat
default void chat(List<ChatMessage> messages, StreamingChatResponseHandler handler)
-
supportedCapabilities
default Set<Capability> supportedCapabilities()
-
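A typical use of supportedCapabilities() is a feature guard: check what the model advertises before relying on provider-specific behavior. A self-contained sketch (the Capability enum below is a one-value stand-in; RESPONSE_FORMAT_JSON_SCHEMA is used here as an assumed capability name):

```java
import java.util.Set;

// One-value stand-in for the Capability enum.
enum Capability { RESPONSE_FORMAT_JSON_SCHEMA }

class CapabilityGuard {
    // Consult the advertised capability set before enabling a feature.
    static boolean supportsJsonSchema(Set<Capability> supported) {
        return supported.contains(Capability.RESPONSE_FORMAT_JSON_SCHEMA);
    }
}
```

With the real interface, the Set would come from model.supportedCapabilities() rather than being constructed by hand.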