Interface StreamingChatModel
- All Known Implementing Classes:
AnthropicStreamingChatModel, AzureOpenAiStreamingChatModel, BedrockStreamingChatModel, DisabledStreamingChatModel, GitHubModelsStreamingChatModel, GoogleAiGeminiStreamingChatModel, GPULlama3StreamingChatModel, JlamaStreamingChatModel, LocalAiStreamingChatModel, MistralAiStreamingChatModel, OllamaStreamingChatModel, OpenAiOfficialStreamingChatModel, OpenAiStreamingChatModel, VertexAiGeminiStreamingChatModel, WatsonxStreamingChatModel
public interface StreamingChatModel
Represents a language model that has a chat API and can stream a response one token at a time.
Method Summary
- default void chat(ChatRequest chatRequest, StreamingChatResponseHandler handler)
  This is the main API to interact with the chat model.
- default void chat(String userMessage, StreamingChatResponseHandler handler)
- default void chat(List<ChatMessage> messages, StreamingChatResponseHandler handler)
- default ChatRequestParameters defaultRequestParameters()
- default void doChat(ChatRequest chatRequest, StreamingChatResponseHandler handler)
- default List<ChatModelListener> listeners()
- default ModelProvider provider()
- default Set<Capability> supportedCapabilities()
-
Method Details
-
chat
default void chat(ChatRequest chatRequest, StreamingChatResponseHandler handler)
This is the main API to interact with the chat model.
- Parameters:
chatRequest - a ChatRequest containing all the inputs to the LLM
handler - a StreamingChatResponseHandler that will handle the streaming response from the LLM
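The streaming contract is callback-based: the model pushes partial responses to the handler as they arrive, then signals completion or failure. The sketch below illustrates that flow with simplified stand-in types (the local `Handler` interface and canned `chat` method are illustrative, not the actual langchain4j classes, whose handler callbacks also carry richer types such as a full ChatResponse):

```java
import java.util.ArrayList;
import java.util.List;

public class StreamingSketch {

    // Stand-in for StreamingChatResponseHandler (simplified: the real
    // library passes a ChatResponse object on completion, not a String).
    interface Handler {
        void onPartialResponse(String token);  // called once per streamed token
        void onCompleteResponse(String full);  // called when the stream ends
        void onError(Throwable error);         // called if streaming fails
    }

    // Stand-in model that "streams" a canned reply token by token.
    static void chat(String userMessage, Handler handler) {
        String[] tokens = {"Hello", ", ", "world", "!"};
        StringBuilder full = new StringBuilder();
        try {
            for (String t : tokens) {
                handler.onPartialResponse(t);
                full.append(t);
            }
            handler.onCompleteResponse(full.toString());
        } catch (RuntimeException e) {
            handler.onError(e);
        }
    }

    public static void main(String[] args) {
        List<String> received = new ArrayList<>();
        chat("Hi", new Handler() {
            public void onPartialResponse(String token) { received.add(token); }
            public void onCompleteResponse(String full) { System.out.println("DONE: " + full); }
            public void onError(Throwable error) { error.printStackTrace(); }
        });
        System.out.println("tokens=" + received.size());
    }
}
```

The key design point is that `chat` returns `void`: all output reaches the caller through the handler's callbacks, so the calling thread is free while tokens stream in.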
-
doChat
default void doChat(ChatRequest chatRequest, StreamingChatResponseHandler handler)
-
defaultRequestParameters
default ChatRequestParameters defaultRequestParameters()
-
listeners
default List<ChatModelListener> listeners()
-
provider
default ModelProvider provider()
-
chat
default void chat(String userMessage, StreamingChatResponseHandler handler)
-
chat
default void chat(List<ChatMessage> messages, StreamingChatResponseHandler handler)
-
supportedCapabilities
default Set<Capability> supportedCapabilities()
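Because capabilities vary across the implementing classes listed above, callers can inspect the returned set before relying on an optional feature. A minimal sketch, assuming a stand-in `Capability` enum (the constant name `RESPONSE_FORMAT_JSON_SCHEMA` is used illustratively and may differ from the library's actual enum):

```java
import java.util.EnumSet;
import java.util.Set;

public class CapabilityCheck {

    // Stand-in for the library's Capability enum.
    enum Capability { RESPONSE_FORMAT_JSON_SCHEMA }

    // Stand-in for StreamingChatModel.supportedCapabilities(): a model
    // implementation would report whatever it actually supports.
    static Set<Capability> supportedCapabilities() {
        return EnumSet.of(Capability.RESPONSE_FORMAT_JSON_SCHEMA);
    }

    public static void main(String[] args) {
        // Check support before requesting a structured-output feature.
        boolean jsonSchema = supportedCapabilities()
                .contains(Capability.RESPONSE_FORMAT_JSON_SCHEMA);
        System.out.println("json-schema supported: " + jsonSchema);
    }
}
```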
-