Package dev.langchain4j.model.googleai
Class GoogleAiGeminiStreamingChatModel
java.lang.Object
dev.langchain4j.model.googleai.GoogleAiGeminiStreamingChatModel
- All Implemented Interfaces:
StreamingChatLanguageModel
-
Nested Class Summary
Nested Classes
static class
-
Field Summary
Fields
protected final boolean allowCodeExecution
protected final String apiKey
protected final dev.langchain4j.model.googleai.GeminiService geminiService
protected final boolean includeCodeExecutionOutput
protected final List<ChatModelListener> listeners
protected final Integer maxOutputTokens
protected final Integer maxRetries
protected final String modelName
protected final ResponseFormat responseFormat
protected final List<GeminiSafetySetting> safetySettings
protected final List<String> stopSequences
protected final Double temperature
protected final GeminiFunctionCallingConfig toolConfig
protected final Integer topK
protected final Double topP
-
Constructor Summary
Constructors
GoogleAiGeminiStreamingChatModel(String apiKey, String modelName, Double temperature, Integer topK, Double topP, Integer maxOutputTokens, Duration timeout, ResponseFormat responseFormat, List<String> stopSequences, GeminiFunctionCallingConfig toolConfig, Boolean allowCodeExecution, Boolean includeCodeExecutionOutput, Boolean logRequestsAndResponses, List<GeminiSafetySetting> safetySettings, List<ChatModelListener> listeners, Integer maxRetries)
-
Method Summary
void chat(ChatRequest chatRequest, StreamingChatResponseHandler handler)
This is the main API to interact with the chat model.
protected static String computeMimeType(ResponseFormat responseFormat)
protected ChatModelRequest createChatModelRequest(String modelName, List<ChatMessage> messages, List<ToolSpecification> toolSpecifications, ChatRequestParameters requestParameters)
protected dev.langchain4j.model.googleai.GeminiGenerateContentRequest createGenerateContentRequest(List<ChatMessage> messages, List<ToolSpecification> toolSpecifications, ResponseFormat responseFormat, ChatRequestParameters requestParameters)
listeners()
protected void notifyListenersOnError(Exception exception, ChatModelRequest request, ModelProvider modelProvider, Map<Object, Object> attributes)
protected void notifyListenersOnRequest(…)
protected void notifyListenersOnResponse(Response<AiMessage> response, ChatModelRequest request, ModelProvider modelProvider, Map<Object, Object> attributes)
provider()
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface dev.langchain4j.model.chat.StreamingChatLanguageModel
chat, chat, defaultRequestParameters, doChat, supportedCapabilities
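Besides the chat(ChatRequest, StreamingChatResponseHandler) method declared on this class, the interface contributes two inherited chat overloads. A brief sketch follows, assuming one of them is the plain-String convenience overload defined on StreamingChatLanguageModel; the handler callback names onPartialResponse, onCompleteResponse, and onError are taken from current LangChain4j and are not spelled out on this page.

import dev.langchain4j.model.chat.StreamingChatLanguageModel;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;

public class InheritedChatSketch {

    // Uses the inherited convenience overload: a plain String prompt instead of a ChatRequest.
    static void askWithPlainPrompt(StreamingChatLanguageModel model) {
        model.chat("Why is the sky blue?", new StreamingChatResponseHandler() {

            @Override
            public void onPartialResponse(String partialResponse) {
                System.out.print(partialResponse); // tokens as they stream in
            }

            @Override
            public void onCompleteResponse(ChatResponse completeResponse) {
                System.out.println(); // end of stream
            }

            @Override
            public void onError(Throwable error) {
                error.printStackTrace();
            }
        });
    }
}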
-
Field Details
-
geminiService
protected final dev.langchain4j.model.googleai.GeminiService geminiService
-
apiKey
protected final String apiKey
-
modelName
protected final String modelName
-
temperature
protected final Double temperature
-
topK
protected final Integer topK
-
topP
protected final Double topP
-
maxOutputTokens
protected final Integer maxOutputTokens
-
stopSequences
protected final List<String> stopSequences
-
responseFormat
protected final ResponseFormat responseFormat
-
toolConfig
protected final GeminiFunctionCallingConfig toolConfig
-
allowCodeExecution
protected final boolean allowCodeExecution
-
includeCodeExecutionOutput
protected final boolean includeCodeExecutionOutput
-
safetySettings
protected final List<GeminiSafetySetting> safetySettings
-
listeners
protected final List<ChatModelListener> listeners
-
maxRetries
protected final Integer maxRetries
-
Constructor Details
-
GoogleAiGeminiStreamingChatModel
public GoogleAiGeminiStreamingChatModel(String apiKey, String modelName, Double temperature, Integer topK, Double topP, Integer maxOutputTokens, Duration timeout, ResponseFormat responseFormat, List<String> stopSequences, GeminiFunctionCallingConfig toolConfig, Boolean allowCodeExecution, Boolean includeCodeExecutionOutput, Boolean logRequestsAndResponses, List<GeminiSafetySetting> safetySettings, List<ChatModelListener> listeners, Integer maxRetries)
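A minimal construction sketch against the constructor above. The API key lookup, the model name, and the use of null for unset optional arguments are illustrative assumptions rather than documented behaviour; null is assumed to mean "fall back to the default" for optional settings.

import dev.langchain4j.model.googleai.GoogleAiGeminiStreamingChatModel;

import java.time.Duration;

public class ConstructorSketch {

    public static void main(String[] args) {
        // Arguments mirror the constructor signature above, in order.
        // The environment variable name and model name are placeholders.
        GoogleAiGeminiStreamingChatModel model = new GoogleAiGeminiStreamingChatModel(
                System.getenv("GOOGLE_AI_GEMINI_API_KEY"), // apiKey (placeholder lookup)
                "gemini-1.5-flash",                        // modelName (placeholder)
                0.7,                                       // temperature
                null,                                      // topK (assumed default when null)
                null,                                      // topP
                1024,                                      // maxOutputTokens
                Duration.ofSeconds(60),                    // timeout
                null,                                      // responseFormat
                null,                                      // stopSequences
                null,                                      // toolConfig
                false,                                     // allowCodeExecution
                false,                                     // includeCodeExecutionOutput
                true,                                      // logRequestsAndResponses
                null,                                      // safetySettings
                null,                                      // listeners
                2                                          // maxRetries
        );
    }
}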
-
-
Method Details
-
chat
public void chat(ChatRequest chatRequest, StreamingChatResponseHandler handler)
Description copied from interface: StreamingChatLanguageModel
This is the main API to interact with the chat model. A temporary default implementation of this method is necessary until all StreamingChatLanguageModel implementations adopt it. It should be removed once that occurs.
- Specified by:
chat in interface StreamingChatLanguageModel
- Parameters:
chatRequest - a ChatRequest containing all the inputs to the LLM
handler - a StreamingChatResponseHandler that will handle the streaming response from the LLM
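A usage sketch for this method. It assumes ChatRequest exposes a builder that accepts chat messages and that StreamingChatResponseHandler declares onPartialResponse, onCompleteResponse, and onError callbacks; both are current LangChain4j APIs, but neither is listed on this page.

import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.chat.StreamingChatLanguageModel;
import dev.langchain4j.model.chat.request.ChatRequest;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;

import java.util.concurrent.CompletableFuture;

public class StreamingChatSketch {

    // Sends one user message and blocks until the stream completes.
    static ChatResponse ask(StreamingChatLanguageModel model, String question) {
        ChatRequest chatRequest = ChatRequest.builder()
                .messages(UserMessage.from(question))
                .build();

        CompletableFuture<ChatResponse> done = new CompletableFuture<>();

        model.chat(chatRequest, new StreamingChatResponseHandler() {

            @Override
            public void onPartialResponse(String partialResponse) {
                System.out.print(partialResponse); // partial tokens as they arrive
            }

            @Override
            public void onCompleteResponse(ChatResponse completeResponse) {
                done.complete(completeResponse);   // aggregated final response
            }

            @Override
            public void onError(Throwable error) {
                done.completeExceptionally(error);
            }
        });

        return done.join();
    }
}

Blocking on a CompletableFuture is only one way to consume the stream; the handler can just as well forward partial responses to a UI or an SSE channel.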
-
listeners
- Specified by:
listeners in interface StreamingChatLanguageModel
-
provider
- Specified by:
provider in interface StreamingChatLanguageModel
-
createGenerateContentRequest
protected dev.langchain4j.model.googleai.GeminiGenerateContentRequest createGenerateContentRequest(List<ChatMessage> messages, List<ToolSpecification> toolSpecifications, ResponseFormat responseFormat, ChatRequestParameters requestParameters)
-
createChatModelRequest
protected ChatModelRequest createChatModelRequest(String modelName, List<ChatMessage> messages, List<ToolSpecification> toolSpecifications, ChatRequestParameters requestParameters)
-
computeMimeType
protected static String computeMimeType(ResponseFormat responseFormat)
-
notifyListenersOnRequest
-
notifyListenersOnResponse
protected void notifyListenersOnResponse(Response<AiMessage> response, ChatModelRequest request, ModelProvider modelProvider, Map<Object, Object> attributes)
-
notifyListenersOnError
protected void notifyListenersOnError(Exception exception, ChatModelRequest request, ModelProvider modelProvider, Map<Object, Object> attributes)
-