Class MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder
java.lang.Object
dev.langchain4j.model.mistralai.MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder
- Enclosing class:
MistralAiStreamingChatModel
Constructor Summary
- MistralAiStreamingChatModelBuilder()

Method Summary
All methods return MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder, except build(), which returns the enclosing MistralAiStreamingChatModel.
- build()
- defaultRequestParameters(ChatRequestParameters parameters)
- frequencyPenalty(Double frequencyPenalty)
- httpClientBuilder(HttpClientBuilder httpClientBuilder)
- listeners(List<ChatModelListener> listeners)
- logger(org.slf4j.Logger logger)
- logRequests(Boolean logRequests)
- logResponses(Boolean logResponses)
- modelName(MistralAiChatModelName modelName)
- presencePenalty(Double presencePenalty)
- randomSeed(Integer randomSeed)
- responseFormat(ResponseFormat responseFormat)
- returnThinking(Boolean returnThinking): Controls whether to return thinking/reasoning text (if available) inside AiMessage.thinking().
- safePrompt(Boolean safePrompt)
- sendThinking(Boolean sendThinking): Controls whether to send thinking/reasoning text to the LLM in follow-up requests.
- stopSequences(List<String> stopSequences)
- supportedCapabilities(Capability... supportedCapabilities)
- supportedCapabilities(Set<Capability> supportedCapabilities)
- temperature(Double temperature)
Constructor Details
MistralAiStreamingChatModelBuilder
public MistralAiStreamingChatModelBuilder()
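As a usage sketch, assuming LangChain4j's standard static builder() factory and the StreamingChatResponseHandler callback API (the model settings and prompt shown here are illustrative, not defaults):

```java
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;
import dev.langchain4j.model.mistralai.MistralAiStreamingChatModel;

public class StreamingExample {
    public static void main(String[] args) {
        // Build a streaming chat model; most builder methods are optional.
        MistralAiStreamingChatModel model = MistralAiStreamingChatModel.builder()
                .apiKey(System.getenv("MISTRAL_AI_API_KEY"))
                .temperature(0.2)
                .logRequests(true)
                .build();

        // Tokens arrive incrementally through the handler callbacks.
        model.chat("Tell me a joke", new StreamingChatResponseHandler() {
            @Override
            public void onPartialResponse(String partialResponse) {
                System.out.print(partialResponse);
            }

            @Override
            public void onCompleteResponse(ChatResponse completeResponse) {
                System.out.println();
            }

            @Override
            public void onError(Throwable error) {
                error.printStackTrace();
            }
        });
    }
}
```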
Method Details
httpClientBuilder
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder httpClientBuilder(HttpClientBuilder httpClientBuilder)
-
modelName
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder modelName(MistralAiChatModelName modelName)
-
responseFormat
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder responseFormat(ResponseFormat responseFormat)
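Where JSON output is required, responseFormat can be set once on the builder; a sketch assuming LangChain4j's ResponseFormat.JSON constant from the core chat request API:

```java
import dev.langchain4j.model.chat.request.ResponseFormat;
import dev.langchain4j.model.mistralai.MistralAiStreamingChatModel;

// Ask the model to emit valid JSON for every request by default.
MistralAiStreamingChatModel model = MistralAiStreamingChatModel.builder()
        .apiKey(System.getenv("MISTRAL_AI_API_KEY"))
        .responseFormat(ResponseFormat.JSON)
        .build();
```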
baseUrl
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder baseUrl(String baseUrl)
- Parameters:
baseUrl - the base URL of the Mistral AI API. It uses the default value if not specified.
- Returns:
this.
-
apiKey
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder apiKey(String apiKey)
- Parameters:
apiKey - the API key for authentication
- Returns:
this.
-
temperature
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder temperature(Double temperature)
- Parameters:
temperature - the temperature parameter for generating chat responses
- Returns:
this.
-
topP
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder topP(Double topP)
- Parameters:
topP - the top-p parameter for generating chat responses
- Returns:
this.
-
maxTokens
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder maxTokens(Integer maxTokens)
- Parameters:
maxTokens - the maximum number of new tokens to generate in a chat response
- Returns:
this.
-
safePrompt
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder safePrompt(Boolean safePrompt)
- Parameters:
safePrompt - a flag indicating whether to use a safe prompt for generating chat responses
- Returns:
this.
-
randomSeed
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder randomSeed(Integer randomSeed)
- Parameters:
randomSeed - the random seed for generating chat responses
- Returns:
this.
-
returnThinking
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder returnThinking(Boolean returnThinking)
Controls whether to return thinking/reasoning text (if available) inside AiMessage.thinking(). Please note that this does not enable thinking/reasoning for the LLM; it only controls whether to parse the thinking content from the API response and return it inside the AiMessage. Disabled by default. If enabled, the thinking text will be stored within the AiMessage and may be persisted.
-
sendThinking
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder sendThinking(Boolean sendThinking)
Controls whether to send thinking/reasoning text to the LLM in follow-up requests. Disabled by default. If enabled, the contents of AiMessage.thinking() will be sent in the API request.
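A minimal sketch of configuring the two thinking-related flags together. Note that, per the descriptions above, these flags only control client-side handling of thinking content; they do not enable reasoning on the server side:

```java
import dev.langchain4j.model.mistralai.MistralAiStreamingChatModel;

// Parse thinking text from responses and send it back on follow-up requests.
MistralAiStreamingChatModel model = MistralAiStreamingChatModel.builder()
        .apiKey(System.getenv("MISTRAL_AI_API_KEY"))
        .returnThinking(true)   // expose thinking via AiMessage.thinking()
        .sendThinking(true)     // include AiMessage.thinking() in follow-up requests
        .build();
```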
-
timeout
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder timeout(Duration timeout)
- Parameters:
timeout - the timeout duration for API requests. The default value is 60 seconds.
- Returns:
this.
-
logRequests
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder logRequests(Boolean logRequests)
- Parameters:
logRequests - a flag indicating whether to log API requests
- Returns:
this.
-
logResponses
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder logResponses(Boolean logResponses)
- Parameters:
logResponses - a flag indicating whether to log API responses
- Returns:
this.
-
logger
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder logger(org.slf4j.Logger logger)
- Parameters:
logger - an alternate Logger to be used instead of the default one provided by LangChain4j for logging requests and responses
- Returns:
this.
-
supportedCapabilities
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder supportedCapabilities(Capability... supportedCapabilities)
-
supportedCapabilities
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder supportedCapabilities(Set<Capability> supportedCapabilities)
-
stopSequences
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder stopSequences(List<String> stopSequences)
-
presencePenalty
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder presencePenalty(Double presencePenalty)
-
frequencyPenalty
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder frequencyPenalty(Double frequencyPenalty)
-
listeners
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder listeners(List<ChatModelListener> listeners)
-
defaultRequestParameters
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder defaultRequestParameters(ChatRequestParameters parameters)
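defaultRequestParameters applies one set of defaults to every request made through the model. A sketch, assuming the ChatRequestParameters.builder() factory from the LangChain4j core chat request API (the parameter values shown are illustrative):

```java
import dev.langchain4j.model.chat.request.ChatRequestParameters;
import dev.langchain4j.model.mistralai.MistralAiStreamingChatModel;

// Defaults applied to every chat request unless overridden per request.
ChatRequestParameters defaults = ChatRequestParameters.builder()
        .temperature(0.7)
        .maxOutputTokens(512)
        .build();

MistralAiStreamingChatModel model = MistralAiStreamingChatModel.builder()
        .apiKey(System.getenv("MISTRAL_AI_API_KEY"))
        .defaultRequestParameters(defaults)
        .build();
```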
build
public MistralAiStreamingChatModel build()