Package dev.langchain4j.model.mistralai
Class MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder
java.lang.Object
dev.langchain4j.model.mistralai.MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder
Enclosing class:
MistralAiStreamingChatModel

Constructor Summary

Constructors
MistralAiStreamingChatModelBuilder()

Method Summary

Methods
build()
logRequests(Boolean logRequests)
logResponses(Boolean logResponses)
modelName(MistralAiChatModelName modelName)
randomSeed(Integer randomSeed)
responseFormat(MistralAiResponseFormatType responseFormat)
responseFormat(String responseFormat)
safePrompt(Boolean safePrompt)
temperature(Double temperature)
toString()

Constructor Details

MistralAiStreamingChatModelBuilder
public MistralAiStreamingChatModelBuilder()
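
Before the method details below, a minimal sketch of the typical build sequence, assuming the langchain4j-mistral-ai artifact is on the classpath: instantiate the builder (the public no-arg constructor shown above; a static MistralAiStreamingChatModel.builder() factory is commonly used as well but is not documented on this page), chain any of the setters, and call build(). The environment variable name and all parameter values are illustrative placeholders.

    import dev.langchain4j.model.mistralai.MistralAiStreamingChatModel;

    public class MistralAiStreamingChatModelBuilderExample {
        public static void main(String[] args) {
            MistralAiStreamingChatModel model = new MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder()
                    .apiKey(System.getenv("MISTRAL_AI_API_KEY")) // placeholder environment variable name
                    .temperature(0.7)                            // illustrative value
                    .logRequests(true)
                    .build();
            System.out.println(model);
        }
    }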

Method Details

modelName
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder modelName(MistralAiChatModelName modelName)
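
A short sketch of selecting the model via the MistralAiChatModelName enum; MISTRAL_SMALL_LATEST is assumed to be one of its constants, and imports from the dev.langchain4j.model.mistralai module are omitted.

    MistralAiStreamingChatModel model = new MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder()
            .apiKey(System.getenv("MISTRAL_AI_API_KEY"))             // placeholder environment variable name
            .modelName(MistralAiChatModelName.MISTRAL_SMALL_LATEST)  // assumed enum constant
            .build();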

responseFormat
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder responseFormat(String responseFormat)

responseFormat
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder responseFormat(MistralAiResponseFormatType responseFormat)
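
The two overloads accept either the MistralAiResponseFormatType enum or a raw string. A sketch of both, assuming JSON_OBJECT is a constant of that enum and "json_object" is the equivalent string value (imports omitted):

    MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder builder =
            new MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder();
    builder.responseFormat(MistralAiResponseFormatType.JSON_OBJECT); // enum overload; assumed constant
    builder.responseFormat("json_object");                           // String overload; assumed equivalent value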

baseUrl
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder baseUrl(String baseUrl)
Parameters:
baseUrl - the base URL of the Mistral AI API. It uses the default value if not specified
Returns:
this

apiKey
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder apiKey(String apiKey)
Parameters:
apiKey - the API key for authentication
Returns:
this
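
A sketch combining the two connection settings above; the URL is only an illustration of an explicit base URL (the builder falls back to its default when baseUrl is not set), and the environment variable name is a placeholder.

    MistralAiStreamingChatModel model = new MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder()
            .baseUrl("https://api.mistral.ai/v1")        // illustrative; omit to use the default
            .apiKey(System.getenv("MISTRAL_AI_API_KEY")) // placeholder environment variable name
            .build();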

temperature
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder temperature(Double temperature)
Parameters:
temperature - the temperature parameter for generating chat responses
Returns:
this

topP
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder topP(Double topP)
Parameters:
topP - the top-p parameter for generating chat responses
Returns:
this

maxTokens
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder maxTokens(Integer maxTokens)
Parameters:
maxTokens - the maximum number of new tokens to generate in a chat response
Returns:
this
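
A sketch of the sampling-related setters (temperature, topP, maxTokens); the numeric values are arbitrary illustrations, not recommended defaults.

    MistralAiStreamingChatModel model = new MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder()
            .apiKey(System.getenv("MISTRAL_AI_API_KEY")) // placeholder environment variable name
            .temperature(0.2)                            // sampling temperature
            .topP(0.9)                                   // top-p (nucleus) sampling parameter
            .maxTokens(512)                              // cap on newly generated tokens
            .build();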

safePrompt
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder safePrompt(Boolean safePrompt)
Parameters:
safePrompt - a flag indicating whether to use a safe prompt for generating chat responses
Returns:
this

randomSeed
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder randomSeed(Integer randomSeed)
Parameters:
randomSeed - the random seed for generating chat responses (if not specified, a random number is used)
Returns:
this
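
A sketch combining safePrompt and randomSeed; the seed value is arbitrary and only serves to make sampling reproducible across runs.

    MistralAiStreamingChatModel model = new MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder()
            .apiKey(System.getenv("MISTRAL_AI_API_KEY")) // placeholder environment variable name
            .safePrompt(true)                            // use a safe prompt for generating chat responses
            .randomSeed(42)                              // fixed seed; a random one is used if unset
            .build();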

logRequests
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder logRequests(Boolean logRequests)
Parameters:
logRequests - a flag indicating whether to log raw HTTP requests
Returns:
this

logResponses
public MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder logResponses(Boolean logResponses)
Parameters:
logResponses - a flag indicating whether to log raw HTTP responses
Returns:
this
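
A sketch enabling both logging flags; this assumes a logging backend (e.g. an SLF4J binding) is configured in the application, which this page does not itself specify.

    MistralAiStreamingChatModel model = new MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder()
            .apiKey(System.getenv("MISTRAL_AI_API_KEY")) // placeholder environment variable name
            .logRequests(true)                           // log raw HTTP requests
            .logResponses(true)                          // log raw HTTP responses
            .build();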

timeout
Returns:
this
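
The parameter list for timeout is not shown on this page; a java.time.Duration argument is assumed here, in line with the other builder setters, and the 60-second value is purely illustrative.

    MistralAiStreamingChatModel model = new MistralAiStreamingChatModel.MistralAiStreamingChatModelBuilder()
            .apiKey(System.getenv("MISTRAL_AI_API_KEY"))   // placeholder environment variable name
            .timeout(java.time.Duration.ofSeconds(60))     // assumed Duration parameter; illustrative value
            .build();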

build
public MistralAiStreamingChatModel build()

toString
public String toString()