Package dev.langchain4j.model.mistralai
Class MistralAiChatModel
java.lang.Object
dev.langchain4j.model.mistralai.MistralAiChatModel
- All Implemented Interfaces:
ChatModel
Represents a Mistral AI chat model with a chat completion interface, such as open-mistral-7b and open-mixtral-8x7b.
This model generates chat completions synchronously from a list of chat messages.
You can find a description of the parameters here.
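For example, a minimal usage sketch, assuming the builder exposes apiKey and modelName setters matching the constructor parameters documented below, and that chat(String) is inherited as a convenience method from the ChatModel interface:

import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.model.mistralai.MistralAiChatModel;

public class MistralAiChatModelQuickstart {

    public static void main(String[] args) {
        // Build the model via builder() (the public constructors are deprecated).
        ChatModel model = MistralAiChatModel.builder()
                .apiKey(System.getenv("MISTRAL_AI_API_KEY")) // assumed environment variable name
                .modelName("open-mistral-7b")
                .build();

        // Synchronous chat completion for a single user message.
        String answer = model.chat("What is the capital of France?");
        System.out.println(answer);
    }
}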
-
Nested Class Summary
Nested Classes
Constructor Summary
Constructors
- MistralAiChatModel(HttpClientBuilder httpClientBuilder, String baseUrl, String apiKey, String modelName, Double temperature, Double topP, Integer maxTokens, Boolean safePrompt, Integer randomSeed, ResponseFormat responseFormat, Duration timeout, Boolean logRequests, Boolean logResponses, Integer maxRetries, Set<Capability> supportedCapabilities)
Deprecated, for removal: This API element is subject to removal in a future version. Please use builder() instead.
- MistralAiChatModel(String baseUrl, String apiKey, String modelName, Double temperature, Double topP, Integer maxTokens, Boolean safePrompt, Integer randomSeed, ResponseFormat responseFormat, Duration timeout, Boolean logRequests, Boolean logResponses, Integer maxRetries, Set<Capability> supportedCapabilities)
Deprecated, for removal: This API element is subject to removal in a future version. Please use builder() instead.
-
Method Summary
-
Constructor Details
-
MistralAiChatModel
@Deprecated(forRemoval=true)
public MistralAiChatModel(HttpClientBuilder httpClientBuilder, String baseUrl, String apiKey, String modelName, Double temperature, Double topP, Integer maxTokens, Boolean safePrompt, Integer randomSeed, ResponseFormat responseFormat, Duration timeout, Boolean logRequests, Boolean logResponses, Integer maxRetries, Set<Capability> supportedCapabilities)
Deprecated, for removal: This API element is subject to removal in a future version. Please use builder() instead.
Constructs a MistralAiChatModel with the specified parameters.
- Parameters:
httpClientBuilder - the HTTP client builder to use for creating the HTTP client
baseUrl - the base URL of the Mistral AI API. It uses the default value if not specified
apiKey - the API key for authentication
modelName - the name of the Mistral AI model to use
temperature - the temperature parameter for generating chat responses
topP - the top-p parameter for generating chat responses
maxTokens - the maximum number of new tokens to generate in a chat response
safePrompt - a flag indicating whether to use a safe prompt for generating chat responses
randomSeed - the random seed for generating chat responses
responseFormat - the response format for generating chat responses; currently supported values are "text" and "json_object"
timeout - the timeout duration for API requests; the default value is 60 seconds
logRequests - a flag indicating whether to log API requests
logResponses - a flag indicating whether to log API responses
maxRetries - the maximum number of retries for API requests. It uses the default value of 3 if not specified
supportedCapabilities - the set of capabilities supported by this model
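Since this constructor is deprecated, a hedged migration sketch using builder() instead, assuming builder setters named after the parameters above (exact setter names should be verified against the builder):

import java.time.Duration;
import dev.langchain4j.model.mistralai.MistralAiChatModel;

public class MistralAiChatModelBuilderMigration {

    static MistralAiChatModel buildModel() {
        // Same configuration expressed through the builder; setter names are assumed
        // to mirror the deprecated constructor parameters documented above.
        return MistralAiChatModel.builder()
                .apiKey(System.getenv("MISTRAL_AI_API_KEY")) // assumed environment variable name
                .modelName("open-mixtral-8x7b")
                .temperature(0.3)
                .topP(0.95)
                .maxTokens(512)
                .safePrompt(false)
                .randomSeed(42)
                .timeout(Duration.ofSeconds(60)) // 60 seconds is the documented default
                .logRequests(true)
                .logResponses(true)
                .maxRetries(3) // 3 is the documented default
                .build();
    }
}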
-
MistralAiChatModel
@Deprecated(forRemoval=true)
public MistralAiChatModel(String baseUrl, String apiKey, String modelName, Double temperature, Double topP, Integer maxTokens, Boolean safePrompt, Integer randomSeed, ResponseFormat responseFormat, Duration timeout, Boolean logRequests, Boolean logResponses, Integer maxRetries, Set<Capability> supportedCapabilities)
Deprecated, for removal: This API element is subject to removal in a future version. Please use builder() instead.
Constructs a MistralAiChatModel with the specified parameters.
- Parameters:
baseUrl - the base URL of the Mistral AI API. It uses the default value if not specified
apiKey - the API key for authentication
modelName - the name of the Mistral AI model to use
temperature - the temperature parameter for generating chat responses
topP - the top-p parameter for generating chat responses
maxTokens - the maximum number of new tokens to generate in a chat response
safePrompt - a flag indicating whether to use a safe prompt for generating chat responses
randomSeed - the random seed for generating chat responses
responseFormat - the response format for generating chat responses; currently supported values are "text" and "json_object"
timeout - the timeout duration for API requests; the default value is 60 seconds
logRequests - a flag indicating whether to log API requests
logResponses - a flag indicating whether to log API responses
maxRetries - the maximum number of retries for API requests. It uses the default value of 3 if not specified
supportedCapabilities - the set of capabilities supported by this model
-
Method Details
-
doChat
-
defaultRequestParameters
- Specified by:
defaultRequestParameters in interface ChatModel
-
listeners
-
provider
-
supportedCapabilities
- Specified by:
supportedCapabilities in interface ChatModel
-
builder
-
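To illustrate the ChatModel interface methods listed above, a sketch that builds the model, prints its supported capabilities, sends a ChatRequest built from chat messages, and reads the ChatResponse; the request and response types are assumed to live in the dev.langchain4j.model.chat.request and dev.langchain4j.model.chat.response packages:

import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.chat.request.ChatRequest;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.mistralai.MistralAiChatModel;

public class MistralAiChatRequestExample {

    public static void main(String[] args) {
        MistralAiChatModel model = MistralAiChatModel.builder()
                .apiKey(System.getenv("MISTRAL_AI_API_KEY")) // assumed environment variable name
                .modelName("open-mistral-7b")
                .build();

        // Capabilities advertised by this model (see supportedCapabilities() above).
        System.out.println(model.supportedCapabilities());

        // Build a request from a list of chat messages and send it synchronously.
        ChatRequest request = ChatRequest.builder()
                .messages(UserMessage.from("Summarize the Mistral AI chat API in one sentence."))
                .build();

        ChatResponse response = model.chat(request);
        System.out.println(response.aiMessage().text());
    }
}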