Class OpenAiOfficialChatModel
java.lang.Object
dev.langchain4j.model.openaiofficial.OpenAiOfficialChatModel
- All Implemented Interfaces:
ChatModel
Nested Class Summary
Nested Classes
static class  OpenAiOfficialChatModel.Builder
Field Summary
Fields
Modifier and Type                                  Field
protected com.openai.client.OpenAIClientAsync      asyncClient
protected com.openai.client.OpenAIClient           client
protected OpenAiOfficialChatRequestParameters      defaultRequestParameters
protected List<ChatModelListener>                  listeners
protected dev.langchain4j.model.openaiofficial.InternalOpenAiOfficialHelper.ModelHost  modelHost
protected String                                   modelName
protected String                                   responseFormat
protected Boolean                                  strictJsonSchema
protected Boolean                                  strictTools
protected Set<Capability>                          supportedCapabilities
protected TokenCountEstimator                      tokenCountEstimator
Constructor Summary
Constructors
OpenAiOfficialChatModel
Method Summary
Modifier and Type                          Method
static OpenAiOfficialChatModel.Builder     builder()
ChatResponse                               doChat(ChatRequest chatRequest)
void                                       init(String baseUrl, String apiKey, com.openai.credential.Credential credential, String azureDeploymentName, com.openai.azure.AzureOpenAIServiceVersion azureOpenAIServiceVersion, String organizationId, boolean isAzure, boolean isGitHubModels, com.openai.client.OpenAIClient openAIClient, com.openai.client.OpenAIClientAsync openAIClientAsync, ChatRequestParameters defaultRequestParameters, String modelName, Double temperature, Double topP, List<String> stop, Integer maxCompletionTokens, Double presencePenalty, Double frequencyPenalty, Map<String, Integer> logitBias, String responseFormat, Boolean strictJsonSchema, Integer seed, String user, Boolean strictTools, Boolean parallelToolCalls, Boolean store, Map<String, String> metadata, String serviceTier, Duration timeout, Integer maxRetries, Proxy proxy, TokenCountEstimator tokenCountEstimator, Map<String, String> customHeaders, List<ChatModelListener> listeners, Set<Capability> capabilities, boolean isAsync)
ModelProvider                              provider()

Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface ChatModel
chat, chat, chat, chat, defaultRequestParameters, listeners, provider, supportedCapabilities
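A typical way to use this class is to create it through builder() and drive it through the ChatModel interface. The following is a minimal sketch, not taken from this page: the apiKey/modelName/temperature setters, the OPENAI_API_KEY environment variable, and the model name are assumptions based on the init(...) parameters listed above.

import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.model.openaiofficial.OpenAiOfficialChatModel;

public class OpenAiOfficialChatModelExample {

    public static void main(String[] args) {
        // Build the model through its builder; the setters used here are assumed
        // to mirror the similarly named init(...) parameters documented above.
        ChatModel model = OpenAiOfficialChatModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY")) // illustrative credential source
                .modelName("gpt-4o-mini")                // illustrative model name
                .temperature(0.2)
                .build();

        // chat(String) is inherited from the ChatModel interface and delegates
        // to doChat(ChatRequest) internally.
        String answer = model.chat("Say hello in one short sentence.");
        System.out.println(answer);
    }
}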
Field Details
client
protected com.openai.client.OpenAIClient client
asyncClient
protected com.openai.client.OpenAIClientAsync asyncClient
modelHost
protected dev.langchain4j.model.openaiofficial.InternalOpenAiOfficialHelper.ModelHost modelHost
modelName
protected String modelName
defaultRequestParameters
protected OpenAiOfficialChatRequestParameters defaultRequestParameters
responseFormat
protected String responseFormat
strictJsonSchema
protected Boolean strictJsonSchema
strictTools
protected Boolean strictTools
tokenCountEstimator
protected TokenCountEstimator tokenCountEstimator
listeners
protected List<ChatModelListener> listeners
supportedCapabilities
protected Set<Capability> supportedCapabilities
Constructor Details
OpenAiOfficialChatModel
Method Details
doChat
public ChatResponse doChat(ChatRequest chatRequest)
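doChat(ChatRequest chatRequest) is the method that the chat(...) convenience methods inherited from ChatModel ultimately delegate to. A hedged sketch of driving it through chat(ChatRequest), assuming the standard langchain4j ChatRequest and ChatResponse types:

import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.model.chat.request.ChatRequest;
import dev.langchain4j.model.chat.response.ChatResponse;

class DoChatExample {

    // 'model' is an already built OpenAiOfficialChatModel (see the builder sketch above).
    static String ask(ChatModel model, String question) {
        ChatRequest request = ChatRequest.builder()
                .messages(UserMessage.from(question))
                .build();

        // chat(ChatRequest) is inherited from ChatModel and delegates to doChat(ChatRequest).
        ChatResponse response = model.chat(request);
        return response.aiMessage().text();
    }
}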
builder
public static OpenAiOfficialChatModel.Builder builder()
init
public void init(String baseUrl, String apiKey, com.openai.credential.Credential credential,
                 String azureDeploymentName, com.openai.azure.AzureOpenAIServiceVersion azureOpenAIServiceVersion,
                 String organizationId, boolean isAzure, boolean isGitHubModels,
                 com.openai.client.OpenAIClient openAIClient, com.openai.client.OpenAIClientAsync openAIClientAsync,
                 ChatRequestParameters defaultRequestParameters, String modelName, Double temperature, Double topP,
                 List<String> stop, Integer maxCompletionTokens, Double presencePenalty, Double frequencyPenalty,
                 Map<String, Integer> logitBias, String responseFormat, Boolean strictJsonSchema, Integer seed,
                 String user, Boolean strictTools, Boolean parallelToolCalls, Boolean store,
                 Map<String, String> metadata, String serviceTier, Duration timeout, Integer maxRetries, Proxy proxy,
                 TokenCountEstimator tokenCountEstimator, Map<String, String> customHeaders,
                 List<ChatModelListener> listeners, Set<Capability> capabilities, boolean isAsync)
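Most of the init(...) parameters above correspond to options that are normally supplied through the builder rather than by calling init directly. A hedged configuration sketch, assuming builder setters named after the matching init parameters (Azure and GitHub Models specific options are left out):

import java.time.Duration;
import java.util.List;

import dev.langchain4j.model.openaiofficial.OpenAiOfficialChatModel;

public class ConfiguredModelExample {

    public static void main(String[] args) {
        // Each setter below is assumed to map onto the init(...) parameter of the same name.
        OpenAiOfficialChatModel model = OpenAiOfficialChatModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY")) // illustrative credential source
                .modelName("gpt-4o-mini")                // illustrative model name
                .temperature(0.0)
                .topP(1.0)
                .stop(List.of("END"))
                .maxCompletionTokens(256)
                .seed(42)
                .timeout(Duration.ofSeconds(30))
                .maxRetries(2)
                .build();

        System.out.println(model.chat("Reply with the single word END."));
    }
}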
defaultRequestParameters
public OpenAiOfficialChatRequestParameters defaultRequestParameters()
supportedCapabilities
public Set<Capability> supportedCapabilities()
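supportedCapabilities() exposes which optional features this model instance advertises, so callers can check a capability before relying on it. A small sketch, assuming the Capability.RESPONSE_FORMAT_JSON_SCHEMA constant from the langchain4j core API:

import java.util.Set;

import dev.langchain4j.model.chat.Capability;
import dev.langchain4j.model.chat.ChatModel;

class CapabilityCheck {

    // Returns true if the model advertises structured output (JSON schema) support.
    static boolean supportsJsonSchema(ChatModel model) {
        Set<Capability> capabilities = model.supportedCapabilities();
        return capabilities.contains(Capability.RESPONSE_FORMAT_JSON_SCHEMA);
    }
}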
listeners
public List<ChatModelListener> listeners()
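listeners() returns the ChatModelListener instances notified around each request. A hedged sketch of registering a simple logging listener, assuming the builder's listeners(...) option (named after the init parameter) and the chatRequest()/chatResponse() accessors on the listener context objects:

import java.util.List;

import dev.langchain4j.model.chat.listener.ChatModelListener;
import dev.langchain4j.model.chat.listener.ChatModelRequestContext;
import dev.langchain4j.model.chat.listener.ChatModelResponseContext;
import dev.langchain4j.model.openaiofficial.OpenAiOfficialChatModel;

class LoggingListenerExample {

    static OpenAiOfficialChatModel buildModelWithListener() {
        ChatModelListener loggingListener = new ChatModelListener() {
            @Override
            public void onRequest(ChatModelRequestContext context) {
                System.out.println("Request: " + context.chatRequest());
            }

            @Override
            public void onResponse(ChatModelResponseContext context) {
                System.out.println("Response: " + context.chatResponse());
            }
        };

        // The listeners(...) setter is assumed from the init(...) parameter of the same name.
        return OpenAiOfficialChatModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY")) // illustrative credential source
                .modelName("gpt-4o-mini")                // illustrative model name
                .listeners(List.of(loggingListener))
                .build();
    }
}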
provider
public ModelProvider provider()
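provider() and defaultRequestParameters() are simple accessors that can be useful for logging or diagnostics. A brief sketch, assuming the ChatModel, ModelProvider, and ChatRequestParameters types from langchain4j core:

import dev.langchain4j.model.ModelProvider;
import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.model.chat.request.ChatRequestParameters;

class ModelInfo {

    // Prints which provider backs the model and the request parameters applied by default.
    static void describe(ChatModel model) {
        ModelProvider provider = model.provider();
        ChatRequestParameters defaults = model.defaultRequestParameters();
        System.out.println("Provider: " + provider);
        System.out.println("Default parameters: " + defaults);
    }
}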