Class OpenAiOfficialStreamingChatModel
java.lang.Object
dev.langchain4j.model.openaiofficial.OpenAiOfficialStreamingChatModel
- All Implemented Interfaces:
StreamingChatModel
Nested Class Summary
Nested Classes
Field Summary
Fields
Modifier and Type                                  Field
protected com.openai.client.OpenAIClientAsync      asyncClient
protected com.openai.client.OpenAIClient           client
protected OpenAiOfficialChatRequestParameters      defaultRequestParameters
protected List<ChatModelListener>                  listeners
protected String                                   modelName
protected ModelProvider                            modelProvider
protected String                                   responseFormat
protected Boolean                                  strictJsonSchema
protected Boolean                                  strictTools
protected Set<Capability>                          supportedCapabilities
protected TokenCountEstimator                      tokenCountEstimator
Constructor Summary
Constructors
OpenAiOfficialStreamingChatModel
Method Summary
builder()
void doChat(ChatRequest chatRequest, StreamingChatResponseHandler handler)
void init(String baseUrl, String apiKey, ...) (full parameter list under Method Details)
provider()

Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface StreamingChatModel
chat, chat, chat, defaultRequestParameters, listeners, provider, supportedCapabilities
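Typical usage, shown here as a minimal sketch rather than an excerpt from this page: the model is created with builder() and driven through the StreamingChatModel interface. The builder setters used below (apiKey, modelName) are assumed to mirror the corresponding init(...) parameters and may differ between versions.

import dev.langchain4j.model.chat.StreamingChatModel;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;
import dev.langchain4j.model.openaiofficial.OpenAiOfficialStreamingChatModel;

public class StreamingChatExample {

    public static void main(String[] args) {
        // Assumed builder setters (apiKey, modelName); they are expected to mirror init(...) parameters.
        StreamingChatModel model = OpenAiOfficialStreamingChatModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY"))
                .modelName("gpt-4o-mini")
                .build();

        // chat(String, StreamingChatResponseHandler) is inherited from StreamingChatModel;
        // the streaming itself is performed by doChat(ChatRequest, StreamingChatResponseHandler).
        model.chat("Write a haiku about Java", new StreamingChatResponseHandler() {

            @Override
            public void onPartialResponse(String partialResponse) {
                System.out.print(partialResponse); // each streamed chunk as it arrives
            }

            @Override
            public void onCompleteResponse(ChatResponse completeResponse) {
                System.out.println("\nComplete: " + completeResponse.aiMessage().text());
            }

            @Override
            public void onError(Throwable error) {
                error.printStackTrace();
            }
        });
    }
}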
Field Details

client
protected com.openai.client.OpenAIClient client

asyncClient
protected com.openai.client.OpenAIClientAsync asyncClient

modelName
protected String modelName

defaultRequestParameters
protected OpenAiOfficialChatRequestParameters defaultRequestParameters

responseFormat
protected String responseFormat

strictJsonSchema
protected Boolean strictJsonSchema

strictTools
protected Boolean strictTools

tokenCountEstimator
protected TokenCountEstimator tokenCountEstimator

listeners
protected List<ChatModelListener> listeners

supportedCapabilities
protected Set<Capability> supportedCapabilities

modelProvider
protected ModelProvider modelProvider
Constructor Details
OpenAiOfficialStreamingChatModel
Method Details
doChat
public void doChat(ChatRequest chatRequest, StreamingChatResponseHandler handler)
- Specified by:
doChat in interface StreamingChatModel
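For reference, a minimal sketch of calling doChat directly with an explicitly assembled ChatRequest; applications normally use the chat(...) convenience methods inherited from StreamingChatModel. The variable model stands for an already built OpenAiOfficialStreamingChatModel (see the sketch under Method Summary).

import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.chat.request.ChatRequest;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;

// model: an OpenAiOfficialStreamingChatModel built beforehand.
ChatRequest request = ChatRequest.builder()
        .messages(UserMessage.from("Summarize the plot of Hamlet in two sentences."))
        .build();

model.doChat(request, new StreamingChatResponseHandler() {

    @Override
    public void onPartialResponse(String partialResponse) {
        System.out.print(partialResponse); // streamed chunk
    }

    @Override
    public void onCompleteResponse(ChatResponse completeResponse) {
        System.out.println(); // the full message is available via completeResponse.aiMessage()
    }

    @Override
    public void onError(Throwable error) {
        error.printStackTrace();
    }
});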
builder
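A configuration sketch for builder(). The setter names below (baseUrl, apiKey, modelName, temperature, maxCompletionTokens, timeout) are assumptions inferred from the init(...) parameters of this class and may not match the Builder exactly; check the Builder documentation before relying on them.

import java.time.Duration;

import dev.langchain4j.model.openaiofficial.OpenAiOfficialStreamingChatModel;

// Setter names are assumed to mirror the init(...) parameters; verify against the Builder API.
OpenAiOfficialStreamingChatModel model = OpenAiOfficialStreamingChatModel.builder()
        .baseUrl("https://api.openai.com/v1")        // optional settings shown for illustration
        .apiKey(System.getenv("OPENAI_API_KEY"))
        .modelName("gpt-4o-mini")
        .temperature(0.2)
        .maxCompletionTokens(512)
        .timeout(Duration.ofSeconds(60))
        .build();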
init
public void init(String baseUrl, String apiKey, com.openai.credential.Credential credential,
                 String azureDeploymentName, com.openai.azure.AzureOpenAIServiceVersion azureOpenAIServiceVersion,
                 String organizationId, boolean isAzure, boolean isGitHubModels,
                 ChatRequestParameters defaultRequestParameters, String modelName,
                 Double temperature, Double topP, List<String> stop, Integer maxCompletionTokens,
                 Double presencePenalty, Double frequencyPenalty, Map<String, Integer> logitBias,
                 String responseFormat, Boolean strictJsonSchema, Integer seed, String user,
                 Boolean strictTools, Boolean parallelToolCalls, Boolean store,
                 Map<String, String> metadata, String serviceTier, Duration timeout,
                 Integer maxRetries, Proxy proxy, TokenCountEstimator tokenCountEstimator,
                 Map<String, String> customHeaders, List<ChatModelListener> listeners,
                 Set<Capability> capabilities, boolean isAsync)
defaultRequestParameters

supportedCapabilities

listeners

provider