Class OpenAiOfficialStreamingChatModel
java.lang.Object
    dev.langchain4j.model.openaiofficial.OpenAiOfficialStreamingChatModel

All Implemented Interfaces:
    StreamingChatLanguageModel
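OpenAiOfficialStreamingChatModel streams model output incrementally through a StreamingChatResponseHandler instead of returning a single blocking response. The sketch below is a minimal usage example, assuming the standard langchain4j builder pattern (apiKey and modelName setters mirroring the init(...) parameters documented further down, which this page does not itself confirm) and the chat(String, StreamingChatResponseHandler) convenience method inherited from StreamingChatLanguageModel:

import dev.langchain4j.model.chat.StreamingChatLanguageModel;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;
import dev.langchain4j.model.openaiofficial.OpenAiOfficialStreamingChatModel;

public class StreamingChatExample {

    public static void main(String[] args) {

        // Builder setter names are assumed to mirror the init(...) parameters listed below
        StreamingChatLanguageModel model = OpenAiOfficialStreamingChatModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY"))
                .modelName("gpt-4o-mini")
                .build();

        model.chat("Explain token streaming in one paragraph", new StreamingChatResponseHandler() {

            @Override
            public void onPartialResponse(String partialResponse) {
                System.out.print(partialResponse); // each chunk of text as it arrives
            }

            @Override
            public void onCompleteResponse(ChatResponse completeResponse) {
                System.out.println(); // the assembled response is available via completeResponse.aiMessage()
            }

            @Override
            public void onError(Throwable error) {
                error.printStackTrace();
            }
        });
    }
}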
Nested Class Summary
Nested Classes
Field Summary
Fields

Modifier and Type                                                                       Field
protected com.openai.client.OpenAIClientAsync                                           asyncClient
protected com.openai.client.OpenAIClient                                                client
protected OpenAiOfficialChatRequestParameters                                           defaultRequestParameters
protected List<ChatModelListener>                                                       listeners
protected dev.langchain4j.model.openaiofficial.InternalOpenAiOfficialHelper.ModelHost   modelHost
protected String                                                                        modelName
protected String                                                                        responseFormat
protected Boolean                                                                       strictJsonSchema
protected Boolean                                                                       strictTools
protected Set<Capability>                                                               supportedCapabilities
protected Tokenizer                                                                     tokenizer
Constructor Summary
Constructors
Method Summary

builder()
void doChat(ChatRequest chatRequest, StreamingChatResponseHandler handler)
void init(...) (full parameter list under Method Details below)
provider()
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface dev.langchain4j.model.chat.StreamingChatLanguageModel
chat, chat, chat, defaultRequestParameters, listeners, provider, supportedCapabilities
Field Details

client
protected com.openai.client.OpenAIClient client

asyncClient
protected com.openai.client.OpenAIClientAsync asyncClient

modelHost
protected dev.langchain4j.model.openaiofficial.InternalOpenAiOfficialHelper.ModelHost modelHost

modelName
protected String modelName

defaultRequestParameters
protected OpenAiOfficialChatRequestParameters defaultRequestParameters

responseFormat
protected String responseFormat

strictJsonSchema
protected Boolean strictJsonSchema

strictTools
protected Boolean strictTools

tokenizer
protected Tokenizer tokenizer

listeners
protected List<ChatModelListener> listeners

supportedCapabilities
protected Set<Capability> supportedCapabilities
Constructor Details

OpenAiOfficialStreamingChatModel
Method Details
doChat
public void doChat(ChatRequest chatRequest, StreamingChatResponseHandler handler)
Specified by:
    doChat in interface StreamingChatLanguageModel
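doChat is the implementation hook specified by StreamingChatLanguageModel; application code normally calls one of the inherited chat(...) overloads, which assemble a ChatRequest and eventually invoke doChat. The sketch below drives the same path with an explicit ChatRequest. The ChatRequest and UserMessage builder calls come from the langchain4j core API rather than from this page, and the model instance is assumed to have been built as in the class-level example:

import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.chat.StreamingChatLanguageModel;
import dev.langchain4j.model.chat.request.ChatRequest;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;

public class ChatRequestStreamingExample {

    // 'model' is expected to be an OpenAiOfficialStreamingChatModel (or any StreamingChatLanguageModel)
    static void stream(StreamingChatLanguageModel model) {

        ChatRequest request = ChatRequest.builder()
                .messages(UserMessage.from("Summarize the plot of Hamlet in two sentences"))
                .build();

        // chat(ChatRequest, handler) is inherited from the interface and delegates to doChat
        model.chat(request, new StreamingChatResponseHandler() {

            @Override
            public void onPartialResponse(String partialResponse) {
                System.out.print(partialResponse);
            }

            @Override
            public void onCompleteResponse(ChatResponse completeResponse) {
                System.out.println();
            }

            @Override
            public void onError(Throwable error) {
                error.printStackTrace();
            }
        });
    }
}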
builder
init
public void init(String baseUrl, String apiKey, com.openai.credential.Credential credential,
                 String azureDeploymentName, com.openai.azure.AzureOpenAIServiceVersion azureOpenAIServiceVersion,
                 String organizationId, boolean isAzure, boolean isGitHubModels,
                 com.openai.client.OpenAIClient openAIClient, com.openai.client.OpenAIClientAsync openAIClientAsync,
                 ChatRequestParameters defaultRequestParameters, String modelName,
                 Double temperature, Double topP, List<String> stop, Integer maxCompletionTokens,
                 Double presencePenalty, Double frequencyPenalty, Map<String, Integer> logitBias,
                 String responseFormat, Boolean strictJsonSchema, Integer seed, String user,
                 Boolean strictTools, Boolean parallelToolCalls, Boolean store,
                 Map<String, String> metadata, String serviceTier,
                 Duration timeout, Integer maxRetries, Proxy proxy, Tokenizer tokenizer,
                 Map<String, String> customHeaders, List<ChatModelListener> listeners,
                 Set<Capability> capabilities, boolean isAsync)
defaultRequestParameters

supportedCapabilities

listeners

provider