Package dev.langchain4j.model.openai

Class OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder

java.lang.Object
    dev.langchain4j.model.openai.OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder

Enclosing class:
OpenAiStreamingChatModel

Constructor Summary
Constructors
Method Summary

Each setter returns OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder; build() returns the configured OpenAiStreamingChatModel.

build()
customHeaders(Map<String, String> customHeaders)
defaultRequestParameters(ChatRequestParameters parameters) - Sets default common ChatRequestParameters or OpenAI-specific OpenAiChatRequestParameters.
frequencyPenalty(Double frequencyPenalty)
httpClientBuilder(HttpClientBuilder httpClientBuilder)
listeners(List<ChatModelListener> listeners)
logRequests(Boolean logRequests)
logResponses(Boolean logResponses)
maxCompletionTokens(Integer maxCompletionTokens)
modelName(OpenAiChatModelName modelName)
organizationId(String organizationId)
parallelToolCalls(Boolean parallelToolCalls)
presencePenalty(Double presencePenalty)
responseFormat(String responseFormat)
returnThinking(Boolean returnThinking) - This setting is intended for DeepSeek.
serviceTier(String serviceTier)
strictJsonSchema(Boolean strictJsonSchema)
strictTools(Boolean strictTools)
temperature(Double temperature)
Constructor Details

OpenAiStreamingChatModelBuilder
public OpenAiStreamingChatModelBuilder()

Method Details
httpClientBuilder
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder httpClientBuilder(HttpClientBuilder httpClientBuilder)
defaultRequestParameters
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder defaultRequestParameters(ChatRequestParameters parameters)
Sets default common ChatRequestParameters or OpenAI-specific OpenAiChatRequestParameters.
When a parameter is set via an individual builder method (e.g., modelName(String)), its value takes precedence over the same parameter set via ChatRequestParameters.
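A minimal sketch of the precedence rule above. It assumes the langchain4j-open-ai dependency is on the classpath and that OpenAiChatRequestParameters exposes a builder as the class name suggests; all values are illustrative:

```java
import dev.langchain4j.model.openai.OpenAiChatRequestParameters;
import dev.langchain4j.model.openai.OpenAiStreamingChatModel;

public class DefaultsPrecedenceSketch {
    public static void main(String[] args) {
        // Defaults shared across all requests made with this model.
        OpenAiChatRequestParameters defaults = OpenAiChatRequestParameters.builder()
                .modelName("gpt-4o-mini")
                .temperature(0.2)
                .build();

        OpenAiStreamingChatModel model = OpenAiStreamingChatModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY"))
                .defaultRequestParameters(defaults)
                // The individual setter takes precedence over temperature(0.2)
                // carried by the defaults above.
                .temperature(0.7)
                .build();
    }
}
```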
modelName
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder modelName(String modelName)

modelName
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder modelName(OpenAiChatModelName modelName)
baseUrl

apiKey
organizationId
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder organizationId(String organizationId)
projectId

temperature

topP

stop

maxTokens
maxCompletionTokens
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder maxCompletionTokens(Integer maxCompletionTokens)

presencePenalty
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder presencePenalty(Double presencePenalty)

frequencyPenalty
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder frequencyPenalty(Double frequencyPenalty)

logitBias
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder logitBias(Map<String, Integer> logitBias)

responseFormat
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder responseFormat(String responseFormat)

strictJsonSchema
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder strictJsonSchema(Boolean strictJsonSchema)
seed

user

strictTools
parallelToolCalls
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder parallelToolCalls(Boolean parallelToolCalls)
store

metadata
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder metadata(Map<String, String> metadata)

serviceTier
returnThinking
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder returnThinking(Boolean returnThinking)
This setting is intended for DeepSeek.
Controls whether to return thinking/reasoning text (if available) inside AiMessage.thinking() and whether to invoke the StreamingChatResponseHandler.onPartialThinking(PartialThinking) callback. Please note that this does not enable thinking/reasoning for the LLM; it only controls whether to parse the reasoning_content field from the API response and return it inside the AiMessage.
Disabled by default. If enabled, the thinking text will be stored within the AiMessage and may be persisted.
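A sketch of consuming the thinking stream described above. It assumes a DeepSeek OpenAI-compatible endpoint (the base URL and model name are illustrative), that PartialThinking exposes a text() accessor, and import paths following langchain4j 1.x conventions:

```java
import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.PartialThinking;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;
import dev.langchain4j.model.openai.OpenAiStreamingChatModel;

public class ReturnThinkingSketch {
    public static void main(String[] args) {
        OpenAiStreamingChatModel model = OpenAiStreamingChatModel.builder()
                .baseUrl("https://api.deepseek.com/v1")  // illustrative endpoint
                .apiKey(System.getenv("DEEPSEEK_API_KEY"))
                .modelName("deepseek-reasoner")          // illustrative model name
                .returnThinking(true)                    // parse reasoning_content
                .build();

        model.chat("Why is the sky blue?", new StreamingChatResponseHandler() {
            @Override
            public void onPartialThinking(PartialThinking partialThinking) {
                // Streamed reasoning text, only delivered when returnThinking(true).
                System.out.print(partialThinking.text());
            }

            @Override
            public void onPartialResponse(String partialResponse) {
                // Streamed answer text.
                System.out.print(partialResponse);
            }

            @Override
            public void onCompleteResponse(ChatResponse completeResponse) {
                // The accumulated thinking text is also stored on the message.
                AiMessage message = completeResponse.aiMessage();
                System.out.println("thinking: " + message.thinking());
            }

            @Override
            public void onError(Throwable error) {
                error.printStackTrace();
            }
        });
    }
}
```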
timeout

logRequests

logResponses

customHeaders
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder customHeaders(Map<String, String> customHeaders)

listeners
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder listeners(List<ChatModelListener> listeners)

build
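Taken together, a minimal sketch of completing the chain with customHeaders, listeners, and build(). It assumes ChatModelListener's callbacks are default no-ops (so an empty implementation compiles); the header and model name are illustrative:

```java
import java.util.List;
import java.util.Map;

import dev.langchain4j.model.chat.listener.ChatModelListener;
import dev.langchain4j.model.openai.OpenAiStreamingChatModel;

public class BuilderSketch {
    public static void main(String[] args) {
        // Empty listener, relying on the interface's default no-op callbacks.
        ChatModelListener listener = new ChatModelListener() { };

        OpenAiStreamingChatModel model = OpenAiStreamingChatModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY"))
                .modelName("gpt-4o-mini")                           // illustrative
                .customHeaders(Map.of("X-Request-Source", "docs"))  // illustrative header
                .listeners(List.of(listener))
                .logRequests(true)
                .build();  // returns the configured OpenAiStreamingChatModel
    }
}
```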