Package dev.langchain4j.model.openai
Class OpenAiChatModel.OpenAiChatModelBuilder
java.lang.Object
dev.langchain4j.model.openai.OpenAiChatModel.OpenAiChatModelBuilder
- Enclosing class:
OpenAiChatModel
-
Constructor Summary
Constructors -
Method Summary
All listed methods return OpenAiChatModel.OpenAiChatModelBuilder, except build(), which returns OpenAiChatModel.
- build()
- customHeaders(Map<String, String> customHeaders)
- defaultRequestParameters(ChatRequestParameters parameters): Sets default common ChatRequestParameters or OpenAI-specific OpenAiChatRequestParameters.
- frequencyPenalty(Double frequencyPenalty)
- httpClientBuilder(HttpClientBuilder httpClientBuilder)
- listeners(List<ChatModelListener> listeners)
- logRequests(Boolean logRequests)
- logResponses(Boolean logResponses)
- maxCompletionTokens(Integer maxCompletionTokens)
- maxRetries(Integer maxRetries)
- modelName(OpenAiChatModelName modelName)
- organizationId(String organizationId)
- parallelToolCalls(Boolean parallelToolCalls)
- presencePenalty(Double presencePenalty)
- responseFormat(String responseFormat)
- returnThinking(Boolean returnThinking): This setting is intended for DeepSeek.
- serviceTier(String serviceTier)
- strictJsonSchema(Boolean strictJsonSchema)
- strictTools(Boolean strictTools)
- supportedCapabilities(Capability... supportedCapabilities)
- supportedCapabilities(Set<Capability> supportedCapabilities)
- temperature(Double temperature)
-
Constructor Details
-
OpenAiChatModelBuilder
public OpenAiChatModelBuilder()
-
-
Method Details
-
httpClientBuilder
public OpenAiChatModel.OpenAiChatModelBuilder httpClientBuilder(HttpClientBuilder httpClientBuilder)
-
defaultRequestParameters
public OpenAiChatModel.OpenAiChatModelBuilder defaultRequestParameters(ChatRequestParameters parameters)
Sets default common ChatRequestParameters or OpenAI-specific OpenAiChatRequestParameters.
When a parameter is set via an individual builder method (e.g., modelName(String)), its value takes precedence over the same parameter set via ChatRequestParameters.
-
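The precedence rule above can be sketched with a small, self-contained example. The types below (Params, Builder, effectiveParams) are hypothetical stand-ins for illustration only, not the real langchain4j classes: an individually set value wins over the same value supplied via defaultRequestParameters, while defaults fill in whatever was not set individually.

```java
public class PrecedenceSketch {
    // Minimal stand-in for ChatRequestParameters (hypothetical, for illustration).
    static final class Params {
        final String modelName;
        final Double temperature;
        Params(String modelName, Double temperature) {
            this.modelName = modelName;
            this.temperature = temperature;
        }
    }

    // Minimal stand-in for the builder (hypothetical, for illustration).
    static final class Builder {
        private Params defaults;
        private String modelName;     // set via individual builder method
        private Double temperature;   // set via individual builder method

        Builder defaultRequestParameters(Params p) { this.defaults = p; return this; }
        Builder modelName(String m) { this.modelName = m; return this; }
        Builder temperature(Double t) { this.temperature = t; return this; }

        // Individual setters take precedence; defaults fill the gaps.
        Params effectiveParams() {
            String m = modelName != null ? modelName : (defaults != null ? defaults.modelName : null);
            Double t = temperature != null ? temperature : (defaults != null ? defaults.temperature : null);
            return new Params(m, t);
        }
    }

    public static void main(String[] args) {
        Params p = new Builder()
                .defaultRequestParameters(new Params("gpt-4o", 0.7))
                .modelName("gpt-4o-mini")   // overrides the default model name
                .effectiveParams();
        // Model name comes from the individual setter, temperature from the defaults.
        System.out.println(p.modelName + " " + p.temperature); // gpt-4o-mini 0.7
    }
}
```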
modelName
-
modelName
-
baseUrl
-
apiKey
-
organizationId
-
projectId
-
temperature
-
topP
-
stop
-
maxTokens
-
maxCompletionTokens
-
presencePenalty
-
frequencyPenalty
-
logitBias
-
responseFormat
-
supportedCapabilities
public OpenAiChatModel.OpenAiChatModelBuilder supportedCapabilities(Set<Capability> supportedCapabilities)
-
supportedCapabilities
public OpenAiChatModel.OpenAiChatModelBuilder supportedCapabilities(Capability... supportedCapabilities)
-
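The two overloads above are convenience variants of the same setter. A common way to implement such a pair, sketched below with hypothetical stand-in types (not the actual langchain4j source), is to have the varargs form delegate to the Set form so both behave identically:

```java
import java.util.Arrays;
import java.util.LinkedHashSet;
import java.util.Set;

public class OverloadSketch {
    enum Capability { RESPONSE_FORMAT_JSON_SCHEMA } // illustrative constant

    // Hypothetical builder, for illustration only.
    static final class Builder {
        private Set<Capability> supportedCapabilities;

        // Set-based overload holds the actual logic.
        Builder supportedCapabilities(Set<Capability> capabilities) {
            this.supportedCapabilities = capabilities;
            return this;
        }

        // Varargs overload delegates, so both overloads behave identically.
        Builder supportedCapabilities(Capability... capabilities) {
            return supportedCapabilities(new LinkedHashSet<>(Arrays.asList(capabilities)));
        }

        Set<Capability> capabilities() { return supportedCapabilities; }
    }

    public static void main(String[] args) {
        Set<Capability> viaVarargs = new Builder()
                .supportedCapabilities(Capability.RESPONSE_FORMAT_JSON_SCHEMA)
                .capabilities();
        Set<Capability> viaSet = new Builder()
                .supportedCapabilities(Set.of(Capability.RESPONSE_FORMAT_JSON_SCHEMA))
                .capabilities();
        System.out.println(viaVarargs.equals(viaSet)); // true
    }
}
```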
strictJsonSchema
-
seed
-
user
-
strictTools
-
parallelToolCalls
-
store
-
metadata
-
serviceTier
-
returnThinking
This setting is intended for DeepSeek. Controls whether to return thinking/reasoning text (if available) inside AiMessage.thinking(). Please note that this does not enable thinking/reasoning for the LLM; it only controls whether to parse the reasoning_content field from the API response and return it inside the AiMessage. Disabled by default. If enabled, the thinking text will be stored within the AiMessage and may be persisted.
-
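The parsing behavior described above can be sketched with a self-contained example. The names here (Parsed, parse) are hypothetical stand-ins, not the real langchain4j parsing code: when the flag is off, a reasoning_content field in the response is simply ignored rather than surfaced as thinking text.

```java
import java.util.Map;

public class ThinkingSketch {
    // Minimal stand-in for the parsed message; the real type would be AiMessage.
    record Parsed(String text, String thinking) {}

    // If returnThinking is false, reasoning_content is ignored even when present.
    static Parsed parse(Map<String, String> apiMessage, boolean returnThinking) {
        String text = apiMessage.get("content");
        String thinking = returnThinking ? apiMessage.get("reasoning_content") : null;
        return new Parsed(text, thinking);
    }

    public static void main(String[] args) {
        Map<String, String> msg = Map.of(
                "content", "The answer is 42.",
                "reasoning_content", "First I considered...");
        // Enabled: the reasoning text is parsed and returned alongside the content.
        System.out.println(parse(msg, true).thinking());  // First I considered...
        // Disabled (the default): the field is dropped.
        System.out.println(parse(msg, false).thinking()); // null
    }
}
```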
timeout
-
maxRetries
-
logRequests
-
logResponses
-
customHeaders
-
listeners
-
build
-