Class OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder
java.lang.Object
dev.langchain4j.model.openai.OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder
Enclosing class: OpenAiStreamingChatModel
Constructor Summary
Constructors: OpenAiStreamingChatModelBuilder()

Method Summary
All methods return OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder, except build(), which returns the configured OpenAiStreamingChatModel.
build()
customHeaders(Map<String, String> customHeaders): Sets custom HTTP headers.
customParameters(Map<String, Object> customParameters): Sets custom HTTP body parameters.
customQueryParams(Map<String, String> customQueryParams): Sets custom URL query parameters.
defaultRequestParameters(ChatRequestParameters parameters): Sets default common ChatRequestParameters or OpenAI-specific OpenAiChatRequestParameters.
frequencyPenalty(Double frequencyPenalty)
httpClientBuilder(HttpClientBuilder httpClientBuilder)
listeners(List<ChatModelListener> listeners)
logger(org.slf4j.Logger logger)
logRequests(Boolean logRequests)
logResponses(Boolean logResponses)
maxCompletionTokens(Integer maxCompletionTokens)
modelName(OpenAiChatModelName modelName)
organizationId(String organizationId)
parallelToolCalls(Boolean parallelToolCalls)
presencePenalty(Double presencePenalty)
reasoningEffort(String reasoningEffort)
responseFormat(ResponseFormat responseFormat)
responseFormat(String responseFormat)
returnThinking(Boolean returnThinking): This setting is intended for DeepSeek.
serviceTier(String serviceTier)
strictJsonSchema(Boolean strictJsonSchema)
strictTools(Boolean strictTools)
temperature(Double temperature)
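The builder is typically obtained via OpenAiStreamingChatModel.builder() and finished with build(). A minimal configuration sketch, not taken from this page: it assumes apiKey accepts the key as a String and uses the modelName(String) overload referenced under defaultRequestParameters; the model name and temperature values are illustrative.

import dev.langchain4j.model.openai.OpenAiStreamingChatModel;

// Minimal sketch; apiKey(String) is an assumption, values are illustrative.
OpenAiStreamingChatModel model = OpenAiStreamingChatModel.builder()
        .apiKey(System.getenv("OPENAI_API_KEY"))
        .modelName("gpt-4o-mini")        // String overload, see defaultRequestParameters
        .temperature(0.3)
        .logRequests(true)
        .logResponses(true)
        .build();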
Constructor Details
OpenAiStreamingChatModelBuilder
public OpenAiStreamingChatModelBuilder()
Method Details
httpClientBuilder
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder httpClientBuilder(HttpClientBuilder httpClientBuilder)
defaultRequestParameters
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder defaultRequestParameters(ChatRequestParameters parameters)
Sets default common ChatRequestParameters or OpenAI-specific OpenAiChatRequestParameters.
When a parameter is set via an individual builder method (e.g., modelName(String)), its value takes precedence over the same parameter set via ChatRequestParameters.
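A sketch of the precedence rule, assuming OpenAiChatRequestParameters lives in the same package and exposes a builder with modelName and temperature setters (neither is documented on this page); all values are illustrative.

import dev.langchain4j.model.openai.OpenAiChatRequestParameters;
import dev.langchain4j.model.openai.OpenAiStreamingChatModel;

// Assumed OpenAiChatRequestParameters.builder() API; values are illustrative.
OpenAiChatRequestParameters defaults = OpenAiChatRequestParameters.builder()
        .modelName("gpt-4o-mini")
        .temperature(0.2)
        .build();

OpenAiStreamingChatModel model = OpenAiStreamingChatModel.builder()
        .apiKey(System.getenv("OPENAI_API_KEY"))   // assumed String-valued setter
        .defaultRequestParameters(defaults)
        .modelName("gpt-4o")                       // set individually, so it takes precedence over the default above
        .build();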
modelName
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder modelName(String modelName)

modelName
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder modelName(OpenAiChatModelName modelName)
baseUrl

apiKey

organizationId
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder organizationId(String organizationId)

projectId

temperature
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder temperature(Double temperature)

topP

stop

maxTokens

maxCompletionTokens
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder maxCompletionTokens(Integer maxCompletionTokens)

presencePenalty
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder presencePenalty(Double presencePenalty)

frequencyPenalty
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder frequencyPenalty(Double frequencyPenalty)

logitBias
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder logitBias(Map<String, Integer> logitBias)
responseFormat
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder responseFormat(ResponseFormat responseFormat)

responseFormat
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder responseFormat(String responseFormat)

strictJsonSchema
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder strictJsonSchema(Boolean strictJsonSchema)
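A hedged sketch of switching on JSON output via the String overload, assuming it accepts the OpenAI response_format type values such as "json_object". strictJsonSchema(true) presumably applies when a JSON schema is supplied via the ResponseFormat overload; the ResponseFormat construction itself is not covered on this page.

// Sketch only; the accepted String values are assumed to mirror OpenAI's
// response_format types ("json_object", "json_schema").
OpenAiStreamingChatModel jsonModel = OpenAiStreamingChatModel.builder()
        .apiKey(System.getenv("OPENAI_API_KEY"))
        .modelName("gpt-4o-mini")
        .responseFormat("json_object")
        .build();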
seed

user

strictTools
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder strictTools(Boolean strictTools)

parallelToolCalls
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder parallelToolCalls(Boolean parallelToolCalls)

store

metadata
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder metadata(Map<String, String> metadata)

serviceTier
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder serviceTier(String serviceTier)

reasoningEffort
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder reasoningEffort(String reasoningEffort)
returnThinking
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder returnThinking(Boolean returnThinking)
This setting is intended for DeepSeek.
Controls whether to return thinking/reasoning text (if available) inside AiMessage.thinking() and whether to invoke the StreamingChatResponseHandler.onPartialThinking(PartialThinking) callback.
Please note that this does not enable thinking/reasoning for the LLM; it only controls whether to parse the reasoning_content field from the API response and return it inside the AiMessage.
Disabled by default.
If enabled, the thinking text will be stored within the AiMessage and may be persisted.
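A sketch of consuming streamed thinking text from a DeepSeek-style endpoint. It assumes baseUrl accepts a String, the model is invoked via chat(String, StreamingChatResponseHandler), PartialThinking exposes its text via text(), and the import locations of the handler types; URLs, key variable, and model names are placeholders.

import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.PartialThinking;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;
import dev.langchain4j.model.openai.OpenAiStreamingChatModel;

// returnThinking(true) only parses reasoning_content if the endpoint returns it;
// it does not enable reasoning. Base URL, key variable and model name are placeholders.
OpenAiStreamingChatModel model = OpenAiStreamingChatModel.builder()
        .baseUrl("https://api.deepseek.com/v1")
        .apiKey(System.getenv("DEEPSEEK_API_KEY"))
        .modelName("deepseek-reasoner")
        .returnThinking(true)
        .build();

model.chat("Why is the sky blue?", new StreamingChatResponseHandler() {
    @Override
    public void onPartialThinking(PartialThinking partialThinking) {
        System.out.print(partialThinking.text());   // text() accessor assumed
    }

    @Override
    public void onPartialResponse(String partialResponse) {
        System.out.print(partialResponse);          // answer tokens
    }

    @Override
    public void onCompleteResponse(ChatResponse completeResponse) {
        System.out.println();
    }

    @Override
    public void onError(Throwable error) {
        error.printStackTrace();
    }
});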
timeout

logRequests
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder logRequests(Boolean logRequests)

logResponses
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder logResponses(Boolean logResponses)
logger
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder logger(org.slf4j.Logger logger)
Parameters:
logger - an alternate Logger to be used instead of the default one provided by LangChain4j for logging requests and responses.
Returns:
this.
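For example, to route request/response logs to a dedicated SLF4J logger (a sketch; the logger category and other values are arbitrary):

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

// Arbitrary logger category; logRequests/logResponses presumably still gate what gets logged.
Logger openAiWireLog = LoggerFactory.getLogger("openai.wire");

OpenAiStreamingChatModel model = OpenAiStreamingChatModel.builder()
        .apiKey(System.getenv("OPENAI_API_KEY"))
        .modelName("gpt-4o-mini")
        .logRequests(true)
        .logResponses(true)
        .logger(openAiWireLog)
        .build();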
customHeaders
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder customHeaders(Map<String, String> customHeaders)
Sets custom HTTP headers.

customQueryParams
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder customQueryParams(Map<String, String> customQueryParams)
Sets custom URL query parameters.

customParameters
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder customParameters(Map<String, Object> customParameters)
Sets custom HTTP body parameters.
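These three hooks are useful when calling OpenAI through a gateway or an OpenAI-compatible server. A sketch with illustrative header, query-parameter, and body-field names (none of them is defined by this builder):

import java.util.Map;

// Illustrative keys and values only.
OpenAiStreamingChatModel model = OpenAiStreamingChatModel.builder()
        .apiKey(System.getenv("OPENAI_API_KEY"))
        .modelName("gpt-4o-mini")
        .customHeaders(Map.of("X-Request-Source", "batch-job"))    // extra HTTP headers
        .customQueryParams(Map.of("api-version", "2024-06-01"))    // extra URL query parameters
        .customParameters(Map.of("custom_field", "value"))         // extra JSON body fields
        .build();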
listeners
public OpenAiStreamingChatModel.OpenAiStreamingChatModelBuilder listeners(List<ChatModelListener> listeners)
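A sketch of attaching an observability hook; the ChatModelListener callback and context type names (onResponse, ChatModelResponseContext) and their import locations are not documented on this page and are assumptions.

import java.util.List;
import dev.langchain4j.model.chat.listener.ChatModelListener;
import dev.langchain4j.model.chat.listener.ChatModelResponseContext;

// Assumed callback/context names; ChatModelListener callbacks are expected to have
// default implementations, so only the needed one is overridden.
ChatModelListener loggingListener = new ChatModelListener() {
    @Override
    public void onResponse(ChatModelResponseContext responseContext) {
        // Inspect the finished exchange here (token usage, finish reason, ...).
        System.out.println("chat response received");
    }
};

OpenAiStreamingChatModel model = OpenAiStreamingChatModel.builder()
        .apiKey(System.getenv("OPENAI_API_KEY"))
        .modelName("gpt-4o-mini")
        .listeners(List.of(loggingListener))
        .build();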
build
public OpenAiStreamingChatModel build()