Package dev.langchain4j.model.github
Class GitHubModelsStreamingChatModel.Builder
java.lang.Object
dev.langchain4j.model.github.GitHubModelsStreamingChatModel.Builder
Enclosing class:
GitHubModelsStreamingChatModel

Constructor Summary

Method Summary
All methods return GitHubModelsStreamingChatModel.Builder, except build().

build()
chatCompletionsAsyncClient(com.azure.ai.inference.ChatCompletionsAsyncClient client)
customHeaders(Map<String, String> customHeaders)
endpoint(String endpoint)
    Sets the GitHub Models endpoint.
frequencyPenalty(Double frequencyPenalty)
gitHubToken(String gitHubToken)
    Sets the GitHub token to access GitHub Models.
listeners(List<ChatModelListener> listeners)
logRequestsAndResponses(boolean logRequestsAndResponses)
maxRetries(Integer maxRetries)
modelName(GitHubModelsChatModelName modelName)
    Sets the model name in Azure OpenAI.
presencePenalty(Double presencePenalty)
proxyOptions(com.azure.core.http.ProxyOptions proxyOptions)
responseFormat(com.azure.ai.inference.models.ChatCompletionsResponseFormat responseFormat)
serviceVersion(com.azure.ai.inference.ModelServiceVersion serviceVersion)
    Sets the Azure OpenAI API service version.
temperature(Double temperature)
userAgentSuffix(String userAgentSuffix)
Constructor Details

Builder
public Builder()

Method Details
endpoint
Sets the GitHub Models endpoint. The default endpoint is used if this isn't set.
Parameters:
endpoint - The GitHub Models endpoint, in the format https://models.inference.ai.azure.com
Returns:
builder
serviceVersion
public GitHubModelsStreamingChatModel.Builder serviceVersion(com.azure.ai.inference.ModelServiceVersion serviceVersion)
Sets the Azure OpenAI API service version. If left blank, the latest service version is used.
Parameters:
serviceVersion - The Azure OpenAI API service version, in the format 2023-05-15
Returns:
builder
gitHubToken
Sets the GitHub token used to access GitHub Models.
Parameters:
gitHubToken - The GitHub token.
Returns:
builder
modelName
Sets the model name in Azure OpenAI. This is a mandatory parameter.
Parameters:
modelName - The model name.
Returns:
builder
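As an illustrative sketch of how these builder methods combine, the following constructs a streaming model with the mandatory and most common settings. It assumes the conventional langchain4j static builder() factory and a String overload of modelName; the GITHUB_TOKEN environment variable and the chosen model name are hypothetical, not part of this page:

```java
import dev.langchain4j.model.github.GitHubModelsStreamingChatModel;

public class Example {
    public static void main(String[] args) {
        // Assumed: GITHUB_TOKEN holds a GitHub token with Models access.
        GitHubModelsStreamingChatModel model = GitHubModelsStreamingChatModel.builder()
                .gitHubToken(System.getenv("GITHUB_TOKEN"))
                .modelName("gpt-4o-mini")        // mandatory parameter; example value
                .temperature(0.7)                // optional sampling settings
                .logRequestsAndResponses(true)   // optional diagnostics
                .build();
    }
}
```

If endpoint(...) is not called, the default GitHub Models endpoint (https://models.inference.ai.azure.com) is used, per the endpoint documentation above.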
modelName

maxTokens

temperature

topP

stop

presencePenalty

frequencyPenalty

seed

responseFormat
public GitHubModelsStreamingChatModel.Builder responseFormat(com.azure.ai.inference.models.ChatCompletionsResponseFormat responseFormat)

timeout

maxRetries

proxyOptions
public GitHubModelsStreamingChatModel.Builder proxyOptions(com.azure.core.http.ProxyOptions proxyOptions)

logRequestsAndResponses
public GitHubModelsStreamingChatModel.Builder logRequestsAndResponses(boolean logRequestsAndResponses)

chatCompletionsAsyncClient
public GitHubModelsStreamingChatModel.Builder chatCompletionsAsyncClient(com.azure.ai.inference.ChatCompletionsAsyncClient client)

userAgentSuffix

listeners

customHeaders

build