Class GitHubModelsChatModel.Builder
java.lang.Object
dev.langchain4j.model.github.GitHubModelsChatModel.Builder
Enclosing class:
GitHubModelsChatModel

Constructor Summary
Constructors: Builder()

Method Summary
All methods return GitHubModelsChatModel.Builder, except build(), which returns the configured GitHubModelsChatModel.

build()
chatCompletionsClient(com.azure.ai.inference.ChatCompletionsClient chatCompletionsClient)
    Sets the Azure AI Inference API client.
customHeaders(Map<String, String> customHeaders)
endpoint(String endpoint)
    Sets the GitHub Models endpoint.
frequencyPenalty(Double frequencyPenalty)
gitHubToken(String gitHubToken)
    Sets the GitHub token to access GitHub Models.
listeners(List<ChatModelListener> listeners)
logRequestsAndResponses(Boolean logRequestsAndResponses)
maxRetries(Integer maxRetries)
modelName(GitHubModelsChatModelName modelName)
    Sets the model name in the Azure AI Inference API.
presencePenalty(Double presencePenalty)
proxyOptions(com.azure.core.http.ProxyOptions proxyOptions)
responseFormat(com.azure.ai.inference.models.ChatCompletionsResponseFormat responseFormat)
serviceVersion(com.azure.ai.inference.ModelServiceVersion serviceVersion)
    Sets the Azure OpenAI API service version.
temperature(Double temperature)
userAgentSuffix(String userAgentSuffix)
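Illustrative usage sketch (not part of the original Javadoc): it assumes the token is available through a GITHUB_TOKEN environment variable, that "gpt-4o-mini" is an available model identifier, and that a String overload of modelName exists; adjust these to your setup.

    import dev.langchain4j.model.github.GitHubModelsChatModel;

    public class GitHubModelsChatModelBuilderExample {

        public static void main(String[] args) {
            // Minimal configuration: a GitHub token plus the mandatory model name;
            // every other builder parameter keeps its default value.
            GitHubModelsChatModel model = new GitHubModelsChatModel.Builder()
                    .gitHubToken(System.getenv("GITHUB_TOKEN")) // assumed environment variable
                    .modelName("gpt-4o-mini")                   // assumed model identifier
                    .temperature(0.2)
                    .logRequestsAndResponses(true)
                    .build();
            // 'model' can now be used wherever langchain4j expects a chat model.
        }
    }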
Constructor Details
Builder
public Builder()
Method Details
endpoint
public GitHubModelsChatModel.Builder endpoint(String endpoint)
Sets the GitHub Models endpoint. The default endpoint will be used if this isn't set.
Parameters:
    endpoint - The GitHub Models endpoint, in the format: https://models.inference.ai.azure.com
Returns:
    builder
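A fragment building on the sketch above: it sets the endpoint explicitly to the documented default URL, which is only necessary when targeting a non-default endpoint.

    // Explicitly set the GitHub Models endpoint; omitting this call has the
    // same effect, since the value below is the default.
    GitHubModelsChatModel.Builder builder = new GitHubModelsChatModel.Builder()
            .endpoint("https://models.inference.ai.azure.com");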
serviceVersion
public GitHubModelsChatModel.Builder serviceVersion(com.azure.ai.inference.ModelServiceVersion serviceVersion)
Sets the Azure OpenAI API service version. If left blank, the latest service version will be used.
Parameters:
    serviceVersion - The Azure OpenAI API service version, in the format: 2023-05-15
Returns:
    builder
gitHubToken
public GitHubModelsChatModel.Builder gitHubToken(String gitHubToken)
Sets the GitHub token used to access GitHub Models.
Parameters:
    gitHubToken - The GitHub token.
Returns:
    builder
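A fragment showing the usual way to supply the token, assuming it is exposed as the GITHUB_TOKEN environment variable (for example a personal access token, or the token injected into a GitHub Actions workflow).

    // Read the token from the environment rather than hard-coding it.
    String gitHubToken = System.getenv("GITHUB_TOKEN"); // assumed variable name
    GitHubModelsChatModel.Builder builder = new GitHubModelsChatModel.Builder()
            .gitHubToken(gitHubToken);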
modelName
public GitHubModelsChatModel.Builder modelName(GitHubModelsChatModelName modelName)
Sets the model name in the Azure AI Inference API. This is a mandatory parameter.
Parameters:
    modelName - The model name.
Returns:
    builder
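A fragment using the enum overload documented above; GitHubModelsChatModelName.GPT_4_O_MINI is assumed to be one of the enum's constants and stands in for whichever model you want to use.

    // Select the model via the GitHubModelsChatModelName enum (mandatory parameter).
    GitHubModelsChatModel.Builder builder = new GitHubModelsChatModel.Builder()
            .gitHubToken(System.getenv("GITHUB_TOKEN"))                // assumed environment variable
            .modelName(GitHubModelsChatModelName.GPT_4_O_MINI);        // assumed enum constant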
modelName

maxTokens

temperature

topP

stop

presencePenalty

frequencyPenalty

seed
responseFormat
public GitHubModelsChatModel.Builder responseFormat(com.azure.ai.inference.models.ChatCompletionsResponseFormat responseFormat)

timeout
maxRetries

proxyOptions

logRequestsAndResponses

userAgentSuffix
chatCompletionsClient
public GitHubModelsChatModel.Builder chatCompletionsClient(com.azure.ai.inference.ChatCompletionsClient chatCompletionsClient)
Sets the Azure AI Inference API client. This is an optional parameter; use it if you need more flexibility than the common parameters provide.
Parameters:
    chatCompletionsClient - The Azure AI Inference API client.
Returns:
    builder
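A sketch of supplying a pre-built client, assuming the azure-ai-inference ChatCompletionsClientBuilder follows the usual Azure SDK pattern (endpoint, credential, buildClient); exact overloads may differ between beta releases of that SDK, and the environment variable and model identifier are illustrative assumptions.

    import com.azure.ai.inference.ChatCompletionsClient;
    import com.azure.ai.inference.ChatCompletionsClientBuilder;
    import com.azure.core.credential.AzureKeyCredential;
    import dev.langchain4j.model.github.GitHubModelsChatModel;

    public class CustomChatCompletionsClientExample {

        public static void main(String[] args) {
            // Build the low-level Azure AI Inference client yourself when you need
            // settings that the common builder parameters do not expose.
            ChatCompletionsClient client = new ChatCompletionsClientBuilder()
                    .endpoint("https://models.inference.ai.azure.com")                 // documented default endpoint
                    .credential(new AzureKeyCredential(System.getenv("GITHUB_TOKEN"))) // assumed environment variable
                    .buildClient();

            GitHubModelsChatModel model = new GitHubModelsChatModel.Builder()
                    .chatCompletionsClient(client)
                    .modelName("gpt-4o-mini") // assumed model identifier
                    .build();
        }
    }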
listeners

customHeaders

build