Package dev.langchain4j.model.ollama
Class OllamaChatModel
java.lang.Object
dev.langchain4j.model.ollama.OllamaChatModel
- All Implemented Interfaces:
ChatModel
Nested Class Summary
Nested Classes
static class OllamaChatModel.OllamaChatModelBuilder
Constructor Summary
Constructors
OllamaChatModel(HttpClientBuilder httpClientBuilder, String baseUrl, String modelName, Double temperature, Integer topK, Double topP, Double repeatPenalty, Integer seed, Integer numPredict, Integer numCtx, List<String> stop, String format, ResponseFormat responseFormat, Duration timeout, Integer maxRetries, Map<String, String> customHeaders, Boolean logRequests, Boolean logResponses, List<ChatModelListener> listeners, Set<Capability> supportedCapabilities)
Method Summary
- builder()
- chat(ChatRequest request): This is the main API to interact with the chat model.
- supportedCapabilities()
- listeners()
- provider()
Constructor Details
- OllamaChatModel
public OllamaChatModel(HttpClientBuilder httpClientBuilder, String baseUrl, String modelName, Double temperature, Integer topK, Double topP, Double repeatPenalty, Integer seed, Integer numPredict, Integer numCtx, List<String> stop, String format, ResponseFormat responseFormat, Duration timeout, Integer maxRetries, Map<String, String> customHeaders, Boolean logRequests, Boolean logResponses, List<ChatModelListener> listeners, Set<Capability> supportedCapabilities)
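For illustration, a minimal construction sketch using the builder() factory method rather than this constructor directly. It assumes the builder exposes setters named after the constructor parameters above; the base URL and model name shown are placeholders, not defaults of this class.

    import dev.langchain4j.model.ollama.OllamaChatModel;
    import java.time.Duration;

    // Assumed: builder setters mirror the constructor parameters listed above.
    OllamaChatModel model = OllamaChatModel.builder()
            .baseUrl("http://localhost:11434")   // placeholder local Ollama endpoint
            .modelName("llama3.1")               // any model already pulled into Ollama
            .temperature(0.2)
            .timeout(Duration.ofSeconds(60))
            .logRequests(true)
            .logResponses(true)
            .build();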
Method Details
- builder
- chat
Description copied from interface: ChatModel
This is the main API to interact with the chat model.
- Specified by:
chat in interface ChatModel
- Parameters:
request - a ChatRequest, containing all the inputs to the LLM
- Returns:
a ChatResponse, containing all the outputs from the LLM
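For illustration, a minimal usage sketch of chat(ChatRequest). It assumes the ChatRequest builder, UserMessage.from(String), and the ChatResponse.aiMessage() accessor from the ChatModel API, and it reuses a previously built OllamaChatModel instance named model:

    import dev.langchain4j.data.message.UserMessage;
    import dev.langchain4j.model.chat.request.ChatRequest;
    import dev.langchain4j.model.chat.response.ChatResponse;

    // Assumed: "model" is an OllamaChatModel built as in the constructor sketch above.
    ChatRequest request = ChatRequest.builder()
            .messages(UserMessage.from("Why is the sky blue?"))
            .build();

    ChatResponse response = model.chat(request);
    System.out.println(response.aiMessage().text());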
- supportedCapabilities
- Specified by:
supportedCapabilities in interface ChatModel
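For illustration, a minimal sketch of inspecting the returned capability set. It assumes the Capability enum (for example Capability.RESPONSE_FORMAT_JSON_SCHEMA) from dev.langchain4j.model.chat and that the set reflects the supportedCapabilities passed at construction:

    import dev.langchain4j.model.chat.Capability;

    // Assumed: "model" is the OllamaChatModel from the earlier sketches.
    if (model.supportedCapabilities().contains(Capability.RESPONSE_FORMAT_JSON_SCHEMA)) {
        // The model may be asked for JSON-schema-constrained (structured) output.
    }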
- listeners
- provider