Package dev.langchain4j.model.ollama

Class OllamaChatModel

java.lang.Object
    dev.langchain4j.model.ollama.OllamaChatModel

All Implemented Interfaces:
    ChatLanguageModel
Constructor Summary

Constructors:
    OllamaChatModel(HttpClientBuilder httpClientBuilder, String baseUrl, String modelName, Double temperature, Integer topK, Double topP, Double repeatPenalty, Integer seed, Integer numPredict, Integer numCtx, List<String> stop, String format, ResponseFormat responseFormat, Duration timeout, Integer maxRetries, Map<String, String> customHeaders, Boolean logRequests, Boolean logResponses, List<ChatModelListener> listeners, Set<Capability> supportedCapabilities)
Method Summary
Modifier and TypeMethodDescriptionbuilder()
chat
(ChatRequest request) This is the main API to interact with the chat model.provider()
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface dev.langchain4j.model.chat.ChatLanguageModel
chat, chat, chat, defaultRequestParameters, doChat
Constructor Details

OllamaChatModel

public OllamaChatModel(HttpClientBuilder httpClientBuilder, String baseUrl, String modelName, Double temperature, Integer topK, Double topP, Double repeatPenalty, Integer seed, Integer numPredict, Integer numCtx, List<String> stop, String format, ResponseFormat responseFormat, Duration timeout, Integer maxRetries, Map<String, String> customHeaders, Boolean logRequests, Boolean logResponses, List<ChatModelListener> listeners, Set<Capability> supportedCapabilities)
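In practice the model is usually constructed through the builder rather than this all-arguments constructor. A minimal sketch, assuming a local Ollama server on the default port (11434) with the "llama3.1" model already pulled (the base URL and model name are illustrative assumptions, not defaults mandated by this class):

```java
import dev.langchain4j.model.ollama.OllamaChatModel;
import java.time.Duration;

public class OllamaChatModelBuilderExample {
    public static void main(String[] args) {
        // Assumption: a local Ollama instance on the default port
        // with the "llama3.1" model pulled via `ollama pull llama3.1`.
        OllamaChatModel model = OllamaChatModel.builder()
                .baseUrl("http://localhost:11434")
                .modelName("llama3.1")
                .temperature(0.7)
                .timeout(Duration.ofSeconds(60))
                .build();
    }
}
```

Unset builder parameters fall back to the library's defaults, so only baseUrl and modelName are typically required.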
Method Details

builder
chat

Description copied from interface: ChatLanguageModel
This is the main API to interact with the chat model. A temporary default implementation of this method is necessary until all ChatLanguageModel implementations adopt it; it should be removed once that occurs.

Specified by:
    chat in interface ChatLanguageModel
Parameters:
    request - a ChatRequest containing all the inputs to the LLM
Returns:
    a ChatResponse containing all the outputs from the LLM
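A short sketch of the request/response flow described above, assuming a local Ollama server with a pulled "llama3.1" model (both are illustrative assumptions):

```java
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.chat.request.ChatRequest;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.ollama.OllamaChatModel;

public class OllamaChatExample {
    public static void main(String[] args) {
        // Assumption: local Ollama instance with "llama3.1" available.
        OllamaChatModel model = OllamaChatModel.builder()
                .baseUrl("http://localhost:11434")
                .modelName("llama3.1")
                .build();

        // Bundle all inputs to the LLM into a ChatRequest.
        ChatRequest request = ChatRequest.builder()
                .messages(UserMessage.from("Why is the sky blue?"))
                .build();

        // The ChatResponse carries all outputs, including the AI message.
        ChatResponse response = model.chat(request);
        System.out.println(response.aiMessage().text());
    }
}
```

The same `chat` call is inherited from ChatLanguageModel, so code written against the interface works unchanged with this Ollama-backed implementation.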
supportedCapabilities

Specified by:
    supportedCapabilities in interface ChatLanguageModel
listeners

Specified by:
    listeners in interface ChatLanguageModel
provider

Specified by:
    provider in interface ChatLanguageModel