Package dev.langchain4j.model.ollama
Class OllamaChatModel
java.lang.Object
dev.langchain4j.model.ollama.OllamaChatModel
All Implemented Interfaces: ChatLanguageModel
Nested Class Summary
Constructor Summary
OllamaChatModel(String baseUrl, String modelName, Double temperature, Integer topK, Double topP, Double repeatPenalty, Integer seed, Integer numPredict, Integer numCtx, List<String> stop, String format, ResponseFormat responseFormat, Duration timeout, Integer maxRetries, Map<String, String> customHeaders, Boolean logRequests, Boolean logResponses, List<ChatModelListener> listeners, Set<Capability> supportedCapabilities)
Method Summary
builder()
chat(ChatRequest request) - This is the main API to interact with the chat model.
generate(List<ChatMessage> messages) - Generates a response from the model based on a sequence of messages.
generate(List<ChatMessage> messages, List<ToolSpecification> toolSpecifications) - Generates a response from the model based on a list of messages and a list of tool specifications.
supportedCapabilities()
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface dev.langchain4j.model.chat.ChatLanguageModel
chat, defaultRequestParameters, generate, generate, generate
Constructor Details
OllamaChatModel
public OllamaChatModel(String baseUrl, String modelName, Double temperature, Integer topK, Double topP, Double repeatPenalty, Integer seed, Integer numPredict, Integer numCtx, List<String> stop, String format, ResponseFormat responseFormat, Duration timeout, Integer maxRetries, Map<String, String> customHeaders, Boolean logRequests, Boolean logResponses, List<ChatModelListener> listeners, Set<Capability> supportedCapabilities)
Method Details
builder
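Because the canonical constructor takes many parameters, instances are normally assembled through the builder returned by this method. A minimal sketch, assuming a local Ollama server on its default port and a model name that has already been pulled; both values are placeholders, not part of this API:

import dev.langchain4j.model.ollama.OllamaChatModel;

import java.time.Duration;

public class BuildOllamaChatModel {              // hypothetical example class
    public static void main(String[] args) {
        // Build a chat model against a local Ollama instance; options left unset fall back to defaults.
        OllamaChatModel model = OllamaChatModel.builder()
                .baseUrl("http://localhost:11434")   // assumed default Ollama endpoint
                .modelName("llama3.1")               // placeholder: any model available in your Ollama install
                .temperature(0.7)
                .timeout(Duration.ofSeconds(60))
                .build();

        // The model can now be used via chat(...) or generate(...), shown below.
    }
}

The builder setters mirror the constructor parameters listed above; only the options you set are sent to Ollama.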
generate
public Response<AiMessage> generate(List<ChatMessage> messages)
Description copied from interface: ChatLanguageModel
Generates a response from the model based on a sequence of messages. Typically, the sequence contains messages in the following order: System (optional) - User - AI - User - AI - User ...
Specified by:
generate in interface ChatLanguageModel
Parameters:
messages - A list of messages.
Returns:
The response generated by the model.
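A usage sketch, assuming SystemMessage, UserMessage, AiMessage and Response live in their usual langchain4j packages (dev.langchain4j.data.message and dev.langchain4j.model.output); the base URL and model name are placeholders:

import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.data.message.ChatMessage;
import dev.langchain4j.data.message.SystemMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.ollama.OllamaChatModel;
import dev.langchain4j.model.output.Response;

import java.util.List;

public class GenerateExample {                   // hypothetical example class
    public static void main(String[] args) {
        OllamaChatModel model = OllamaChatModel.builder()
                .baseUrl("http://localhost:11434")   // placeholder endpoint
                .modelName("llama3.1")               // placeholder model name
                .build();

        // System (optional) - User, in the order described above.
        List<ChatMessage> messages = List.of(
                SystemMessage.from("You are a concise assistant."),
                UserMessage.from("What is the capital of France?"));

        Response<AiMessage> response = model.generate(messages);
        System.out.println(response.content().text());   // the model's textual reply
    }
}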
generate
public Response<AiMessage> generate(List<ChatMessage> messages, List<ToolSpecification> toolSpecifications)
Description copied from interface: ChatLanguageModel
Generates a response from the model based on a list of messages and a list of tool specifications. The response may either be a text message or a request to execute one of the specified tools. Typically, the list contains messages in the following order: System (optional) - User - AI - User - AI - User ...
Specified by:
generate in interface ChatLanguageModel
Parameters:
messages - A list of messages.
toolSpecifications - A list of tools that the model is allowed to execute. The model autonomously decides whether to use any of these tools.
Returns:
The response generated by the model. AiMessage can contain either a textual response or a request to execute one of the tools.
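A sketch of the tool-calling flow, assuming ToolSpecification and ToolExecutionRequest from dev.langchain4j.agent.tool and an underlying Ollama model that supports tool calling; the get_weather tool is hypothetical and its parameter schema is omitted for brevity:

import dev.langchain4j.agent.tool.ToolExecutionRequest;
import dev.langchain4j.agent.tool.ToolSpecification;
import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.data.message.ChatMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.ollama.OllamaChatModel;
import dev.langchain4j.model.output.Response;

import java.util.List;

public class ToolCallingExample {                // hypothetical example class
    public static void main(String[] args) {
        OllamaChatModel model = OllamaChatModel.builder()
                .baseUrl("http://localhost:11434")   // placeholder endpoint
                .modelName("llama3.1")               // placeholder; must be a tool-capable model
                .build();

        // A hypothetical tool; real tools would also declare a parameter schema.
        ToolSpecification getWeather = ToolSpecification.builder()
                .name("get_weather")
                .description("Returns the current weather for a given city")
                .build();

        List<ChatMessage> messages = List.of(UserMessage.from("What is the weather in Paris?"));
        Response<AiMessage> response = model.generate(messages, List.of(getWeather));

        AiMessage aiMessage = response.content();
        if (aiMessage.hasToolExecutionRequests()) {
            // The model asked to execute a tool instead of answering directly.
            for (ToolExecutionRequest request : aiMessage.toolExecutionRequests()) {
                System.out.println(request.name() + " " + request.arguments());
            }
        } else {
            System.out.println(aiMessage.text());    // plain textual answer
        }
    }
}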
chat
public ChatResponse chat(ChatRequest request)
Description copied from interface: ChatLanguageModel
This is the main API to interact with the chat model. All the existing generate(...) methods (see below) will be deprecated and removed before the 1.0.0 release. A temporary default implementation of this method is necessary until all ChatLanguageModel implementations adopt it. It should be removed once that occurs.
Specified by:
chat in interface ChatLanguageModel
Parameters:
request - a ChatRequest, containing all the inputs to the LLM
Returns:
a ChatResponse, containing all the outputs from the LLM
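A sketch of the request/response style call, assuming ChatRequest and ChatResponse live in dev.langchain4j.model.chat.request and dev.langchain4j.model.chat.response; endpoint and model name are placeholders as before:

import dev.langchain4j.data.message.ChatMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.chat.request.ChatRequest;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.ollama.OllamaChatModel;

import java.util.List;

public class ChatRequestExample {                // hypothetical example class
    public static void main(String[] args) {
        OllamaChatModel model = OllamaChatModel.builder()
                .baseUrl("http://localhost:11434")   // placeholder endpoint
                .modelName("llama3.1")               // placeholder model name
                .build();

        // Bundle all inputs into a single ChatRequest.
        List<ChatMessage> messages = List.of(UserMessage.from("Explain idempotency in one sentence."));
        ChatRequest request = ChatRequest.builder()
                .messages(messages)
                .build();

        ChatResponse response = model.chat(request);
        System.out.println(response.aiMessage().text());
    }
}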
supportedCapabilities
public Set<Capability> supportedCapabilities()
Specified by:
supportedCapabilities in interface ChatLanguageModel