Interface ChatLanguageModel

All Known Implementing Classes:
AbstractBedrockChatModel, AnthropicChatModel, AzureOpenAiChatModel, BedrockAI21LabsChatModel, BedrockAnthropicCompletionChatModel, BedrockAnthropicMessageChatModel, BedrockChatModel, BedrockCohereChatModel, BedrockLlamaChatModel, BedrockMistralAiChatModel, BedrockStabilityAIChatModel, BedrockTitanChatModel, DisabledChatLanguageModel, GitHubModelsChatModel, GoogleAiGeminiChatModel, HuggingFaceChatModel, JlamaChatModel, LocalAiChatModel, MistralAiChatModel, OllamaChatModel, OpenAiChatModel, VertexAiChatModel, VertexAiGeminiChatModel, WorkersAiChatModel

public interface ChatLanguageModel
Represents a language model that has a chat API.

Method Details

    • chat

      default ChatResponse chat(ChatRequest chatRequest)
      This is the main API for interacting with the chat model. The default implementation is temporary and exists only until all ChatLanguageModel implementations adopt this method; it should be removed once that happens.
      Parameters:
      chatRequest - a ChatRequest, containing all the inputs to the LLM
      Returns:
      a ChatResponse, containing all the outputs from the LLM
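      For illustration, a minimal sketch of this request/response flow. The OpenAiChatModel builder, the model name, and the package paths shown are assumptions tied to one implementation and library version; any implementing class listed above can be substituted:

          import dev.langchain4j.data.message.SystemMessage;
          import dev.langchain4j.data.message.UserMessage;
          import dev.langchain4j.model.chat.ChatLanguageModel;
          import dev.langchain4j.model.chat.request.ChatRequest;
          import dev.langchain4j.model.chat.response.ChatResponse;
          import dev.langchain4j.model.openai.OpenAiChatModel; // assumed implementation

          public class ChatRequestExample {

              public static void main(String[] args) {
                  // Any implementing class can stand in here; OpenAiChatModel is an assumption.
                  ChatLanguageModel model = OpenAiChatModel.builder()
                          .apiKey(System.getenv("OPENAI_API_KEY"))
                          .modelName("gpt-4o-mini") // assumed model name
                          .build();

                  // The ChatRequest carries all inputs to the LLM
                  ChatRequest request = ChatRequest.builder()
                          .messages(
                                  SystemMessage.from("You are a concise assistant."),
                                  UserMessage.from("What is the capital of France?"))
                          .build();

                  // The ChatResponse carries all outputs from the LLM
                  ChatResponse response = model.chat(request);
                  System.out.println(response.aiMessage().text());
              }
          }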
    • chat

      default String chat(String userMessage)
      A convenience method that takes the user message as a String and returns only the generated response as a String.
    • chat

      default ChatResponse chat(ChatMessage... messages)
    • chat

      default ChatResponse chat(List<ChatMessage> messages)
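      The three chat(...) overloads above are convenience entry points over chat(ChatRequest). A brief sketch, reusing the model instance from the previous example (the message factory methods and the java.util.List and dev.langchain4j.data.message imports are assumed to come from the same library version):

          // chat(String): pass the user message as a String, get only the generated text back
          String reply = model.chat("Summarize the plot of Hamlet in one sentence.");

          // chat(ChatMessage...): varargs overload returning a full ChatResponse
          ChatResponse response = model.chat(
                  SystemMessage.from("You are a helpful assistant."),
                  UserMessage.from("What is 2 + 2?"));

          // chat(List<ChatMessage>): pass an existing conversation history
          List<ChatMessage> history = List.of(
                  UserMessage.from("My name is Alice."),
                  AiMessage.from("Nice to meet you, Alice!"),
                  UserMessage.from("What is my name?"));
          ChatResponse followUp = model.chat(history);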
    • listeners

      default List<ChatModelListener> listeners()
    • doChat

      default ChatResponse doChat(ChatRequest chatRequest)
    • validate

      static void validate(ChatRequestParameters parameters)
    • validate

      static void validate(ToolChoice toolChoice)
    • validate

      static void validate(ResponseFormat responseFormat)
    • defaultRequestParameters

      default ChatRequestParameters defaultRequestParameters()
    • supportedCapabilities

      default Set<Capability> supportedCapabilities()
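      A short sketch of how a caller might inspect an implementation before building a request; the Capability constant name and the ChatRequestParameters type are assumptions about the library version in use:

          // Capabilities advertised by this implementation (constant name is an assumption)
          Set<Capability> capabilities = model.supportedCapabilities();
          if (capabilities.contains(Capability.RESPONSE_FORMAT_JSON_SCHEMA)) {
              // It should be safe to request a JSON-schema ResponseFormat in the ChatRequest.
          }

          // Parameters (model name, temperature, etc.) configured on the model instance
          ChatRequestParameters defaults = model.defaultRequestParameters();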
    • generate

      @Deprecated(forRemoval=true) default String generate(String userMessage)
      Deprecated, for removal: This API element is subject to removal in a future version.
      Please use chat(String) instead.
      Generates a response from the model based on a message from a user. This is a convenience method that receives the message from a user as a String and returns only the generated response.
      Parameters:
      userMessage - The message from the user.
      Returns:
      The response generated by the model.
    • generate

      @Deprecated(forRemoval=true) default Response<AiMessage> generate(ChatMessage... messages)
      Deprecated, for removal: This API element is subject to removal in a future version.
      Please use chat(ChatMessage...) instead.
      Generates a response from the model based on a sequence of messages. Typically, the sequence contains messages in the following order: System (optional) - User - AI - User - AI - User ...
      Parameters:
      messages - An array of messages.
      Returns:
      The response generated by the model.
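      A minimal migration sketch from this deprecated overload to its chat replacement (the Response accessor shown is an assumption about the library's Response type):

          // Deprecated: returns Response<AiMessage>
          Response<AiMessage> legacy = model.generate(
                  SystemMessage.from("You are a helpful assistant."),
                  UserMessage.from("Hello!"));
          AiMessage legacyMessage = legacy.content();

          // Replacement: chat(ChatMessage...) returns a ChatResponse instead
          ChatResponse replacement = model.chat(
                  SystemMessage.from("You are a helpful assistant."),
                  UserMessage.from("Hello!"));
          AiMessage currentMessage = replacement.aiMessage();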
    • generate

      @Deprecated(forRemoval=true) default Response<AiMessage> generate(List<ChatMessage> messages)
      Deprecated, for removal: This API element is subject to removal in a future version.
      Please use chat(List) instead.
      Generates a response from the model based on a sequence of messages. Typically, the sequence contains messages in the following order: System (optional) - User - AI - User - AI - User ...
      Parameters:
      messages - A list of messages.
      Returns:
      The response generated by the model.
    • generate

      @Deprecated(forRemoval=true) default Response<AiMessage> generate(List<ChatMessage> messages, List<ToolSpecification> toolSpecifications)
      Deprecated, for removal: This API element is subject to removal in a future version.
      Generates a response from the model based on a list of messages and a list of tool specifications. The response may either be a text message or a request to execute one of the specified tools. Typically, the list contains messages in the following order: System (optional) - User - AI - User - AI - User ...
      Parameters:
      messages - A list of messages.
      toolSpecifications - A list of tools that the model is allowed to execute. The model autonomously decides whether to use any of these tools.
      Returns:
      The response generated by the model. AiMessage can contain either a textual response or a request to execute one of the tools.
      Throws:
      UnsupportedFeatureException - if tools are not supported by the underlying LLM API
    • generate

      @Deprecated(forRemoval=true) default Response<AiMessage> generate(List<ChatMessage> messages, ToolSpecification toolSpecification)
      Deprecated, for removal: This API element is subject to removal in a future version.
      Generates a response from the model based on a list of messages and a single tool specification. The model is forced to execute the specified tool. This is usually achieved by setting `tool_choice=ANY` in the LLM provider API.
      Typically, the list contains messages in the following order: System (optional) - User - AI - User - AI - User ...
      Parameters:
      messages - A list of messages.
      toolSpecification - The specification of a tool that must be executed. The model is forced to execute this tool.
      Returns:
      The response generated by the model. AiMessage contains a request to execute the specified tool.
      Throws:
      UnsupportedFeatureException - if tools are not supported by the underlying LLM API
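      For the tool-related generate overloads, a hedged sketch of declaring a tool and inspecting the resulting tool-execution request. The ToolSpecification builder is part of the library, but the JsonObjectSchema-based parameter schema shown here is an assumption that differs across library versions; the non-deprecated path passes the same specifications through a ChatRequest:

          // Declare a tool the model may (or, with the single-tool overload, must) execute.
          // The parameter-schema builder below is an assumption; older versions use a different API.
          ToolSpecification getWeather = ToolSpecification.builder()
                  .name("getWeather")
                  .description("Returns the current weather for a city")
                  .parameters(JsonObjectSchema.builder()
                          .addStringProperty("city")
                          .required("city")
                          .build())
                  .build();

          Response<AiMessage> response = model.generate(
                  List.of(UserMessage.from("What is the weather in Paris?")),
                  List.of(getWeather));

          AiMessage aiMessage = response.content();
          if (aiMessage.hasToolExecutionRequests()) {
              // The model asked to execute a tool instead of (or in addition to) answering in text.
              aiMessage.toolExecutionRequests().forEach(toolRequest ->
                      System.out.println(toolRequest.name() + " -> " + toolRequest.arguments()));
          }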