Class DisabledStreamingChatLanguageModel

java.lang.Object
    dev.langchain4j.model.chat.DisabledStreamingChatLanguageModel
All Implemented Interfaces:
StreamingChatLanguageModel

public class DisabledStreamingChatLanguageModel extends Object implements StreamingChatLanguageModel
A StreamingChatLanguageModel that throws a ModelDisabledException from all of its methods.

This can be used in tests, or in libraries that extend this one, to conditionally enable or disable functionality.
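For example, a library that wraps streaming chat can fall back to this implementation when the feature is switched off, so that callers fail fast with a ModelDisabledException instead of receiving null. A minimal sketch, assuming a hypothetical chatEnabled flag and createRealStreamingModel() factory that are not part of this class:

import dev.langchain4j.model.chat.DisabledStreamingChatLanguageModel;
import dev.langchain4j.model.chat.StreamingChatLanguageModel;

public class StreamingChatConfig {

    // Hypothetical factory method: when streaming chat is switched off, return the
    // disabled model so that callers fail fast with a ModelDisabledException
    // instead of receiving null or a silent no-op.
    public static StreamingChatLanguageModel streamingModel(boolean chatEnabled) {
        if (!chatEnabled) {
            return new DisabledStreamingChatLanguageModel();
        }
        return createRealStreamingModel();
    }

    // Placeholder for a provider-specific model; assumed here for illustration only.
    private static StreamingChatLanguageModel createRealStreamingModel() {
        throw new UnsupportedOperationException("Not configured in this sketch");
    }
}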

  • Constructor Details

    • DisabledStreamingChatLanguageModel

      public DisabledStreamingChatLanguageModel()
  • Method Details

    • generate

      public void generate(String userMessage, StreamingResponseHandler<AiMessage> handler)
      Description copied from interface: StreamingChatLanguageModel
      Generates a response from the model based on a message from a user.
      Specified by:
      generate in interface StreamingChatLanguageModel
      Parameters:
      userMessage - The message from the user.
      handler - The handler for streaming the response.
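      Example (an illustrative sketch, not part of the library): calling this overload on the disabled model. Per the class description above, the call throws a ModelDisabledException instead of invoking the handler, so the sketch simply catches it as a RuntimeException.

      import dev.langchain4j.data.message.AiMessage;
      import dev.langchain4j.model.StreamingResponseHandler;
      import dev.langchain4j.model.chat.DisabledStreamingChatLanguageModel;
      import dev.langchain4j.model.chat.StreamingChatLanguageModel;
      import dev.langchain4j.model.output.Response;

      public class DisabledGenerateStringExample {

          public static void main(String[] args) {
              StreamingChatLanguageModel model = new DisabledStreamingChatLanguageModel();

              try {
                  model.generate("Hello, how are you?", new StreamingResponseHandler<AiMessage>() {

                      @Override
                      public void onNext(String token) {
                          System.out.print(token); // never reached with the disabled model
                      }

                      @Override
                      public void onComplete(Response<AiMessage> response) {
                          System.out.println("\nDone: " + response.content());
                      }

                      @Override
                      public void onError(Throwable error) {
                          System.err.println("Streaming failed: " + error);
                      }
                  });
              } catch (RuntimeException e) {
                  // Per the class description, a ModelDisabledException lands here.
                  System.err.println("Streaming chat is disabled: " + e.getMessage());
              }
          }
      }
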
    • generate

      public void generate(List<ChatMessage> messages, StreamingResponseHandler<AiMessage> handler)
      Description copied from interface: StreamingChatLanguageModel
      Generates a response from the model based on a sequence of messages. Typically, the sequence contains messages in the following order: System (optional) - User - AI - User - AI - User ...
      Specified by:
      generate in interface StreamingChatLanguageModel
      Parameters:
      messages - A list of messages.
      handler - The handler for streaming the response.
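      Example (an illustrative sketch, not part of the library): building a typical System/User message sequence and passing it to the disabled model. The handler callbacks are shown for completeness but are never reached, because the call throws a ModelDisabledException.

      import java.util.List;

      import dev.langchain4j.data.message.AiMessage;
      import dev.langchain4j.data.message.ChatMessage;
      import dev.langchain4j.data.message.SystemMessage;
      import dev.langchain4j.data.message.UserMessage;
      import dev.langchain4j.model.StreamingResponseHandler;
      import dev.langchain4j.model.chat.DisabledStreamingChatLanguageModel;
      import dev.langchain4j.model.output.Response;

      public class DisabledGenerateMessagesExample {

          public static void main(String[] args) {
              // Messages ordered as described above: System (optional), then User.
              List<ChatMessage> messages = List.of(
                      SystemMessage.from("You are a concise assistant."),
                      UserMessage.from("Summarize tomorrow's weather forecast."));

              try {
                  new DisabledStreamingChatLanguageModel().generate(messages, new StreamingResponseHandler<AiMessage>() {

                      @Override
                      public void onNext(String token) {
                          System.out.print(token);
                      }

                      @Override
                      public void onComplete(Response<AiMessage> response) {
                          System.out.println("\nDone: " + response.content());
                      }

                      @Override
                      public void onError(Throwable error) {
                          System.err.println("Streaming failed: " + error);
                      }
                  });
              } catch (RuntimeException e) {
                  // Expected with the disabled model, per the class description.
                  System.err.println("Streaming chat is disabled: " + e.getMessage());
              }
          }
      }
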
    • generate

      public void generate(List<ChatMessage> messages, List<ToolSpecification> toolSpecifications, StreamingResponseHandler<AiMessage> handler)
      Description copied from interface: StreamingChatLanguageModel
      Generates a response from the model based on a list of messages and a list of tool specifications. The response may either be a text message or a request to execute one of the specified tools. Typically, the list contains messages in the following order: System (optional) - User - AI - User - AI - User ...
      Specified by:
      generate in interface StreamingChatLanguageModel
      Parameters:
      messages - A list of messages.
      toolSpecifications - A list of tools that the model is allowed to execute. The model autonomously decides whether to use any of these tools.
      handler - The handler for streaming the response. AiMessage can contain either a textual response or a request to execute one of the tools.
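      Example (an illustrative sketch, not part of the library): passing a minimal tool specification list along with the messages. The get_weather tool is a made-up example and omits parameters; with the disabled model, the call throws a ModelDisabledException before any tool selection can happen.

      import java.util.List;

      import dev.langchain4j.agent.tool.ToolSpecification;
      import dev.langchain4j.data.message.AiMessage;
      import dev.langchain4j.data.message.ChatMessage;
      import dev.langchain4j.data.message.UserMessage;
      import dev.langchain4j.model.StreamingResponseHandler;
      import dev.langchain4j.model.chat.DisabledStreamingChatLanguageModel;
      import dev.langchain4j.model.output.Response;

      public class DisabledGenerateWithToolsExample {

          public static void main(String[] args) {
              List<ChatMessage> messages = List.of(UserMessage.from("What is the weather in Berlin?"));

              // Minimal tool specification; a real one would also declare its parameters.
              List<ToolSpecification> tools = List.of(ToolSpecification.builder()
                      .name("get_weather")
                      .description("Returns the current weather for a city")
                      .build());

              try {
                  new DisabledStreamingChatLanguageModel().generate(messages, tools,
                          new StreamingResponseHandler<AiMessage>() {

                              @Override
                              public void onNext(String token) {
                                  System.out.print(token);
                              }

                              @Override
                              public void onComplete(Response<AiMessage> response) {
                                  // With a real model, the AiMessage may carry tool execution requests.
                                  System.out.println("\nDone: " + response.content());
                              }

                              @Override
                              public void onError(Throwable error) {
                                  System.err.println("Streaming failed: " + error);
                              }
                          });
              } catch (RuntimeException e) {
                  // Expected with the disabled model, per the class description.
                  System.err.println("Streaming chat is disabled: " + e.getMessage());
              }
          }
      }
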
    • generate

      public void generate(List<ChatMessage> messages, ToolSpecification toolSpecification, StreamingResponseHandler<AiMessage> handler)
      Description copied from interface: StreamingChatLanguageModel
      Generates a response from the model based on a list of messages and a single tool specification. The model is forced to execute the specified tool. This is usually achieved by setting `tool_choice=ANY` in the LLM provider API.
      Specified by:
      generate in interface StreamingChatLanguageModel
      Parameters:
      messages - A list of messages.
      toolSpecification - The specification of a tool that must be executed. The model is forced to execute this tool.
      handler - The handler for streaming the response.
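      Example (an illustrative sketch, not part of the library): forcing a single tool. The convert_currency tool is a made-up example; as with the other overloads, the disabled model throws a ModelDisabledException instead of streaming a response.

      import java.util.List;

      import dev.langchain4j.agent.tool.ToolSpecification;
      import dev.langchain4j.data.message.AiMessage;
      import dev.langchain4j.data.message.ChatMessage;
      import dev.langchain4j.data.message.UserMessage;
      import dev.langchain4j.model.StreamingResponseHandler;
      import dev.langchain4j.model.chat.DisabledStreamingChatLanguageModel;
      import dev.langchain4j.model.output.Response;

      public class DisabledGenerateForcedToolExample {

          public static void main(String[] args) {
              List<ChatMessage> messages = List.of(UserMessage.from("Convert 100 USD to EUR."));

              // With a real provider, passing a single specification forces the model to call it.
              ToolSpecification convertCurrency = ToolSpecification.builder()
                      .name("convert_currency")
                      .description("Converts an amount between two currencies")
                      .build();

              try {
                  new DisabledStreamingChatLanguageModel().generate(messages, convertCurrency,
                          new StreamingResponseHandler<AiMessage>() {

                              @Override
                              public void onNext(String token) {
                                  System.out.print(token);
                              }

                              @Override
                              public void onComplete(Response<AiMessage> response) {
                                  System.out.println("\nDone: " + response.content());
                              }

                              @Override
                              public void onError(Throwable error) {
                                  System.err.println("Streaming failed: " + error);
                              }
                          });
              } catch (RuntimeException e) {
                  // Expected with the disabled model, per the class description.
                  System.err.println("Streaming chat is disabled: " + e.getMessage());
              }
          }
      }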