Class TokenWindowChatMemory

java.lang.Object
dev.langchain4j.memory.chat.TokenWindowChatMemory
All Implemented Interfaces:
ChatMemory

public class TokenWindowChatMemory extends Object implements ChatMemory
This chat memory operates as a sliding window whose size (in tokens) is controlled by a maxTokensProvider. It retains as many of the most recent messages as fit into the window. When there is not enough space for a new message, the oldest message (or messages) are evicted. Messages are indivisible: if a message does not fit, it is evicted completely rather than truncated.

The maximum number of tokens can be supplied either statically or dynamically via the maxTokensProvider. When supplied dynamically, the effective window size may change at runtime, and the sliding-window behavior always respects the latest value returned by the provider.
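The eviction behavior described above can be illustrated with a small stdlib-only sketch. This simulates the sliding window, not the actual LangChain4j implementation; the Msg record and its precomputed token counts are hypothetical stand-ins for ChatMessage and the token estimator:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Simplified model: each message carries a precomputed token count.
record Msg(String text, int tokens) {}

class SlidingTokenWindow {
    private final Deque<Msg> window = new ArrayDeque<>();
    private int total = 0;

    // maxTokens is passed per call: with a dynamic provider, the effective
    // window size may change between invocations.
    void add(Msg m, int maxTokens) {
        window.addLast(m);
        total += m.tokens();
        // Evict whole messages from the front until the window fits.
        while (total > maxTokens && !window.isEmpty()) {
            total -= window.removeFirst().tokens();
        }
    }

    int size() { return window.size(); }
    int totalTokens() { return total; }
}
```

Note that messages are evicted whole: the loop removes the oldest message entirely rather than trimming it to fit.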

The rules for SystemMessage:

  • Once added, a SystemMessage is always retained; it cannot be removed.
  • Only one SystemMessage can be held at a time.
  • If a new SystemMessage with the same content is added, it is ignored.
  • If a new SystemMessage with different content is added, the previous SystemMessage is removed. The new SystemMessage is appended to the end of the message list, unless TokenWindowChatMemory.Builder.alwaysKeepSystemMessageFirst(Boolean) is set to true, in which case it is kept at the first position.
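The SystemMessage rules above can be sketched in plain Java. This is a simplified model rather than the library's code; the "S:"/"U:" string encoding and the message contents are hypothetical:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Objects;

// Minimal model: at most one system message, identified by its text content.
class SystemMessageHolder {
    final List<String> messages = new ArrayList<>(); // "S:" prefix marks the system message

    void addSystem(String content) {
        String entry = "S:" + content;
        int existing = -1;
        for (int i = 0; i < messages.size(); i++) {
            if (messages.get(i).startsWith("S:")) { existing = i; break; }
        }
        if (existing >= 0) {
            if (Objects.equals(messages.get(existing), entry)) return; // same content: ignored
            messages.remove(existing); // different content: previous one removed
        }
        messages.add(entry); // new system message appended at the end
    }

    void addUser(String content) { messages.add("U:" + content); }
}
```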
If an AiMessage containing ToolExecutionRequest(s) is evicted, the following orphan ToolExecutionResultMessage(s) are also automatically evicted to avoid problems with some LLM providers (such as OpenAI) that prohibit sending orphan ToolExecutionResultMessage(s) in the request.
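The orphan-eviction rule can be modeled as follows. This is a stdlib-only sketch of the behavior described above; the enum-based message shapes are simplified assumptions, not the library's types:

```java
import java.util.List;

// Simplified message kinds: AI_WITH_TOOLS issues tool calls; TOOL_RESULT answers them.
enum Kind { USER, AI, AI_WITH_TOOLS, TOOL_RESULT }

class ToolAwareEviction {
    // Evict the oldest message; if it was an AiMessage containing tool
    // requests, also evict the tool results that immediately follow it,
    // so no orphan TOOL_RESULT messages remain at the front of the history.
    static void evictOldest(List<Kind> history) {
        if (history.isEmpty()) return;
        Kind evicted = history.remove(0);
        if (evicted == Kind.AI_WITH_TOOLS) {
            while (!history.isEmpty() && history.get(0) == Kind.TOOL_RESULT) {
                history.remove(0);
            }
        }
    }
}
```

Without the inner loop, the history could start with a TOOL_RESULT that answers a request no longer in memory, which some providers (such as OpenAI) reject.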

The state of chat memory is stored in ChatMemoryStore (SingleSlotChatMemoryStore is used by default).

  • Method Details

    • id

      public Object id()
      Description copied from interface: ChatMemory
      The ID of the ChatMemory.
      Specified by:
      id in interface ChatMemory
      Returns:
      The ID of the ChatMemory.
    • add

      public void add(ChatMessage message)
      Description copied from interface: ChatMemory
      Adds a message to the chat memory.
      Specified by:
      add in interface ChatMemory
      Parameters:
      message - The ChatMessage to add.
    • set

      public void set(Iterable<ChatMessage> iter)
      Description copied from interface: ChatMemory
      Replaces all messages in the chat memory with the specified messages. Unlike add, this method replaces the entire message history rather than appending to it.

      Implementations should override this method to provide more efficient atomic operations if possible. The default implementation calls ChatMemory.clear() followed by add(Iterable<ChatMessage>), which is not atomic.

      This method is typically used when the chat memory needs to be rewritten, for example to implement memory compaction.

      NOTE: This method is never called automatically by LangChain4j.

      Specified by:
      set in interface ChatMemory
      Parameters:
      iter - The ChatMessages to set. Must not be null or empty.
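One use of set is memory compaction: replacing older messages with a condensed form. The following stdlib-only sketch illustrates the idea; the string-based history and the placeholder summary are hypothetical, and in a real application the returned list would be converted to ChatMessages and passed to set(...):

```java
import java.util.ArrayList;
import java.util.List;

class Compactor {
    // Keep the last `keep` messages verbatim and collapse everything older
    // into a single summary line (a placeholder for a real LLM-generated summary).
    static List<String> compact(List<String> history, int keep) {
        if (history.size() <= keep) return new ArrayList<>(history);
        List<String> old = history.subList(0, history.size() - keep);
        List<String> result = new ArrayList<>();
        result.add("[summary of " + old.size() + " earlier messages]");
        result.addAll(history.subList(history.size() - keep, history.size()));
        return result; // pass the equivalent ChatMessages to set(...) to rewrite the memory
    }
}
```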
    • messages

      public List<ChatMessage> messages()
      Description copied from interface: ChatMemory
      Retrieves messages from the chat memory. Depending on the implementation, it may not return all previously added messages, but rather a subset, a summary, or a combination thereof.
      Specified by:
      messages in interface ChatMemory
      Returns:
      A list of ChatMessage objects that represent the current state of the chat memory.
    • clear

      public void clear()
      Description copied from interface: ChatMemory
      Clears the chat memory.
      Specified by:
      clear in interface ChatMemory
    • builder

      public static TokenWindowChatMemory.Builder builder()
    • withMaxTokens

      public static TokenWindowChatMemory withMaxTokens(int maxTokens, TokenCountEstimator tokenCountEstimator)