Class TokenWindowChatMemory

java.lang.Object
dev.langchain4j.memory.chat.TokenWindowChatMemory
All Implemented Interfaces:
ChatMemory

public class TokenWindowChatMemory extends Object implements ChatMemory
This chat memory operates as a sliding window of maxTokens tokens. It retains as many of the most recent messages as fit into the window. If there is not enough space for a new message, the oldest message (or several of the oldest) is evicted. Messages are indivisible: if a message does not fit, it is evicted completely.
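The eviction rule above can be sketched in plain Java. This is a hypothetical illustration, not the library's actual implementation: here each message carries a pre-computed token count, whereas the real class uses a tokenizer to count tokens.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Minimal sketch of a token-based sliding window (hypothetical; not the
// actual TokenWindowChatMemory implementation). Messages are indivisible:
// a message is either fully retained or fully evicted.
public class TokenWindowSketch {

    record Message(String text, int tokens) {}

    private final Deque<Message> window = new ArrayDeque<>();
    private final int maxTokens;
    private int currentTokens;

    public TokenWindowSketch(int maxTokens) {
        this.maxTokens = maxTokens;
    }

    public void add(Message message) {
        window.addLast(message);
        currentTokens += message.tokens();
        // Evict whole messages, oldest first, until the window fits again.
        while (currentTokens > maxTokens && window.size() > 1) {
            Message evicted = window.removeFirst();
            currentTokens -= evicted.tokens();
        }
    }

    public int messageCount() { return window.size(); }
    public int tokenCount() { return currentTokens; }
}
```

Note that eviction happens only when a new message is added; the window never splits a message to reclaim partial space.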

Once added, a SystemMessage is always retained. Only one SystemMessage can be held at a time. If a new SystemMessage with the same content is added, it is ignored. If a new SystemMessage with different content is added, the previous SystemMessage is removed.
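The SystemMessage rules above (ignore an identical one, replace on different content, keep at most one) can be sketched as follows. The Msg record and "role" strings are hypothetical stand-ins, not the library's own types.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Objects;

// Sketch of the SystemMessage retention rules (hypothetical types,
// not the library's own classes).
public class SystemMessageRule {

    record Msg(String role, String content) {}

    private final List<Msg> messages = new ArrayList<>();

    public void add(Msg msg) {
        if ("system".equals(msg.role())) {
            Msg existing = messages.stream()
                    .filter(m -> "system".equals(m.role()))
                    .findFirst().orElse(null);
            if (existing != null) {
                if (Objects.equals(existing.content(), msg.content())) {
                    return; // same content: the new message is ignored
                }
                messages.remove(existing); // different content: replace
            }
        }
        messages.add(msg);
    }

    public List<Msg> messages() { return List.copyOf(messages); }
}
```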

If an AiMessage containing ToolExecutionRequest(s) is evicted, the orphan ToolExecutionResultMessage(s) that follow it are automatically evicted as well, to avoid problems with LLM providers (such as OpenAI) that prohibit sending orphan ToolExecutionResultMessage(s) in a request.
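A rough sketch of this orphan-eviction behavior, with message kinds reduced to hypothetical type strings (not the library's own classes):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: when an AI message carrying tool execution requests is
// evicted, the tool result messages answering it are evicted too,
// so no orphan tool results remain (hypothetical types).
public class OrphanEviction {

    record Msg(String type) {} // e.g. "ai-with-tools", "tool-result", "user"

    public static List<Msg> evictOldest(List<Msg> messages) {
        List<Msg> result = new ArrayList<>(messages);
        if (result.isEmpty()) return result;
        Msg evicted = result.remove(0);
        if ("ai-with-tools".equals(evicted.type())) {
            // Drop the now-orphaned tool results that directly follow.
            while (!result.isEmpty() && "tool-result".equals(result.get(0).type())) {
                result.remove(0);
            }
        }
        return result;
    }
}
```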

The state of the chat memory is stored in a ChatMemoryStore (InMemoryChatMemoryStore is used by default).
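To illustrate the role such a store plays, a minimal in-memory sketch keyed by memory id is shown below. The shape (get/update/delete by id) mirrors the description above, but the class and the use of plain strings as messages are assumptions for illustration, not the library's interface.

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Sketch of an in-memory chat memory store keyed by memory id
// (hypothetical shape; not the library's ChatMemoryStore interface).
public class InMemoryStoreSketch {

    private final Map<Object, List<String>> store = new ConcurrentHashMap<>();

    public List<String> getMessages(Object memoryId) {
        return store.getOrDefault(memoryId, List.of());
    }

    public void updateMessages(Object memoryId, List<String> messages) {
        store.put(memoryId, List.copyOf(messages));
    }

    public void deleteMessages(Object memoryId) {
        store.remove(memoryId);
    }
}
```

Keying by memory id is what lets one store hold separate conversations for separate users or sessions.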