Package dev.langchain4j.model.chat
Interface TokenCountEstimator
- All Known Implementing Classes:
  AzureOpenAiChatModel, AzureOpenAiStreamingChatModel, GoogleAiGeminiChatModel, OpenAiChatModel, OpenAiStreamingChatModel
public interface TokenCountEstimator
An interface for estimating the count of tokens in various kinds of input: plain text, a message, a prompt, a text segment, and so on.
This is useful when you need to know in advance the cost of processing a given text with the LLM.
-
Method Summary
Modifier and Type    Method and Description
default int          estimateTokenCount(UserMessage userMessage)
                     Estimates the count of tokens in the specified message.
default int          estimateTokenCount(TextSegment textSegment)
                     Estimates the count of tokens in the specified text segment.
default int          estimateTokenCount(Prompt prompt)
                     Estimates the count of tokens in the specified prompt.
default int          estimateTokenCount(String text)
                     Estimates the count of tokens in the specified text.
int                  estimateTokenCount(List<ChatMessage> messages)
                     Estimates the count of tokens in the specified list of messages.
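The summary above shows one abstract method (the List<ChatMessage> overload) plus convenience default overloads. A minimal, self-contained sketch of that shape follows; the simplified Message record stands in for the library's ChatMessage/UserMessage types, and the whitespace tokenizer and delegation pattern are illustrative assumptions, not the real langchain4j implementation:

```java
import java.util.List;

// Self-contained sketch mirroring the shape of TokenCountEstimator.
public class TokenCountEstimatorSketch {

    // Stand-in for the library's ChatMessage/UserMessage/Prompt/TextSegment types.
    record Message(String text) {}

    interface Estimator {
        // The single abstract method: count tokens across a whole list of messages.
        int estimateTokenCount(List<Message> messages);

        // Convenience overloads as default methods that wrap their argument
        // and delegate onward (assumed delegation pattern).
        default int estimateTokenCount(String text) {
            return estimateTokenCount(new Message(text));
        }

        default int estimateTokenCount(Message message) {
            return estimateTokenCount(List.of(message));
        }
    }

    public static void main(String[] args) {
        // Naive estimator: whitespace-separated words count as "tokens".
        Estimator naive = messages -> messages.stream()
                .mapToInt(m -> m.text().trim().split("\\s+").length)
                .sum();

        System.out.println(naive.estimateTokenCount("hello world"));  // 2
        System.out.println(naive.estimateTokenCount(List.of(
                new Message("one two three"), new Message("four")))); // 4
    }
}
```

Because only the List overload is abstract, an implementation (or a lambda, as here) needs to supply just that one method and gets the other overloads for free.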
-
Method Details
-
estimateTokenCount
default int estimateTokenCount(String text)
Estimates the count of tokens in the specified text.
Parameters:
text - the text
Returns:
the estimated count of tokens
-
estimateTokenCount
default int estimateTokenCount(UserMessage userMessage)
Estimates the count of tokens in the specified message.
Parameters:
userMessage - the message
Returns:
the estimated count of tokens
-
estimateTokenCount
default int estimateTokenCount(Prompt prompt)
Estimates the count of tokens in the specified prompt.
Parameters:
prompt - the prompt
Returns:
the estimated count of tokens
-
estimateTokenCount
default int estimateTokenCount(TextSegment textSegment)
Estimates the count of tokens in the specified text segment.
Parameters:
textSegment - the text segment
Returns:
the estimated count of tokens
-
estimateTokenCount
int estimateTokenCount(List<ChatMessage> messages)
Estimates the count of tokens in the specified list of messages.
Parameters:
messages - the list of messages
Returns:
the estimated count of tokens
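A common use of such an estimate is a pre-flight check against a model's context window before sending a request. In the sketch below, the 4096-token limit, the chars/4 heuristic, and the fits helper are all illustrative assumptions, not part of the langchain4j API:

```java
import java.util.function.ToIntFunction;

// Hypothetical pre-flight check built on a token-count estimate.
public class ContextWindowCheck {

    // Assumed model limit; real limits vary by model.
    static final int CONTEXT_WINDOW = 4096;

    // Returns true if the prompt, plus tokens reserved for the answer,
    // is estimated to fit within the context window.
    static boolean fits(ToIntFunction<String> estimator, String prompt, int reservedForAnswer) {
        return estimator.applyAsInt(prompt) + reservedForAnswer <= CONTEXT_WINDOW;
    }

    public static void main(String[] args) {
        // Rough heuristic: about 4 characters per token for English text.
        ToIntFunction<String> estimator = text -> Math.max(1, text.length() / 4);

        System.out.println(fits(estimator, "Summarize this paragraph ...", 512)); // true
    }
}
```

With a real model, the ToIntFunction would be replaced by a call to one of the estimateTokenCount overloads on an implementing class such as OpenAiChatModel.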
-