Uses of Interface dev.langchain4j.model.TokenCountEstimator

Packages that use TokenCountEstimator:

dev.langchain4j.data.document.splitter
dev.langchain4j.memory.chat
dev.langchain4j.model.azure
dev.langchain4j.model.googleai
dev.langchain4j.model.openai
dev.langchain4j.model.openaiofficial
Uses of TokenCountEstimator in dev.langchain4j.data.document.splitter

Fields in dev.langchain4j.data.document.splitter declared as TokenCountEstimator:

protected final TokenCountEstimator HierarchicalDocumentSplitter.tokenCountEstimator

Methods in dev.langchain4j.data.document.splitter with parameters of type TokenCountEstimator:

static DocumentSplitter DocumentSplitters.recursive(int maxSegmentSizeInTokens, int maxOverlapSizeInTokens, TokenCountEstimator tokenCountEstimator)
    This is a recommended DocumentSplitter for generic text.

Constructors in dev.langchain4j.data.document.splitter with parameters of type TokenCountEstimator:

DocumentByCharacterSplitter(int maxSegmentSizeInTokens, int maxOverlapSizeInTokens, TokenCountEstimator tokenCountEstimator)
DocumentByCharacterSplitter(int maxSegmentSizeInTokens, int maxOverlapSizeInTokens, TokenCountEstimator tokenCountEstimator, DocumentSplitter subSplitter)
DocumentByLineSplitter(int maxSegmentSizeInTokens, int maxOverlapSizeInTokens, TokenCountEstimator tokenCountEstimator)
DocumentByLineSplitter(int maxSegmentSizeInTokens, int maxOverlapSizeInTokens, TokenCountEstimator tokenCountEstimator, DocumentSplitter subSplitter)
DocumentByParagraphSplitter(int maxSegmentSizeInTokens, int maxOverlapSizeInTokens, TokenCountEstimator tokenCountEstimator)
DocumentByParagraphSplitter(int maxSegmentSizeInTokens, int maxOverlapSizeInTokens, TokenCountEstimator tokenCountEstimator, DocumentSplitter subSplitter)
DocumentByRegexSplitter(String regex, String joinDelimiter, int maxSegmentSizeInTokens, int maxOverlapSizeInTokens, TokenCountEstimator tokenCountEstimator)
DocumentByRegexSplitter(String regex, String joinDelimiter, int maxSegmentSizeInTokens, int maxOverlapSizeInTokens, TokenCountEstimator tokenCountEstimator, DocumentSplitter subSplitter)
DocumentBySentenceSplitter(int maxSegmentSizeInTokens, int maxOverlapSizeInTokens, TokenCountEstimator tokenCountEstimator)
DocumentBySentenceSplitter(int maxSegmentSizeInTokens, int maxOverlapSizeInTokens, TokenCountEstimator tokenCountEstimator, DocumentSplitter subSplitter)
DocumentBySentenceSplitter(int maxSegmentSizeInTokens, int maxOverlapSizeInTokens, TokenCountEstimator tokenCountEstimator, DocumentSplitter subSplitter, opennlp.tools.sentdetect.SentenceModel sentenceModel)
DocumentByWordSplitter(int maxSegmentSizeInTokens, int maxOverlapSizeInTokens, TokenCountEstimator tokenCountEstimator)
DocumentByWordSplitter(int maxSegmentSizeInTokens, int maxOverlapSizeInTokens, TokenCountEstimator tokenCountEstimator, DocumentSplitter subSplitter)
protected HierarchicalDocumentSplitter(int maxSegmentSizeInTokens, int maxOverlapSizeInTokens, TokenCountEstimator tokenCountEstimator)
    Creates a new instance of HierarchicalDocumentSplitter.
protected HierarchicalDocumentSplitter(int maxSegmentSizeInTokens, int maxOverlapSizeInTokens, TokenCountEstimator tokenCountEstimator, DocumentSplitter subSplitter)
    Creates a new instance of HierarchicalDocumentSplitter.
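The DocumentSplitters.recursive factory above is the recommended entry point. The following is a minimal usage sketch, not taken from this page: it assumes an existing TokenCountEstimator instance (for example, one of the implementing classes listed later on this page) and uses the usual langchain4j import paths.

    import dev.langchain4j.data.document.Document;
    import dev.langchain4j.data.document.DocumentSplitter;
    import dev.langchain4j.data.document.splitter.DocumentSplitters;
    import dev.langchain4j.data.segment.TextSegment;
    import dev.langchain4j.model.TokenCountEstimator;

    import java.util.List;

    class RecursiveSplitterSketch {

        // Splits a document into segments of at most 300 tokens with a 30-token
        // overlap, where token counts are measured by the supplied estimator.
        static List<TextSegment> split(Document document, TokenCountEstimator estimator) {
            DocumentSplitter splitter = DocumentSplitters.recursive(300, 30, estimator);
            return splitter.split(document);
        }
    }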
Uses of TokenCountEstimator in dev.langchain4j.memory.chat

Methods in dev.langchain4j.memory.chat with parameters of type TokenCountEstimator:

TokenWindowChatMemory.Builder.maxTokens(Integer maxTokens, TokenCountEstimator tokenCountEstimator)
static TokenWindowChatMemory TokenWindowChatMemory.withMaxTokens(int maxTokens, TokenCountEstimator tokenCountEstimator)
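A brief sketch of the withMaxTokens factory listed above. The estimator parameter can be any TokenCountEstimator implementation; the ChatMemory and UserMessage types are the usual langchain4j ones and are assumptions not shown on this page.

    import dev.langchain4j.data.message.UserMessage;
    import dev.langchain4j.memory.ChatMemory;
    import dev.langchain4j.memory.chat.TokenWindowChatMemory;
    import dev.langchain4j.model.TokenCountEstimator;

    class TokenWindowMemorySketch {

        // Keeps the most recent messages whose combined size, as measured by the
        // estimator, stays within 1000 tokens; older messages are evicted.
        static ChatMemory create(TokenCountEstimator estimator) {
            ChatMemory memory = TokenWindowChatMemory.withMaxTokens(1000, estimator);
            memory.add(UserMessage.from("Hello!"));
            return memory;
        }
    }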
Uses of TokenCountEstimator in dev.langchain4j.model.azure

Classes in dev.langchain4j.model.azure that implement TokenCountEstimator:

A class in this package can be used to estimate the cost (in tokens) before calling OpenAI or when using streaming.

Methods in dev.langchain4j.model.azure with parameters of type TokenCountEstimator:

AzureOpenAiStreamingChatModel.Builder.tokenCountEstimator(TokenCountEstimator tokenCountEstimator)
AzureOpenAiStreamingLanguageModel.Builder.tokenCountEstimator(TokenCountEstimator tokenCountEstimator)

Constructors in dev.langchain4j.model.azure with parameters of type TokenCountEstimator:

AzureOpenAiStreamingChatModel(com.azure.ai.openai.OpenAIClient client, com.azure.ai.openai.OpenAIAsyncClient asyncClient, String deploymentName, TokenCountEstimator tokenCountEstimator, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, List<String> stop, Double presencePenalty, Double frequencyPenalty, List<com.azure.ai.openai.models.AzureChatExtensionConfiguration> dataSources, com.azure.ai.openai.models.AzureChatEnhancementConfiguration enhancements, Long seed, com.azure.ai.openai.models.ChatCompletionsResponseFormat chatCompletionsResponseFormat, ResponseFormat responseFormat, Boolean strictJsonSchema, List<ChatModelListener> listeners, Set<Capability> capabilities)

AzureOpenAiStreamingChatModel(String endpoint, String serviceVersion, com.azure.core.credential.KeyCredential keyCredential, com.azure.core.http.HttpClientProvider httpClientProvider, String deploymentName, TokenCountEstimator tokenCountEstimator, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, List<String> stop, Double presencePenalty, Double frequencyPenalty, List<com.azure.ai.openai.models.AzureChatExtensionConfiguration> dataSources, com.azure.ai.openai.models.AzureChatEnhancementConfiguration enhancements, Long seed, com.azure.ai.openai.models.ChatCompletionsResponseFormat chatCompletionsResponseFormat, ResponseFormat responseFormat, Boolean strictJsonSchema, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, boolean useAsyncClient, List<ChatModelListener> listeners, String userAgentSuffix, Map<String, String> customHeaders, Set<Capability> capabilities)

AzureOpenAiStreamingChatModel(String endpoint, String serviceVersion, com.azure.core.credential.TokenCredential tokenCredential, com.azure.core.http.HttpClientProvider httpClientProvider, String deploymentName, TokenCountEstimator tokenCountEstimator, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, List<String> stop, Double presencePenalty, Double frequencyPenalty, List<com.azure.ai.openai.models.AzureChatExtensionConfiguration> dataSources, com.azure.ai.openai.models.AzureChatEnhancementConfiguration enhancements, Long seed, com.azure.ai.openai.models.ChatCompletionsResponseFormat chatCompletionsResponseFormat, ResponseFormat responseFormat, Boolean strictJsonSchema, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, boolean useAsyncClient, List<ChatModelListener> listeners, String userAgentSuffix, Map<String, String> customHeaders, Set<Capability> capabilities)

AzureOpenAiStreamingChatModel(String endpoint, String serviceVersion, String apiKey, com.azure.core.http.HttpClientProvider httpClientProvider, String deploymentName, TokenCountEstimator tokenCountEstimator, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, List<String> stop, Double presencePenalty, Double frequencyPenalty, List<com.azure.ai.openai.models.AzureChatExtensionConfiguration> dataSources, com.azure.ai.openai.models.AzureChatEnhancementConfiguration enhancements, Long seed, com.azure.ai.openai.models.ChatCompletionsResponseFormat chatCompletionsResponseFormat, ResponseFormat responseFormat, Boolean strictJsonSchema, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, boolean useAsyncClient, List<ChatModelListener> listeners, String userAgentSuffix, Map<String, String> customHeaders, Set<Capability> capabilities)

AzureOpenAiStreamingLanguageModel(com.azure.ai.openai.OpenAIClient client, String deploymentName, TokenCountEstimator tokenCountEstimator, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, Integer logprobs, Boolean echo, List<String> stop, Double presencePenalty, Double frequencyPenalty)

AzureOpenAiStreamingLanguageModel(String endpoint, String serviceVersion, com.azure.core.credential.KeyCredential keyCredential, com.azure.core.http.HttpClientProvider httpClientProvider, String deploymentName, TokenCountEstimator tokenCountEstimator, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, Integer logprobs, Boolean echo, List<String> stop, Double presencePenalty, Double frequencyPenalty, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, String userAgentSuffix, Map<String, String> customHeaders)

AzureOpenAiStreamingLanguageModel(String endpoint, String serviceVersion, com.azure.core.credential.TokenCredential tokenCredential, com.azure.core.http.HttpClientProvider httpClientProvider, String deploymentName, TokenCountEstimator tokenCountEstimator, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, Integer logprobs, Boolean echo, List<String> stop, Double presencePenalty, Double frequencyPenalty, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, String userAgentSuffix, Map<String, String> customHeaders)

AzureOpenAiStreamingLanguageModel(String endpoint, String serviceVersion, String apiKey, com.azure.core.http.HttpClientProvider httpClientProvider, String deploymentName, TokenCountEstimator tokenCountEstimator, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, Integer logprobs, Boolean echo, List<String> stop, Double presencePenalty, Double frequencyPenalty, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, String userAgentSuffix, Map<String, String> customHeaders)
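A hedged sketch of wiring a TokenCountEstimator through the streaming chat model builder. Only tokenCountEstimator(...) appears on this page; the builder(), endpoint(), apiKey(), deploymentName() and build() calls are assumptions based on the usual langchain4j Azure builder, and the deployment name is a placeholder.

    import dev.langchain4j.model.TokenCountEstimator;
    import dev.langchain4j.model.azure.AzureOpenAiStreamingChatModel;

    class AzureStreamingModelSketch {

        static AzureOpenAiStreamingChatModel create(TokenCountEstimator estimator) {
            return AzureOpenAiStreamingChatModel.builder()
                    .endpoint(System.getenv("AZURE_OPENAI_ENDPOINT"))   // assumed builder setter
                    .apiKey(System.getenv("AZURE_OPENAI_KEY"))          // assumed builder setter
                    .deploymentName("my-deployment")                    // hypothetical deployment name
                    .tokenCountEstimator(estimator)                     // builder method listed above
                    .build();
        }
    }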
Uses of TokenCountEstimator in dev.langchain4j.model.googleai

Classes in dev.langchain4j.model.googleai implement TokenCountEstimator.
Uses of TokenCountEstimator in dev.langchain4j.model.openai

Classes in dev.langchain4j.model.openai that implement TokenCountEstimator:

A class in this package can be used to estimate the cost (in tokens) before calling OpenAI.
Uses of TokenCountEstimator in dev.langchain4j.model.openaiofficial

Fields in dev.langchain4j.model.openaiofficial declared as TokenCountEstimator:

protected TokenCountEstimator OpenAiOfficialBaseChatModel.tokenCountEstimator

Methods in dev.langchain4j.model.openaiofficial with parameters of type TokenCountEstimator:

OpenAiOfficialChatModel.Builder.tokenCountEstimator(TokenCountEstimator tokenCountEstimator)
OpenAiOfficialStreamingChatModel.Builder.tokenCountEstimator(TokenCountEstimator tokenCountEstimator)
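As with the Azure builders, a TokenCountEstimator can be supplied to the OpenAI-official builders listed above. A minimal sketch; the builder(), apiKey(), modelName() and build() calls are assumptions based on the usual langchain4j builder pattern, and the model name is a placeholder.

    import dev.langchain4j.model.TokenCountEstimator;
    import dev.langchain4j.model.openaiofficial.OpenAiOfficialChatModel;

    class OpenAiOfficialModelSketch {

        static OpenAiOfficialChatModel create(TokenCountEstimator estimator) {
            return OpenAiOfficialChatModel.builder()
                    .apiKey(System.getenv("OPENAI_API_KEY"))   // assumed builder setter
                    .modelName("gpt-4o-mini")                  // hypothetical model name
                    .tokenCountEstimator(estimator)            // builder method listed above
                    .build();
        }
    }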