Uses of Interface
dev.langchain4j.model.Tokenizer
Packages that use Tokenizer
  dev.langchain4j.data.document.splitter
  dev.langchain4j.memory.chat
  dev.langchain4j.model.azure
  dev.langchain4j.model.googleai
  dev.langchain4j.model.openai
  dev.langchain4j.model.openaiofficial

Uses of Tokenizer in dev.langchain4j.data.document.splitter
Fields in dev.langchain4j.data.document.splitter declared as Tokenizer
  protected final Tokenizer  HierarchicalDocumentSplitter.tokenizer
Methods in dev.langchain4j.data.document.splitter with parameters of type Tokenizer
  static DocumentSplitter  DocumentSplitters.recursive(int maxSegmentSizeInTokens, int maxOverlapSizeInTokens, Tokenizer tokenizer)
      This is a recommended DocumentSplitter for generic text (usage sketch below).

Constructors in dev.langchain4j.data.document.splitter with parameters of type Tokenizer
  DocumentByCharacterSplitter(int maxSegmentSizeInTokens, int maxOverlapSizeInTokens, Tokenizer tokenizer)
  DocumentByCharacterSplitter(int maxSegmentSizeInTokens, int maxOverlapSizeInTokens, Tokenizer tokenizer, DocumentSplitter subSplitter)
  DocumentByLineSplitter(int maxSegmentSizeInTokens, int maxOverlapSizeInTokens, Tokenizer tokenizer)
  DocumentByLineSplitter(int maxSegmentSizeInTokens, int maxOverlapSizeInTokens, Tokenizer tokenizer, DocumentSplitter subSplitter)
  DocumentByParagraphSplitter(int maxSegmentSizeInTokens, int maxOverlapSizeInTokens, Tokenizer tokenizer)
  DocumentByParagraphSplitter(int maxSegmentSizeInTokens, int maxOverlapSizeInTokens, Tokenizer tokenizer, DocumentSplitter subSplitter)
  DocumentByRegexSplitter(String regex, String joinDelimiter, int maxSegmentSizeInTokens, int maxOverlapSizeInTokens, Tokenizer tokenizer)
  DocumentByRegexSplitter(String regex, String joinDelimiter, int maxSegmentSizeInTokens, int maxOverlapSizeInTokens, Tokenizer tokenizer, DocumentSplitter subSplitter)
  DocumentBySentenceSplitter(int maxSegmentSizeInTokens, int maxOverlapSizeInTokens, Tokenizer tokenizer)
  DocumentBySentenceSplitter(int maxSegmentSizeInTokens, int maxOverlapSizeInTokens, Tokenizer tokenizer, DocumentSplitter subSplitter)
  DocumentByWordSplitter(int maxSegmentSizeInTokens, int maxOverlapSizeInTokens, Tokenizer tokenizer)
  DocumentByWordSplitter(int maxSegmentSizeInTokens, int maxOverlapSizeInTokens, Tokenizer tokenizer, DocumentSplitter subSplitter)
  protected HierarchicalDocumentSplitter(int maxSegmentSizeInTokens, int maxOverlapSizeInTokens, Tokenizer tokenizer)
      Creates a new instance of HierarchicalDocumentSplitter.
  protected HierarchicalDocumentSplitter(int maxSegmentSizeInTokens, int maxOverlapSizeInTokens, Tokenizer tokenizer, DocumentSplitter subSplitter)
      Creates a new instance of HierarchicalDocumentSplitter.
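Usage sketch for DocumentSplitters.recursive above. This is an illustrative assumption, not prescribed by this page: the OpenAiTokenizer model name and the 300/30 token limits are example values, and any Tokenizer implementation could be passed instead.

import dev.langchain4j.data.document.Document;
import dev.langchain4j.data.document.DocumentSplitter;
import dev.langchain4j.data.document.splitter.DocumentSplitters;
import dev.langchain4j.data.segment.TextSegment;
import dev.langchain4j.model.Tokenizer;
import dev.langchain4j.model.openai.OpenAiTokenizer;

import java.util.List;

public class RecursiveSplitterExample {

    public static void main(String[] args) {
        // Any Tokenizer implementation works; OpenAiTokenizer is used here purely as an example.
        Tokenizer tokenizer = new OpenAiTokenizer("gpt-4o-mini"); // model name is illustrative

        // Split into segments of at most 300 tokens, with a 30-token overlap between segments.
        DocumentSplitter splitter = DocumentSplitters.recursive(300, 30, tokenizer);

        Document document = Document.from("Long text to be split into token-bounded segments...");
        List<TextSegment> segments = splitter.split(document);

        segments.forEach(segment -> System.out.println(segment.text()));
    }
}
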
Uses of Tokenizer in dev.langchain4j.memory.chat
Methods in dev.langchain4j.memory.chat with parameters of type Tokenizer
  static TokenWindowChatMemory  TokenWindowChatMemory.withMaxTokens(int maxTokens, Tokenizer tokenizer)
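Usage sketch for TokenWindowChatMemory.withMaxTokens above. The 1000-token budget and the OpenAiTokenizer model name are assumptions chosen for illustration.

import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.memory.ChatMemory;
import dev.langchain4j.memory.chat.TokenWindowChatMemory;
import dev.langchain4j.model.openai.OpenAiTokenizer;

public class TokenWindowChatMemoryExample {

    public static void main(String[] args) {
        // Keeps as many of the most recent messages as fit into a 1000-token window,
        // using the supplied Tokenizer to count tokens (model name is illustrative).
        ChatMemory memory = TokenWindowChatMemory.withMaxTokens(1000, new OpenAiTokenizer("gpt-4o-mini"));

        memory.add(UserMessage.from("What is a Tokenizer used for?"));
        memory.add(AiMessage.from("It estimates how many tokens a piece of text or a message will use."));

        // Oldest messages are evicted automatically once the token budget would be exceeded.
        memory.messages().forEach(System.out::println);
    }
}
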
Uses of Tokenizer in dev.langchain4j.model.azure
Classes in dev.langchain4j.model.azure that implement Tokenizer
  class AzureOpenAiTokenizer
      This class can be used to estimate the cost (in tokens) before calling OpenAI or when using streaming (usage sketch after the constructor list below).

Methods in dev.langchain4j.model.azure with parameters of type Tokenizer

Constructors in dev.langchain4j.model.azure with parameters of type Tokenizer
  AzureOpenAiChatModel(com.azure.ai.openai.OpenAIClient client, String deploymentName, Tokenizer tokenizer, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, List<String> stop, Double presencePenalty, Double frequencyPenalty, List<com.azure.ai.openai.models.AzureChatExtensionConfiguration> dataSources, com.azure.ai.openai.models.AzureChatEnhancementConfiguration enhancements, Long seed, com.azure.ai.openai.models.ChatCompletionsResponseFormat chatCompletionsResponseFormat, ResponseFormat responseFormat, Boolean strictJsonSchema, List<ChatModelListener> listeners, Set<Capability> capabilities)
  AzureOpenAiChatModel(String endpoint, String serviceVersion, com.azure.core.credential.KeyCredential keyCredential, String deploymentName, Tokenizer tokenizer, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, List<String> stop, Double presencePenalty, Double frequencyPenalty, List<com.azure.ai.openai.models.AzureChatExtensionConfiguration> dataSources, com.azure.ai.openai.models.AzureChatEnhancementConfiguration enhancements, Long seed, com.azure.ai.openai.models.ChatCompletionsResponseFormat chatCompletionsResponseFormat, ResponseFormat responseFormat, Boolean strictJsonSchema, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, List<ChatModelListener> listeners, String userAgentSuffix, Map<String, String> customHeaders, Set<Capability> capabilities)
  AzureOpenAiChatModel(String endpoint, String serviceVersion, com.azure.core.credential.TokenCredential tokenCredential, String deploymentName, Tokenizer tokenizer, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, List<String> stop, Double presencePenalty, Double frequencyPenalty, List<com.azure.ai.openai.models.AzureChatExtensionConfiguration> dataSources, com.azure.ai.openai.models.AzureChatEnhancementConfiguration enhancements, Long seed, com.azure.ai.openai.models.ChatCompletionsResponseFormat chatCompletionsResponseFormat, ResponseFormat responseFormat, Boolean strictJsonSchema, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, List<ChatModelListener> listeners, String userAgentSuffix, Map<String, String> customHeaders, Set<Capability> capabilities)
  AzureOpenAiChatModel(String endpoint, String serviceVersion, String apiKey, String deploymentName, Tokenizer tokenizer, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, List<String> stop, Double presencePenalty, Double frequencyPenalty, List<com.azure.ai.openai.models.AzureChatExtensionConfiguration> dataSources, com.azure.ai.openai.models.AzureChatEnhancementConfiguration enhancements, Long seed, com.azure.ai.openai.models.ChatCompletionsResponseFormat chatCompletionsResponseFormat, ResponseFormat responseFormat, Boolean strictJsonSchema, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, List<ChatModelListener> listeners, String userAgentSuffix, Map<String, String> customHeaders, Set<Capability> capabilities)
  AzureOpenAiEmbeddingModel(String endpoint, String serviceVersion, com.azure.core.credential.KeyCredential keyCredential, String deploymentName, Tokenizer tokenizer, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, String userAgentSuffix, Integer dimensions, Map<String, String> customHeaders)
  AzureOpenAiEmbeddingModel(String endpoint, String serviceVersion, com.azure.core.credential.TokenCredential tokenCredential, String deploymentName, Tokenizer tokenizer, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, String userAgentSuffix, Integer dimensions, Map<String, String> customHeaders)
  AzureOpenAiEmbeddingModel(String endpoint, String serviceVersion, String apiKey, String deploymentName, Tokenizer tokenizer, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, String userAgentSuffix, Integer dimensions, Map<String, String> customHeaders)
  AzureOpenAiLanguageModel(com.azure.ai.openai.OpenAIClient client, String deploymentName, Tokenizer tokenizer, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, Integer logprobs, Boolean echo, List<String> stop, Double presencePenalty, Double frequencyPenalty, Integer bestOf)
  AzureOpenAiLanguageModel(String endpoint, String serviceVersion, com.azure.core.credential.KeyCredential keyCredential, String deploymentName, Tokenizer tokenizer, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, Integer logprobs, Boolean echo, List<String> stop, Double presencePenalty, Double frequencyPenalty, Integer bestOf, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, String userAgentSuffix, Map<String, String> customHeaders)
  AzureOpenAiLanguageModel(String endpoint, String serviceVersion, com.azure.core.credential.TokenCredential tokenCredential, String deploymentName, Tokenizer tokenizer, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, Integer logprobs, Boolean echo, List<String> stop, Double presencePenalty, Double frequencyPenalty, Integer bestOf, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, String userAgentSuffix, Map<String, String> customHeaders)
  AzureOpenAiLanguageModel(String endpoint, String serviceVersion, String apiKey, String deploymentName, Tokenizer tokenizer, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, Integer logprobs, Boolean echo, List<String> stop, Double presencePenalty, Double frequencyPenalty, Integer bestOf, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, String userAgentSuffix, Map<String, String> customHeaders)
  AzureOpenAiStreamingChatModel(com.azure.ai.openai.OpenAIClient client, com.azure.ai.openai.OpenAIAsyncClient asyncClient, String deploymentName, Tokenizer tokenizer, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, List<String> stop, Double presencePenalty, Double frequencyPenalty, List<com.azure.ai.openai.models.AzureChatExtensionConfiguration> dataSources, com.azure.ai.openai.models.AzureChatEnhancementConfiguration enhancements, Long seed, com.azure.ai.openai.models.ChatCompletionsResponseFormat chatCompletionsResponseFormat, ResponseFormat responseFormat, Boolean strictJsonSchema, List<ChatModelListener> listeners, Set<Capability> capabilities)
  AzureOpenAiStreamingChatModel(String endpoint, String serviceVersion, com.azure.core.credential.KeyCredential keyCredential, String deploymentName, Tokenizer tokenizer, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, List<String> stop, Double presencePenalty, Double frequencyPenalty, List<com.azure.ai.openai.models.AzureChatExtensionConfiguration> dataSources, com.azure.ai.openai.models.AzureChatEnhancementConfiguration enhancements, Long seed, com.azure.ai.openai.models.ChatCompletionsResponseFormat chatCompletionsResponseFormat, ResponseFormat responseFormat, Boolean strictJsonSchema, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, boolean useAsyncClient, List<ChatModelListener> listeners, String userAgentSuffix, Map<String, String> customHeaders, Set<Capability> capabilities)
  AzureOpenAiStreamingChatModel(String endpoint, String serviceVersion, com.azure.core.credential.TokenCredential tokenCredential, String deploymentName, Tokenizer tokenizer, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, List<String> stop, Double presencePenalty, Double frequencyPenalty, List<com.azure.ai.openai.models.AzureChatExtensionConfiguration> dataSources, com.azure.ai.openai.models.AzureChatEnhancementConfiguration enhancements, Long seed, com.azure.ai.openai.models.ChatCompletionsResponseFormat chatCompletionsResponseFormat, ResponseFormat responseFormat, Boolean strictJsonSchema, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, boolean useAsyncClient, List<ChatModelListener> listeners, String userAgentSuffix, Map<String, String> customHeaders, Set<Capability> capabilities)
  AzureOpenAiStreamingChatModel(String endpoint, String serviceVersion, String apiKey, String deploymentName, Tokenizer tokenizer, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, List<String> stop, Double presencePenalty, Double frequencyPenalty, List<com.azure.ai.openai.models.AzureChatExtensionConfiguration> dataSources, com.azure.ai.openai.models.AzureChatEnhancementConfiguration enhancements, Long seed, com.azure.ai.openai.models.ChatCompletionsResponseFormat chatCompletionsResponseFormat, ResponseFormat responseFormat, Boolean strictJsonSchema, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, boolean useAsyncClient, List<ChatModelListener> listeners, String userAgentSuffix, Map<String, String> customHeaders, Set<Capability> capabilities)
  AzureOpenAiStreamingLanguageModel(com.azure.ai.openai.OpenAIClient client, String deploymentName, Tokenizer tokenizer, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, Integer logprobs, Boolean echo, List<String> stop, Double presencePenalty, Double frequencyPenalty)
  AzureOpenAiStreamingLanguageModel(String endpoint, String serviceVersion, com.azure.core.credential.KeyCredential keyCredential, String deploymentName, Tokenizer tokenizer, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, Integer logprobs, Boolean echo, List<String> stop, Double presencePenalty, Double frequencyPenalty, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, String userAgentSuffix, Map<String, String> customHeaders)
  AzureOpenAiStreamingLanguageModel(String endpoint, String serviceVersion, com.azure.core.credential.TokenCredential tokenCredential, String deploymentName, Tokenizer tokenizer, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, Integer logprobs, Boolean echo, List<String> stop, Double presencePenalty, Double frequencyPenalty, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, String userAgentSuffix, Map<String, String> customHeaders)
  AzureOpenAiStreamingLanguageModel(String endpoint, String serviceVersion, String apiKey, String deploymentName, Tokenizer tokenizer, Integer maxTokens, Double temperature, Double topP, Map<String, Integer> logitBias, String user, Integer logprobs, Boolean echo, List<String> stop, Double presencePenalty, Double frequencyPenalty, Duration timeout, Integer maxRetries, com.azure.core.http.ProxyOptions proxyOptions, boolean logRequestsAndResponses, String userAgentSuffix, Map<String, String> customHeaders)
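Usage sketch for the azure module's Tokenizer implementation: estimating the token cost of a prompt before sending it. This is a hedged sketch, assuming AzureOpenAiTokenizer offers a constructor that takes the underlying model name; the model name itself is illustrative.

import dev.langchain4j.data.message.ChatMessage;
import dev.langchain4j.data.message.SystemMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.Tokenizer;
import dev.langchain4j.model.azure.AzureOpenAiTokenizer;

import java.util.List;

public class AzureTokenCostEstimate {

    public static void main(String[] args) {
        // Assumption: the constructor accepts the underlying model name; adjust to your deployment.
        Tokenizer tokenizer = new AzureOpenAiTokenizer("gpt-4o-mini");

        List<ChatMessage> messages = List.of(
                SystemMessage.from("You are a concise assistant."),
                UserMessage.from("Summarize the Tokenizer interface in one sentence."));

        // Estimate the prompt size locally before the request, e.g. to check it fits the context window.
        int promptTokens = tokenizer.estimateTokenCountInMessages(messages);
        System.out.println("Estimated prompt tokens: " + promptTokens);
    }
}
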
Uses of Tokenizer in dev.langchain4j.model.googleai
Classes in dev.langchain4j.model.googleai that implement Tokenizer
Uses of Tokenizer in dev.langchain4j.model.openai
Classes in dev.langchain4j.model.openai that implement Tokenizer
  class OpenAiTokenizer
      This class can be used to estimate the cost (in tokens) before calling OpenAI or when using streaming (usage sketch below).

Methods in dev.langchain4j.model.openai with parameters of type Tokenizer
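Usage sketch for the openai module's Tokenizer implementation: counting tokens in plain text locally. The model name is an illustrative assumption; it only selects the encoding used for counting.

import dev.langchain4j.model.Tokenizer;
import dev.langchain4j.model.openai.OpenAiTokenizer;

public class OpenAiTokenEstimate {

    public static void main(String[] args) {
        // Model name is illustrative; pick the one matching the model you intend to call.
        Tokenizer tokenizer = new OpenAiTokenizer("gpt-4o-mini");

        String prompt = "Explain what a token is in the context of large language models.";
        int tokens = tokenizer.estimateTokenCountInText(prompt);

        System.out.println("Estimated tokens for the prompt: " + tokens);
    }
}
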
Uses of Tokenizer in dev.langchain4j.model.openaiofficial
Fields in dev.langchain4j.model.openaiofficial declared as Tokenizer
  protected Tokenizer  OpenAiOfficialBaseChatModel.tokenizer
Methods in dev.langchain4j.model.openaiofficial with parameters of type Tokenizer