Deprecated API
Contents
- Terminally Deprecated Elements
- Deprecated Interfaces
- Deprecated Classes
- Deprecated Methods
- Deprecated Constructors
- Deprecated Enum Constants
Terminally Deprecated Elements

- Please use another constructor with a new ContentRetriever instead.
- Use this instead:
  .retrievalAugmentor(DefaultRetrievalAugmentor.builder()
      .contentInjector(DefaultContentInjector.builder()
          .promptTemplate(promptTemplate)
          .build())
      .build());
- As of 0.31.0, use Document.metadata() and then Metadata.getString(String), Metadata.getInteger(String), Metadata.getLong(String), Metadata.getFloat(String), or Metadata.getDouble(String) instead.
- As of 0.31.0, use Metadata.put(String, String), Metadata.put(String, int), Metadata.put(String, long), Metadata.put(String, float), or Metadata.put(String, double) instead.
- As of 0.31.0, use Metadata.toMap() instead.
- Use Metadata.from(String, String) instead.
- As of 0.31.0, use Metadata.getString(String), Metadata.getInteger(String), Metadata.getLong(String), Metadata.getFloat(String), or Metadata.getDouble(String) instead.
- Use Metadata.metadata(String, String) instead.
- Use the constructor with suppliers for Tika components if you intend to use this parser for multiple files, and specify whether to include metadata.
- Use the constructor with suppliers for Tika components if you intend to use this parser for multiple files.
- Use UserMessage.singleText() or UserMessage.contents() instead.
- As of 0.31.0, use TextSegment.metadata() and then Metadata.getString(String), Metadata.getInteger(String), Metadata.getLong(String), Metadata.getFloat(String), or Metadata.getDouble(String) instead.
- Use Utils.isNullOrEmpty(Collection) instead.
- For JSON output, you can replace `.responseFormat(new ChatCompletionsJsonResponseFormat())` with a `JsonSchema` in the `ResponseFormat`. You can then use `.strictJsonSchema(true)` to force JSON schema adherence.
- Please use AzureOpenAiStreamingChatModel.Builder.openAIAsyncClient(OpenAIAsyncClient) instead if you require response streaming, or AzureOpenAiChatModel if you require sync responses.
- If you want to continue using the sync client, use AzureOpenAiChatModel instead.
- Please use BedrockChatModel instead.
- Will be removed in the next release; this functionality will no longer be supported. Please reach out (via GitHub issues) if you use it.
- Please use ChatModelErrorContext(Throwable, ChatRequest, ModelProvider, Map) instead.
- Partial responses will not be available in future versions, to simplify the design and implementation. Please reach out if you have any concerns.
- Please use ChatModelErrorContext.chatRequest() instead.
- Deprecated in favour of ChatRequest.
- Please use ChatModelRequestContext(ChatRequest, ModelProvider, Map) instead.
- Please use ChatModelRequestContext.chatRequest() instead.
- Deprecated in favour of ChatResponse.
- Please use ChatModelResponseContext(ChatResponse, ChatRequest, ModelProvider, Map) instead.
- Please use ChatModelResponseContext.chatRequest() instead.
- Please use ChatModelResponseContext.chatResponse() instead.
- Deprecated because it has the misleading side effect of deleting all properties added so far. Use JsonObjectSchema.Builder.addProperties(Map) instead.
- Please use builder() instead, and explicitly set the model name and, if necessary, other parameters.
- Please use OpenAiChatModel.defaultRequestParameters() and then ChatRequestParameters.modelName() instead.
- Please use builder() instead, and explicitly set the model name and, if necessary, other parameters. The default values for the model name and temperature will be removed in future releases!
- Please use builder() instead, and explicitly set the model name and, if necessary, other parameters. The default value for the model name will be removed in future releases!
- Use one of the following enums instead: OpenAiChatModelName, OpenAiEmbeddingModelName, OpenAiImageModelName, OpenAiLanguageModelName, OpenAiModerationModelName.
- Please use OpenAiStreamingChatModel.defaultRequestParameters() and then ChatRequestParameters.modelName() instead.
- Please use other constructors and specify the model name explicitly.
- Please use builder() instead, and explicitly set the baseUrl and, if necessary, other parameters. The default value for baseUrl will be removed in future releases!
- Use EmbeddingStoreContentRetriever instead.
- Please use ContentRetriever instead.
- Use AiServices.contentRetriever(ContentRetriever) (e.g. EmbeddingStoreContentRetriever) instead. Configures a retriever that will be invoked on every method call to fetch information relevant to the current user message from an underlying source (e.g., an embedding store); this information is automatically injected into the message sent to the LLM.
- dev.langchain4j.store.embedding.elasticsearch.ElasticsearchEmbeddingStore.Builder.dimension(Integer): dimension is no longer used.
- As of 0.31.0, use EmbeddingStore.search(EmbeddingSearchRequest) instead.
Deprecated Interfaces
Deprecated Classes

- Please use BedrockChatModel instead.
- Will be removed in the next release; this functionality will no longer be supported. Please reach out (via GitHub issues) if you use it.
- Deprecated in favour of ChatRequest.
- Deprecated in favour of ChatResponse.
- Use one of the following enums instead: OpenAiChatModelName, OpenAiEmbeddingModelName, OpenAiImageModelName, OpenAiLanguageModelName, OpenAiModerationModelName.
- Use EmbeddingStoreContentRetriever instead.
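The model-name enums pointed to above can be passed to the model builder. A sketch, assuming the standard builder API; the enum constant and package names shown are illustrative assumptions, not taken from this page:

```java
import dev.langchain4j.model.openai.OpenAiChatModel;
import dev.langchain4j.model.openai.OpenAiChatModelName;

public class OpenAiModelNameSketch {
    public static void main(String[] args) {
        // Sketch: set the model name via the OpenAiChatModelName enum instead
        // of a deprecated string constant. GPT_4_O_MINI is an assumed constant
        // used only for illustration.
        OpenAiChatModel model = OpenAiChatModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY"))
                .modelName(OpenAiChatModelName.GPT_4_O_MINI)
                .build();
    }
}
```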
Deprecated Methods

- Use this instead:
  .retrievalAugmentor(DefaultRetrievalAugmentor.builder()
      .contentInjector(DefaultContentInjector.builder()
          .promptTemplate(promptTemplate)
          .build())
      .build());
- As of 0.31.0, use Document.metadata() and then Metadata.getString(String), Metadata.getInteger(String), Metadata.getLong(String), Metadata.getFloat(String), or Metadata.getDouble(String) instead.
- As of 0.31.0, use Metadata.put(String, String), Metadata.put(String, int), Metadata.put(String, long), Metadata.put(String, float), or Metadata.put(String, double) instead.
- As of 0.31.0, use Metadata.toMap() instead.
- Use Metadata.from(String, String) instead.
- As of 0.31.0, use Metadata.getString(String), Metadata.getInteger(String), Metadata.getLong(String), Metadata.getFloat(String), or Metadata.getDouble(String) instead.
- Use Metadata.metadata(String, String) instead.
- Use UserMessage.singleText() or UserMessage.contents() instead.
- As of 0.31.0, use TextSegment.metadata() and then Metadata.getString(String), Metadata.getInteger(String), Metadata.getLong(String), Metadata.getFloat(String), or Metadata.getDouble(String) instead.
- Use Utils.isNullOrEmpty(Collection) instead.
- For JSON output, you can replace `.responseFormat(new ChatCompletionsJsonResponseFormat())` with a `JsonSchema` in the `ResponseFormat`. You can then use `.strictJsonSchema(true)` to force JSON schema adherence.
- Please use AzureOpenAiStreamingChatModel.Builder.openAIAsyncClient(OpenAIAsyncClient) instead if you require response streaming, or AzureOpenAiChatModel if you require sync responses.
- If you want to continue using the sync client, use AzureOpenAiChatModel instead.
- Partial responses will not be available in future versions, to simplify the design and implementation. Please reach out if you have any concerns.
- Please use ChatModelErrorContext.chatRequest() instead.
- Please use ChatModelRequestContext.chatRequest() instead.
- Please use ChatModelResponseContext.chatRequest() instead.
- Please use ChatModelResponseContext.chatResponse() instead.
- Deprecated because it has the misleading side effect of deleting all properties added so far. Use JsonObjectSchema.Builder.addProperties(Map) instead.
- Please use builder() instead, and explicitly set the model name and, if necessary, other parameters.
- Please use OllamaChatModel.OllamaChatModelBuilder.responseFormat(ResponseFormat) instead, for example responseFormat(ResponseFormat.JSON). Instead of using JSON mode, consider using structured outputs with a JSON schema; see more info here.
- Please use OllamaLanguageModel.OllamaLanguageModelBuilder.responseFormat(ResponseFormat) instead, for example responseFormat(ResponseFormat.JSON). Instead of using JSON mode, consider using structured outputs with a JSON schema; see more info here.
- dev.langchain4j.model.ollama.OllamaStreamingChatModel.OllamaStreamingChatModelBuilder.format(String): Please use OllamaStreamingChatModel.OllamaStreamingChatModelBuilder.responseFormat(ResponseFormat) instead, for example responseFormat(ResponseFormat.JSON). Instead of using JSON mode, consider using structured outputs with a JSON schema; see more info here.
- Please use OllamaStreamingLanguageModel.OllamaStreamingLanguageModelBuilder.responseFormat(ResponseFormat) instead, for example responseFormat(ResponseFormat.JSON). Instead of using JSON mode, consider using structured outputs with a JSON schema; see more info here.
- Functions are deprecated by OpenAI; use InternalOpenAiHelper.toTools(Collection, boolean) instead.
- Please use OpenAiChatModel.defaultRequestParameters() and then ChatRequestParameters.modelName() instead.
- Please use builder() instead, and explicitly set the model name and, if necessary, other parameters. The default values for the model name and temperature will be removed in future releases!
- Please use builder() instead, and explicitly set the model name and, if necessary, other parameters. The default value for the model name will be removed in future releases!
- Please use OpenAiStreamingChatModel.defaultRequestParameters() and then ChatRequestParameters.modelName() instead.
- Please use builder() instead, and explicitly set the baseUrl and, if necessary, other parameters. The default value for baseUrl will be removed in future releases!
- Use/implement ContentInjector.inject(List, ChatMessage) instead.
- dev.langchain4j.rag.content.injector.DefaultContentInjector.createPrompt(UserMessage, List<Content>): implement/override DefaultContentInjector.createPrompt(ChatMessage, List) instead.
- Use DefaultContentInjector.inject(List, ChatMessage) instead.
- Use/implement RetrievalAugmentor.augment(AugmentationRequest) instead.
- Use AiServices.contentRetriever(ContentRetriever) (e.g. EmbeddingStoreContentRetriever) instead. Configures a retriever that will be invoked on every method call to fetch information relevant to the current user message from an underlying source (e.g., an embedding store); this information is automatically injected into the message sent to the LLM.
- dev.langchain4j.store.embedding.elasticsearch.ElasticsearchEmbeddingStore.Builder.dimension(Integer): dimension is no longer used.
- As of 0.31.0, use EmbeddingStore.search(EmbeddingSearchRequest) instead.
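The EmbeddingStore.search(EmbeddingSearchRequest) migration recommended above might look like the following sketch. The builder parameter names are assumptions based on the deprecation note and common langchain4j usage, not verified against a specific release:

```java
import dev.langchain4j.data.embedding.Embedding;
import dev.langchain4j.data.segment.TextSegment;
import dev.langchain4j.store.embedding.EmbeddingSearchRequest;
import dev.langchain4j.store.embedding.EmbeddingSearchResult;
import dev.langchain4j.store.embedding.EmbeddingStore;

public class EmbeddingSearchSketch {
    // Sketch: replaces the deprecated findRelevant(...)-style calls with an
    // explicit request object carrying the query embedding and search limits.
    static EmbeddingSearchResult<TextSegment> search(
            EmbeddingStore<TextSegment> embeddingStore, Embedding queryEmbedding) {
        EmbeddingSearchRequest request = EmbeddingSearchRequest.builder()
                .queryEmbedding(queryEmbedding)
                .maxResults(5)  // formerly a positional maxResults argument
                .minScore(0.7)  // formerly a positional minScore argument
                .build();
        return embeddingStore.search(request);
    }
}
```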
Deprecated Constructors

- Please use another constructor with a new ContentRetriever instead.
- Use the constructor with suppliers for Tika components if you intend to use this parser for multiple files, and specify whether to include metadata.
- Use the constructor with suppliers for Tika components if you intend to use this parser for multiple files.
- Please use ChatModelErrorContext(Throwable, ChatRequest, ModelProvider, Map) instead.
- Please use ChatModelRequestContext(ChatRequest, ModelProvider, Map) instead.
- Please use ChatModelResponseContext(ChatResponse, ChatRequest, ModelProvider, Map) instead.
- Please use other constructors and specify the model name explicitly.
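The ChatModel*Context accessors that this page points to could be consumed in a listener along these lines. This is a sketch assuming a ChatModelListener interface with onRequest/onResponse callbacks; the package names are assumptions based on common langchain4j layout:

```java
import dev.langchain4j.model.chat.listener.ChatModelListener;
import dev.langchain4j.model.chat.listener.ChatModelRequestContext;
import dev.langchain4j.model.chat.listener.ChatModelResponseContext;
import dev.langchain4j.model.chat.request.ChatRequest;
import dev.langchain4j.model.chat.response.ChatResponse;

// Sketch: read the request/response through the chatRequest()/chatResponse()
// accessors that the deprecation notes recommend over the older accessors.
public class LoggingListenerSketch implements ChatModelListener {

    @Override
    public void onRequest(ChatModelRequestContext context) {
        ChatRequest request = context.chatRequest(); // replaces the deprecated accessor
        System.out.println("request messages: " + request.messages().size());
    }

    @Override
    public void onResponse(ChatModelResponseContext context) {
        ChatResponse response = context.chatResponse(); // replaces the deprecated accessor
        System.out.println("response: " + response.aiMessage().text());
    }
}
```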
Deprecated Enum Constants