Spring Boot Integration

LangChain4j provides Spring Boot starters that help with creating and configuring language models, embedding models, embedding stores, and other core LangChain4j components through properties.

To use one of the Spring Boot starters, import the corresponding dependency.

The naming convention for the Spring Boot starter dependency is: langchain4j-{integration-name}-spring-boot-starter.

For example, for OpenAI (langchain4j-open-ai), the dependency name would be langchain4j-open-ai-spring-boot-starter:

<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-open-ai-spring-boot-starter</artifactId>
    <version>0.36.2</version>
</dependency>
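
If you build with Gradle instead of Maven, the equivalent declaration (assuming the same coordinates and version) would be:

implementation 'dev.langchain4j:langchain4j-open-ai-spring-boot-starter:0.36.2'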

Then, you can configure model parameters in the application.properties file as follows:

langchain4j.open-ai.chat-model.api-key=${OPENAI_API_KEY}
langchain4j.open-ai.chat-model.model-name=gpt-4o
langchain4j.open-ai.chat-model.log-requests=true
langchain4j.open-ai.chat-model.log-responses=true
...
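
If you prefer YAML, the same configuration can be expressed in application.yml (Spring Boot's relaxed binding maps it the same way):

langchain4j:
  open-ai:
    chat-model:
      api-key: ${OPENAI_API_KEY}
      model-name: gpt-4o
      log-requests: true
      log-responses: true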

In this case, an instance of OpenAiChatModel (an implementation of ChatLanguageModel) will be created automatically, and you can autowire it where needed:

@RestController
public class ChatController {

    private final ChatLanguageModel chatLanguageModel;

    public ChatController(ChatLanguageModel chatLanguageModel) {
        this.chatLanguageModel = chatLanguageModel;
    }

    @GetMapping("/chat")
    public String model(@RequestParam(value = "message", defaultValue = "Hello") String message) {
        return chatLanguageModel.generate(message);
    }
}

If you need an instance of a StreamingChatLanguageModel, use the streaming-chat-model properties instead of the chat-model properties:

langchain4j.open-ai.streaming-chat-model.api-key=${OPENAI_API_KEY}
...
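
A minimal sketch of consuming the auto-configured StreamingChatLanguageModel (the controller and endpoint names are illustrative; tokens are only printed to the console here to keep the example short, see the Flux section below for returning a stream to HTTP clients):

@RestController
public class StreamingChatController {

    private final StreamingChatLanguageModel streamingChatLanguageModel;

    public StreamingChatController(StreamingChatLanguageModel streamingChatLanguageModel) {
        this.streamingChatLanguageModel = streamingChatLanguageModel;
    }

    @GetMapping("/stream")
    public void stream(@RequestParam(value = "message", defaultValue = "Hello") String message) {
        streamingChatLanguageModel.generate(message, new StreamingResponseHandler<AiMessage>() {

            @Override
            public void onNext(String token) {
                System.out.print(token); // each partial token as it arrives
            }

            @Override
            public void onComplete(Response<AiMessage> response) {
                System.out.println(); // the complete response is available here
            }

            @Override
            public void onError(Throwable error) {
                error.printStackTrace();
            }
        });
    }
}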

Spring Boot starter for declarative AI Services

LangChain4j provides a Spring Boot starter for auto-configuring AI Services, RAG, Tools, etc.

Assuming you have already imported one of the integration starters (see above), import langchain4j-spring-boot-starter:

<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-spring-boot-starter</artifactId>
    <version>0.36.2</version>
</dependency>

You can now define an AI Service interface and annotate it with @AiService:

@AiService
interface Assistant {

    @SystemMessage("You are a polite assistant")
    String chat(String userMessage);
}

Think of it as a standard Spring Boot @Service, but with AI capabilities.

When the application starts, the LangChain4j starter will scan the classpath and find all interfaces annotated with @AiService. For each AI Service found, it will create an implementation of the interface using all LangChain4j components available in the application context and will register it as a bean, so you can autowire it where needed:

@RestController
class AssistantController {

    @Autowired
    Assistant assistant;

    @GetMapping("/chat")
    public String chat(String message) {
        return assistant.chat(message);
    }
}

Automatic Component Wiring

The following components will be automatically wired into the AI Service if available in the application context:

  • ChatLanguageModel
  • StreamingChatLanguageModel
  • ChatMemory
  • ChatMemoryProvider
  • ContentRetriever
  • RetrievalAugmentor
  • All methods of any @Component or @Service class that are annotated with @Tool. An example:
@Component
public class BookingTools {

    private final BookingService bookingService;

    public BookingTools(BookingService bookingService) {
        this.bookingService = bookingService;
    }

    @Tool
    public Booking getBookingDetails(String bookingNumber, String customerName, String customerSurname) {
        return bookingService.getBookingDetails(bookingNumber, customerName, customerSurname);
    }

    @Tool
    public void cancelBooking(String bookingNumber, String customerName, String customerSurname) {
        bookingService.cancelBooking(bookingNumber, customerName, customerSurname);
    }
}
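
Other components from the list above, such as ChatMemoryProvider, can be contributed as ordinary Spring beans in the same way and will be picked up automatically. A minimal sketch (the configuration class, bean method name, and window size of 10 messages are illustrative):

@Configuration
public class LangChain4jConfiguration {

    @Bean
    ChatMemoryProvider chatMemoryProvider() {
        // a separate sliding-window chat memory per memory id
        return memoryId -> MessageWindowChatMemory.withMaxMessages(10);
    }
}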
note

If multiple components of the same type are present in the application context, the application will fail to start. In this case, use the explicit wiring mode (explained below).

Explicit Component Wiring

If you have multiple AI Services and want to wire different LangChain4j components into each of them, you can specify which components to use with explicit wiring mode (@AiService(wiringMode = EXPLICIT)).

Let's say we have two ChatLanguageModels configured:

# OpenAI
langchain4j.open-ai.chat-model.api-key=${OPENAI_API_KEY}
langchain4j.open-ai.chat-model.model-name=gpt-4o-mini

# Ollama
langchain4j.ollama.chat-model.base-url=http://localhost:11434
langchain4j.ollama.chat-model.model-name=llama3.1

@AiService(wiringMode = EXPLICIT, chatModel = "openAiChatModel")
interface OpenAiAssistant {

    @SystemMessage("You are a polite assistant")
    String chat(String userMessage);
}

@AiService(wiringMode = EXPLICIT, chatModel = "ollamaChatModel")
interface OllamaAssistant {

    @SystemMessage("You are a polite assistant")
    String chat(String userMessage);
}
note

In this case, you must explicitly specify all components.

More details can be found here.

Flux

When streaming, you can use Flux<String> as the return type of an AI Service method:

@AiService
interface Assistant {

    @SystemMessage("You are a polite assistant")
    Flux<String> chat(String userMessage);
}

For this, please import the langchain4j-reactor module. See more details here.
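
With Maven, the dependency would presumably look like this (assuming the module is versioned together with the other LangChain4j artifacts):

<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-reactor</artifactId>
    <version>0.36.2</version>
</dependency>

Such a Flux can then be returned straight from a web endpoint, for example as a server-sent event stream. A minimal sketch (the controller and endpoint names are illustrative, and a reactive-capable web stack such as Spring WebFlux is assumed):

@RestController
class StreamingAssistantController {

    private final Assistant assistant;

    StreamingAssistantController(Assistant assistant) {
        this.assistant = assistant;
    }

    @GetMapping(value = "/chat-stream", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    Flux<String> chat(@RequestParam(value = "message", defaultValue = "Hello") String message) {
        // tokens are forwarded to the client as the model produces them
        return assistant.chat(message);
    }
}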

Supported versions

LangChain4j Spring Boot integration requires Java 17 and Spring Boot 3.2.

Examples