# Payara Micro Integration
LangChain4j can be seamlessly integrated into a Payara Micro application, leveraging the standard features of Jakarta EE and MicroProfile for dependency injection and configuration.
This guide demonstrates how to create JAX-RS resources that directly instantiate and use LangChain4j models, with the configuration managed by MicroProfile Config.
## Maven Dependencies

First, add the LangChain4j model integration modules you need, together with the Jakarta EE API, to your pom.xml file:
```xml
<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <maven.compiler.release>21</maven.compiler.release>
    <jakartaee-api.version>10.0.0</jakartaee-api.version>
    <payara.version>6.2025.5</payara.version>
    <version.langchain4j>1.1.0</version.langchain4j>
</properties>

<dependencies>
    <dependency>
        <groupId>dev.langchain4j</groupId>
        <artifactId>langchain4j-open-ai</artifactId>
        <version>${version.langchain4j}</version>
    </dependency>
    <dependency>
        <groupId>dev.langchain4j</groupId>
        <artifactId>langchain4j-google-ai-gemini</artifactId>
        <version>${version.langchain4j}-rc1</version>
    </dependency>
    <dependency>
        <groupId>jakarta.platform</groupId>
        <artifactId>jakarta.jakartaee-api</artifactId>
        <version>${jakartaee-api.version}</version>
        <scope>provided</scope>
    </dependency>
</dependencies>
```
## Configuring Multiple Models

You can configure multiple AI models by providing their respective properties in the MicroProfile configuration file, located at `src/main/resources/META-INF/microprofile-config.properties`. Because environment variables are a default MicroProfile Config source, property expressions such as `${OPENAI_API_KEY}` resolve from the environment at runtime:
```properties
openai.api.key=${OPENAI_API_KEY}
openai.chat.model=gpt-4o-mini

google-ai-gemini.chat-model.api-key=${GEMINI_KEY}
google-ai-gemini.chat-model.model-name=gemini-2.0-flash-lite

deepseek.api.key=${DEEPSEEK_API_KEY}
deepseek.chat.model=deepseek-reasoner
```
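The same properties can also be read programmatically. The following is a minimal sketch using the MicroProfile Config API (it assumes a MicroProfile Config implementation is on the classpath, which Payara Micro provides at runtime; the class name is illustrative):

```java
import org.eclipse.microprofile.config.Config;
import org.eclipse.microprofile.config.ConfigProvider;

public class ConfigLookupExample {

    public static void main(String[] args) {
        // Obtain the runtime Config instance, which aggregates
        // microprofile-config.properties, system properties,
        // and environment variables.
        Config config = ConfigProvider.getConfig();

        // Reads openai.chat.model from META-INF/microprofile-config.properties
        String modelName = config.getValue("openai.chat.model", String.class);
        System.out.println("Configured model: " + modelName);
    }
}
```

In the JAX-RS resources below, the same lookup is done declaratively with `@Inject @ConfigProperty` instead of calling the API directly.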
## Implementing the Chat Resources
In this approach, each JAX-RS resource is responsible for its own AI model instance. The pattern is replicated for each provider.
The `RestConfiguration` class extends `jakarta.ws.rs.core.Application` and defines the base path `/api` for all our REST endpoints:
```java
import jakarta.ws.rs.ApplicationPath;
import jakarta.ws.rs.core.Application;

@ApplicationPath("api")
public class RestConfiguration extends Application {
}
```
For each model, we have a JAX-RS resource class which:

- Injects the configuration properties using `@Inject` and `@ConfigProperty`.
- Uses a method annotated with `@PostConstruct` to build the model instance after the properties have been injected.
- Creates a `@GET` endpoint to interact with the model.
@Path("openai")
public class OpenAiChatModelResource {
@Inject
@ConfigProperty(name = "openai.api.key")
private String openAiApiKey;
@Inject
@ConfigProperty(name = "openai.chat.model")
private String modelName;
private OpenAiChatModel chatModel;
@PostConstruct
public void init() {
chatModel = OpenAiChatModel.builder()
.apiKey(openAiApiKey)
.modelName(modelName)
.build();
}
@GET
@Path("chat")
@Produces(MediaType.TEXT_PLAIN)
public String chat(@QueryParam("message") @DefaultValue("Hello") String message) {
return chatModel.generate(message);
}
}
This same pattern is used for `GeminiChatModelResource` and `DeepSeekChatModelResource`. Note that the latter reuses the `OpenAiChatModel` class, changing only the `baseUrl` to point at the DeepSeek API, which demonstrates the library's flexibility.
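That reuse can be sketched as follows. The class shape mirrors the OpenAI resource above; the DeepSeek base URL shown here is an assumption based on DeepSeek's OpenAI-compatible endpoint, so verify it against the DeepSeek API documentation:

```java
import dev.langchain4j.model.openai.OpenAiChatModel;
import jakarta.annotation.PostConstruct;
import jakarta.inject.Inject;
import jakarta.ws.rs.DefaultValue;
import jakarta.ws.rs.GET;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.Produces;
import jakarta.ws.rs.QueryParam;
import jakarta.ws.rs.core.MediaType;
import org.eclipse.microprofile.config.inject.ConfigProperty;

@Path("deepseek")
public class DeepSeekChatModelResource {

    @Inject
    @ConfigProperty(name = "deepseek.api.key")
    private String deepSeekApiKey;

    @Inject
    @ConfigProperty(name = "deepseek.chat.model")
    private String modelName;

    private OpenAiChatModel chatModel;

    @PostConstruct
    public void init() {
        // Reuse the OpenAI client: DeepSeek exposes an OpenAI-compatible API,
        // so only the base URL and credentials differ.
        chatModel = OpenAiChatModel.builder()
                .baseUrl("https://api.deepseek.com/v1") // assumed endpoint; check DeepSeek docs
                .apiKey(deepSeekApiKey)
                .modelName(modelName)
                .build();
    }

    @GET
    @Path("chat")
    @Produces(MediaType.TEXT_PLAIN)
    public String chat(@QueryParam("message") @DefaultValue("Hello") String message) {
        return chatModel.chat(message);
    }
}
```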
## API Documentation
The example project includes a Swagger UI to interactively explore and test the API endpoints. The `index.html` file in the webapp folder configures the Swagger UI to load the OpenAPI specification that Payara Micro automatically generates at the `/openapi` endpoint:
```yaml
openapi: 3.0.0
info:
  title: Deployed Resources
  version: 1.0.0
...
endpoints:
  /:
    - /api/deepseek/chat
    - /api/gemini/chat
    - /api/openai/chat
components: {}
```
## Running the Example Application
The project is configured to run with the Payara Micro Maven plugin.

Prerequisites:

- Java SE 21+
- Maven

### Execution

1. Open a terminal in the project's root directory.
2. Set the required environment variables for the API keys. The application needs these to authenticate with the AI services. You must configure:
   - `OPENAI_API_KEY`
   - `GEMINI_KEY`
   - `DEEPSEEK_API_KEY`
3. Execute the following Maven command:

```shell
mvn clean package payara-micro:start
```
After the server starts, you can test the endpoints in two ways:

1. If you are using IntelliJ IDEA (Ultimate Edition) or another IDE with a similar feature, you can execute requests directly from a `.http` file:
   a. Open the file `test.http` located in `src/test/resources/`.
   b. The IDE will display a small green "play" icon next to each request definition.
   c. Click the icon next to the request you want to run. The response from the API will be displayed directly in the IDE's tool window.
2. Using the AI chat interface: navigate to http://localhost:8080/ in your browser. This will open an interactive chat page where you can explore and test the available endpoints directly from your browser.
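Alternatively, the endpoints can be exercised from any HTTP client. Here is a minimal sketch using the JDK's built-in `java.net.http.HttpClient`; it assumes the server is running locally on port 8080, and the class name is illustrative:

```java
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class ChatEndpointClient {

    // Builds the chat URI, URL-encoding the message query parameter.
    static URI chatUri(String baseUrl, String message) {
        String encoded = URLEncoder.encode(message, StandardCharsets.UTF_8);
        return URI.create(baseUrl + "/chat?message=" + encoded);
    }

    public static void main(String[] args) throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(chatUri("http://localhost:8080/api/openai", "Hello from the client"))
                .GET()
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body()); // the model's plain-text reply
    }
}
```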