GitHub Models
This is the documentation for the GitHub Models integration, which uses the Azure AI Inference API to access GitHub Models.
LangChain4j provides 4 different integrations with OpenAI for using chat models, and this is #4:
- OpenAI uses a custom Java implementation of the OpenAI REST API, which works best with Quarkus (as it uses the Quarkus REST client) and Spring (as it uses Spring's RestClient).
- OpenAI Official SDK uses the official OpenAI Java SDK.
- Azure OpenAI uses the Azure SDK from Microsoft, and works best if you are using the Microsoft Java stack, including advanced Azure authentication mechanisms.
- GitHub Models uses the Azure AI Inference API to access GitHub Models.
If you want to develop a generative AI application, you can use GitHub Models to find and experiment with AI models for free. Once you are ready to bring your application to production, you can switch to a token from a paid Azure account.
GitHub Models Documentation
Maven Dependency
Plain Java
<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-github-models</artifactId>
    <version>1.0.0-beta3</version>
</dependency>
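If you build with Gradle instead, the same artifact coordinates apply; for example, with the Groovy DSL:
implementation 'dev.langchain4j:langchain4j-github-models:1.0.0-beta3'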
GitHub token
To use GitHub Models, you need a GitHub token for authentication.
Tokens are created and managed in GitHub Developer Settings > Personal access tokens.
Once you have a token, you can set it as an environment variable and use it in your code:
export GITHUB_TOKEN="<your-github-token-goes-here>"
Creating a GitHubModelsChatModel with a GitHub token
Plain Java
GitHubModelsChatModel model = GitHubModelsChatModel.builder()
        .gitHubToken(System.getenv("GITHUB_TOKEN"))
        .modelName("gpt-4o-mini")
        .build();
This will create an instance of GitHubModelsChatModel.
Model parameters (e.g. temperature) can be customized by providing values in the GitHubModelsChatModel's builder.
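For example, a minimal sketch of a customized model, assuming the builder exposes temperature, maxTokens and timeout setters (check the builder for the exact list of supported parameters):
GitHubModelsChatModel model = GitHubModelsChatModel.builder()
        .gitHubToken(System.getenv("GITHUB_TOKEN"))
        .modelName("gpt-4o-mini")
        .temperature(0.3)                          // lower temperature for more deterministic output
        .maxTokens(500)                            // cap the length of the generated answer
        .timeout(java.time.Duration.ofSeconds(60)) // fail fast if the API is slow to respond
        .build();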
Spring Boot
Create a GitHubModelsChatModelConfiguration Spring Bean:
package com.example.demo.configuration.github;

import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.github.GitHubModelsChatModel;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;

@Configuration
@Profile("github")
public class GitHubModelsChatModelConfiguration {

    @Value("${GITHUB_TOKEN}")
    private String gitHubToken;

    @Bean
    ChatLanguageModel gitHubModelsChatLanguageModel() {
        return GitHubModelsChatModel.builder()
                .gitHubToken(gitHubToken)
                .modelName("gpt-4o-mini")
                .logRequestsAndResponses(true)
                .build();
    }
}
This configuration will create a GitHubModelsChatModel bean, which can either be used by an AI Service or autowired where needed, for example:
import dev.langchain4j.model.chat.ChatLanguageModel;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
class ChatLanguageModelController {

    private final ChatLanguageModel chatLanguageModel;

    ChatLanguageModelController(ChatLanguageModel chatLanguageModel) {
        this.chatLanguageModel = chatLanguageModel;
    }

    @GetMapping("/model")
    public String model(@RequestParam(value = "message", defaultValue = "Hello") String message) {
        return chatLanguageModel.chat(message);
    }
}
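As an alternative to calling the model directly, the bean can back a high-level AI Service. A minimal sketch (the Assistant interface is an illustrative example, not part of this integration):
import dev.langchain4j.service.AiServices;

interface Assistant {
    String chat(String userMessage);
}

// Build a proxy implementing Assistant on top of the ChatLanguageModel bean
Assistant assistant = AiServices.create(Assistant.class, chatLanguageModel);
String answer = assistant.chat("Tell me a joke about Java");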
Creating a GitHubModelsStreamingChatModel with a GitHub token
Plain Java
GitHubModelsStreamingChatModel model = GitHubModelsStreamingChatModel.builder()
        .gitHubToken(System.getenv("GITHUB_TOKEN"))
        .modelName("gpt-4o-mini")
        .logRequestsAndResponses(true)
        .build();
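The streaming model pushes the response token by token through a callback instead of returning it in one piece. A minimal usage sketch, assuming the standard StreamingChatResponseHandler callback API from langchain4j-core:
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;

model.chat("Tell me a joke about Java", new StreamingChatResponseHandler() {

    @Override
    public void onPartialResponse(String partialResponse) {
        // invoked for each chunk as it is streamed back
        System.out.print(partialResponse);
    }

    @Override
    public void onCompleteResponse(ChatResponse completeResponse) {
        // invoked once with the assembled response when generation finishes
        System.out.println();
    }

    @Override
    public void onError(Throwable error) {
        error.printStackTrace();
    }
});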
Spring Boot
Create a GitHubModelsStreamingChatModelConfiguration Spring Bean:
package com.example.demo.configuration.github;

import dev.langchain4j.model.github.GitHubModelsStreamingChatModel;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;

@Configuration
@Profile("github")
public class GitHubModelsStreamingChatModelConfiguration {

    @Value("${GITHUB_TOKEN}")
    private String gitHubToken;

    @Bean
    GitHubModelsStreamingChatModel gitHubModelsStreamingChatLanguageModel() {
        return GitHubModelsStreamingChatModel.builder()
                .gitHubToken(gitHubToken)
                .modelName("gpt-4o-mini")
                .logRequestsAndResponses(true)
                .build();
    }
}
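The streaming bean can then be injected wherever token-by-token output is needed. Below is a hypothetical usage sketch (the StreamingChatService class is illustrative, not part of the integration) that collects the streamed answer into a CompletableFuture:
package com.example.demo.service;

import java.util.concurrent.CompletableFuture;

import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;
import dev.langchain4j.model.github.GitHubModelsStreamingChatModel;
import org.springframework.stereotype.Service;

@Service
public class StreamingChatService {

    private final GitHubModelsStreamingChatModel streamingChatModel;

    StreamingChatService(GitHubModelsStreamingChatModel streamingChatModel) {
        this.streamingChatModel = streamingChatModel;
    }

    public CompletableFuture<String> chat(String message) {
        CompletableFuture<String> future = new CompletableFuture<>();
        streamingChatModel.chat(message, new StreamingChatResponseHandler() {

            @Override
            public void onPartialResponse(String partialResponse) {
                // stream partial tokens, e.g. forward them to an SSE emitter
                System.out.print(partialResponse);
            }

            @Override
            public void onCompleteResponse(ChatResponse completeResponse) {
                // complete the future with the full generated text
                future.complete(completeResponse.aiMessage().text());
            }

            @Override
            public void onError(Throwable error) {
                future.completeExceptionally(error);
            }
        });
        return future;
    }
}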