Annotation Interface Tool


@Retention(RUNTIME) @Target(METHOD) public @interface Tool
Java methods annotated with @Tool are considered tools (functions) that the language model can call. The tool/function-calling capability of LLMs (see, e.g., the OpenAI function calling documentation) is used under the hood. When used together with AiServices, a low-level ToolSpecification is automatically created from the method signature (method name, parameter names and types, @Tool and @P annotations, etc.) and sent to the LLM. If the LLM decides to call the tool, the arguments are parsed and the method is called automatically. If the return type of the method annotated with @Tool is String, the returned value is sent to the LLM as-is. If the return type is void, the string "Success" is sent to the LLM. In all other cases, the returned value is serialized into a JSON string and sent to the LLM.
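A minimal sketch of this flow, assuming langchain4j's AiServices and an already-configured ChatModel; the Calculator and Assistant names are illustrative, not part of this API:

```java
import dev.langchain4j.agent.tool.P;
import dev.langchain4j.agent.tool.Tool;
import dev.langchain4j.service.AiServices;

// Hypothetical tool class: the method name ("add"), the description,
// and the @P parameter descriptions all feed into the ToolSpecification
// that is sent to the LLM.
class Calculator {

    @Tool("Adds two integers")
    int add(@P("first addend") int a, @P("second addend") int b) {
        return a + b; // int result: serialized to JSON before being sent to the LLM
    }
}

interface Assistant {
    String chat(String userMessage);
}

// Wiring it together (chatModel is assumed to be a configured ChatModel):
// Assistant assistant = AiServices.builder(Assistant.class)
//         .chatModel(chatModel)
//         .tools(new Calculator())
//         .build();
// assistant.chat("What is 2 + 3?"); // the LLM may decide to call add(2, 3)
```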
  • Optional Element Summary

    Optional Elements
    Modifier and Type    Optional Element    Description
    String               metadata            A valid JSON string that contains LLM-provider-specific tool metadata entries.
    String               name                Name of the tool.
    ReturnBehavior       returnBehavior      Return behavior of the tool.
    String[]             value               Description of the tool.
  • Element Details

    • name

      String name
      Name of the tool. If not provided, the method name will be used.
      Returns:
      name of the tool.
      Default:
      ""
    • value

      String[] value
      Description of the tool. It should be clear and descriptive to allow the language model to understand the tool's purpose and its intended use.
      Returns:
      description of the tool.
      Default:
      {""}
    • returnBehavior

      Return behavior of the tool.
      - If ReturnBehavior.TO_LLM is used (the default), the value returned by the tool is sent back to the LLM for further processing.
      - If ReturnBehavior.IMMEDIATE is used, the value returned by the tool is returned immediately to the caller, without letting the LLM process it further. Immediate return is only allowed on AI services returning dev.langchain4j.service.Result; a RuntimeException is thrown when a tool with immediate return is used with an AI service having a different return type.
      Returns:
      return behavior of the tool.
      Default:
      TO_LLM
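A sketch of immediate return, assuming langchain4j's Result type; the OrderTools and OrderAssistant names are illustrative, and the import path for ReturnBehavior is an assumption:

```java
import dev.langchain4j.agent.tool.ReturnBehavior; // import path assumed
import dev.langchain4j.agent.tool.Tool;
import dev.langchain4j.service.Result;

// Hypothetical tool class: with IMMEDIATE, the value returned by the tool
// goes straight back to the caller; the LLM never sees it.
class OrderTools {

    @Tool(value = "Returns the status of an order",
          returnBehavior = ReturnBehavior.IMMEDIATE)
    String orderStatus(String orderId) {
        return "SHIPPED";
    }
}

// Immediate return requires the AI service method to return Result<...>;
// any other return type causes a RuntimeException when the tool is used.
interface OrderAssistant {
    Result<String> chat(String userMessage);
}
```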
    • metadata

      A valid JSON string that contains LLM-provider-specific tool metadata entries. This string is parsed into a ToolSpecification.metadata() map when a @Tool-annotated method is converted into a ToolSpecification.

      NOTE: This metadata is not sent to the LLM provider API by default; you must explicitly specify which metadata keys should be sent when creating a ChatModel.

      NOTE: Currently, tool metadata is supported only by the langchain4j-anthropic module.

      Since:
      1.10.0
      Default:
      "{}"
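A sketch of attaching provider-specific metadata; the SearchTools name and the metadata keys shown are purely illustrative assumptions (which keys, if any, are honored depends on the provider, and per the note above only the langchain4j-anthropic module currently supports tool metadata):

```java
import dev.langchain4j.agent.tool.Tool;

// Hypothetical tool class: the JSON below is parsed into the
// ToolSpecification.metadata() map; it is NOT sent to the provider
// unless the corresponding keys are enabled on the ChatModel.
class SearchTools {

    @Tool(
        value = "Searches the web for a query",
        metadata = "{\"illustrative_key\": \"illustrative_value\"}" // keys are assumptions
    )
    String search(String query) {
        return "results for: " + query;
    }
}
```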