Interface ChatRequestParameters

All Known Implementing Classes:
DefaultChatRequestParameters, OpenAiChatRequestParameters

public interface ChatRequestParameters
Represents common chat request parameters supported by most LLM providers. Specific LLM provider integrations can extend this interface to add provider-specific parameters.
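Instances are typically assembled via the builder (see builder under Method Details). As a minimal sketch, assuming the builder exposes setters that mirror the accessor names listed below (an assumption, not confirmed by this page), a common set of parameters could be created like this:

     // Hypothetical values for illustration only
     ChatRequestParameters parameters = ChatRequestParameters.builder()
             .modelName("my-model")
             .temperature(0.7)
             .maxOutputTokens(500)
             .build();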
Method Details

    • modelName

      String modelName()
    • temperature

      Double temperature()
    • topP

      Double topP()
    • topK

      Integer topK()
    • frequencyPenalty

      Double frequencyPenalty()
    • presencePenalty

      Double presencePenalty()
    • maxOutputTokens

      Integer maxOutputTokens()
    • stopSequences

      List<String> stopSequences()
    • toolSpecifications

      List<ToolSpecification> toolSpecifications()
    • toolChoice

      ToolChoice toolChoice()
    • responseFormat

      ResponseFormat responseFormat()
    • builder

    • overrideWith

      ChatRequestParameters overrideWith(ChatRequestParameters parameters)
      Creates a new ChatRequestParameters by combining the current parameters with the specified ones. Values from the specified parameters override values from the current parameters when there is overlap. Neither the current nor the specified ChatRequestParameters objects are modified.

      Example:

       Current parameters:
         temperature = 1.0
         maxOutputTokens = 100
      
       Specified parameters:
         temperature = 0.5
         modelName = my-model
      
       Result:
         temperature = 0.5        // Overridden from specified
         maxOutputTokens = 100    // Preserved from current
         modelName = my-model     // Added from specified
       
      Parameters:
      parameters - the parameters whose values will override the current ones
      Returns:
      a new ChatRequestParameters instance combining both sets of parameters
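
      The example above can be expressed in code. This is a minimal sketch, again assuming builder setters that mirror the accessor names listed under Method Details:

       ChatRequestParameters current = ChatRequestParameters.builder()
               .temperature(1.0)
               .maxOutputTokens(100)
               .build();

       ChatRequestParameters specified = ChatRequestParameters.builder()
               .temperature(0.5)
               .modelName("my-model")
               .build();

       // Neither current nor specified is modified; a new instance is returned.
       ChatRequestParameters combined = current.overrideWith(specified);
       // combined.temperature()     -> 0.5        (overridden from specified)
       // combined.maxOutputTokens() -> 100        (preserved from current)
       // combined.modelName()       -> "my-model" (added from specified)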