Class Gemini
java.lang.Object
com.google.adk.models.BaseLlm
com.google.adk.models.Gemini
Represents the Gemini Generative AI model.
This class provides methods for interacting with the Gemini model, including standard request-response generation and establishing persistent bidirectional connections.
Nested Class Summary
Nested Classes
static class Gemini.Builder
    Builder for constructing Gemini instances.
Constructor Summary
Constructors
Gemini(String modelName, Client apiClient)
    Constructs a new Gemini instance.
Gemini(String modelName, String apiKey)
    Constructs a new Gemini instance with a Google Gemini API key.
Gemini(String modelName, VertexCredentials vertexCredentials)
    Constructs a new Gemini instance with Vertex AI credentials.
Method Summary
static Gemini.Builder builder()
    Returns a new Builder instance for constructing Gemini objects.
BaseLlmConnection connect(LlmRequest llmRequest)
    Creates a live connection to the LLM.
io.reactivex.rxjava3.core.Flowable<LlmResponse> generateContent(LlmRequest llmRequest, boolean stream)
    Generates one content from the given LLM request and tools.
Constructor Details
Gemini
public Gemini(String modelName, Client apiClient)
Constructs a new Gemini instance.
Parameters:
modelName - The name of the Gemini model to use (e.g., "gemini-2.0-flash").
apiClient - The genai Client instance for making API calls.
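For example, a model backed by an explicitly configured client can be created as below. This is a minimal sketch, assuming the genai Client.builder() API and an API key supplied through the GOOGLE_API_KEY environment variable.

import com.google.adk.models.Gemini;
import com.google.genai.Client;

public class GeminiFromClient {
  public static void main(String[] args) {
    // Build an explicit genai Client; here it authenticates with an API key
    // read from the environment (assumption: GOOGLE_API_KEY is set).
    Client apiClient = Client.builder()
        .apiKey(System.getenv("GOOGLE_API_KEY"))
        .build();

    // Hand the pre-configured client to the Gemini wrapper.
    Gemini gemini = new Gemini("gemini-2.0-flash", apiClient);
  }
}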
Gemini
public Gemini(String modelName, String apiKey)
Constructs a new Gemini instance with a Google Gemini API key.
Parameters:
modelName - The name of the Gemini model to use (e.g., "gemini-2.0-flash").
apiKey - The Google Gemini API key to use for API calls.
Gemini
public Gemini(String modelName, VertexCredentials vertexCredentials)
Constructs a new Gemini instance with Vertex AI credentials.
Parameters:
modelName - The name of the Gemini model to use (e.g., "gemini-2.0-flash").
vertexCredentials - The Vertex AI credentials to access the Gemini model.
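As a sketch of Vertex-backed construction: the VertexCredentials builder and its project/location setters shown below are assumptions rather than a confirmed API, so consult that class for the exact surface.

import com.google.adk.models.Gemini;
import com.google.adk.models.VertexCredentials;

public class GeminiFromVertex {
  public static void main(String[] args) {
    // Assumption: VertexCredentials exposes a builder with project and
    // location setters; the names below are illustrative.
    VertexCredentials credentials = VertexCredentials.builder()
        .project("my-gcp-project")   // hypothetical project id
        .location("us-central1")     // hypothetical region
        .build();

    Gemini gemini = new Gemini("gemini-2.0-flash", credentials);
  }
}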
Method Details
builder
public static Gemini.Builder builder()
Returns a new Builder instance for constructing Gemini objects. Note that when building a Gemini object, at least one of apiKey, vertexCredentials, or an explicit apiClient must be set. If multiple are set, the explicit apiClient will take precedence.
Returns:
A new Gemini.Builder.
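As a usage sketch, assuming the builder's setter names mirror the fields mentioned in the note above (modelName, apiKey, vertexCredentials, apiClient):

import com.google.adk.models.Gemini;

public class GeminiBuilderExample {
  public static void main(String[] args) {
    // Assumption: setter names follow the fields named in the javadoc note;
    // only one of apiKey/vertexCredentials/apiClient needs to be set.
    Gemini gemini = Gemini.builder()
        .modelName("gemini-2.0-flash")
        .apiKey(System.getenv("GOOGLE_API_KEY"))
        .build();
  }
}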
generateContent
public io.reactivex.rxjava3.core.Flowable<LlmResponse> generateContent(LlmRequest llmRequest, boolean stream)
Description copied from class: BaseLlm
Generates one content from the given LLM request and tools.
Specified by:
generateContent in class BaseLlm
Parameters:
llmRequest - The LLM request containing the input prompt and parameters.
stream - A boolean flag indicating whether to stream the response.
Returns:
A Flowable of LlmResponses. For non-streaming calls, it will only yield one LlmResponse. For streaming calls, it may yield more than one LlmResponse, but all yielded LlmResponses should be treated as one content by merging their parts.
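A minimal non-streaming call might look like the following sketch. It uses the API-key constructor described above; the LlmRequest builder method `contents` is an assumption, not verified against the source.

import com.google.adk.models.Gemini;
import com.google.adk.models.LlmRequest;
import com.google.genai.types.Content;
import com.google.genai.types.Part;
import java.util.List;

public class GenerateContentExample {
  public static void main(String[] args) {
    Gemini gemini = new Gemini("gemini-2.0-flash", System.getenv("GOOGLE_API_KEY"));

    // Assumption: LlmRequest is built from genai Content values; the
    // builder method name `contents` is illustrative.
    LlmRequest request = LlmRequest.builder()
        .contents(List.of(Content.fromParts(Part.fromText("Say hello."))))
        .build();

    // stream=false: the Flowable yields exactly one LlmResponse.
    gemini.generateContent(request, false)
        .blockingForEach(response -> System.out.println(response));
  }
}

With stream=true, the same call may emit several partial responses whose parts should be merged into a single content, as described above.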
connect
public BaseLlmConnection connect(LlmRequest llmRequest)
Description copied from class: BaseLlm
Creates a live connection to the LLM.
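A sketch of a live session follows. The BaseLlmConnection methods used (sendContent, receive, close) and the live-capable model name are assumptions; consult BaseLlmConnection for the actual surface.

import com.google.adk.models.BaseLlmConnection;
import com.google.adk.models.Gemini;
import com.google.adk.models.LlmRequest;
import com.google.genai.types.Content;
import com.google.genai.types.Part;

public class ConnectExample {
  public static void main(String[] args) {
    // Assumption: a live-capable model name; substitute whatever your
    // deployment supports.
    Gemini gemini = new Gemini("gemini-2.0-flash-live-001", System.getenv("GOOGLE_API_KEY"));

    // Open a persistent bidirectional connection for the request.
    BaseLlmConnection connection = gemini.connect(LlmRequest.builder().build());

    // Assumption: Rx-based send/receive plus a blocking close(); the
    // method names below are illustrative.
    connection.sendContent(Content.fromParts(Part.fromText("Hello."))).blockingAwait();
    connection.receive().take(1).blockingForEach(response -> System.out.println(response));
    connection.close();
  }
}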