The parameters for creating a Gemini instance.
Readonly model: the name of the Gemini model to use.
Static Readonly supported: A list of model name patterns that are supported by this LLM.
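A minimal construction sketch, assuming a parameters object that carries the model name and a static readonly list of supported model-name patterns; the class below is an illustrative stand-in, not the library's actual definition.

```typescript
// Illustrative stand-in only; names, defaults, and patterns are assumptions.
interface GeminiParams {
  model: string; // name of the Gemini model to use
}

class GeminiSketch {
  // Static readonly list of model-name patterns this integration accepts.
  static readonly supported: RegExp[] = [/^gemini-.*/];

  readonly model: string;

  constructor(params: GeminiParams) {
    this.model = params.model;
  }
}

const llm = new GeminiSketch({ model: "gemini-2.0-flash" });
console.log(GeminiSketch.supported.some((p) => p.test(llm.model))); // true
```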
Connects to the Gemini model and returns an LLM connection.
Parameter: LlmRequest, the request to send to the Gemini model.
Returns: BaseLlmConnection, the connection to the Gemini model.
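A hedged sketch of the connection flow: the request and connection shapes below are simplified stand-ins for LlmRequest and BaseLlmConnection, and the function name is assumed.

```typescript
// Simplified stand-ins; the real LlmRequest and BaseLlmConnection differ.
interface LlmRequestSketch {
  contents: { role: string; text: string }[];
}

interface BaseLlmConnectionSketch {
  send(text: string): Promise<void>;
  close(): Promise<void>;
}

// Assumed flow: the request seeds a live session, and the returned
// connection object is used for subsequent exchanges with the model.
function connectSketch(request: LlmRequestSketch): BaseLlmConnectionSketch {
  const history = [...request.contents];
  return {
    async send(text) {
      history.push({ role: "user", text }); // forwarding to the model omitted
    },
    async close() {
      // Tearing down the live session omitted.
    },
  };
}

const connection = connectSketch({ contents: [{ role: "user", text: "Hi" }] });
connection.send("Tell me more").then(() => connection.close());
```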
Sends a request to the Gemini model.
Parameters: LlmRequest, the request to send to the Gemini model; bool = false, whether to make a streaming call.
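A sketch of how the streaming flag might change the call's behavior, assuming an async-generator style API; the function name, response shape, and placeholder output are illustrative only.

```typescript
// Assumed response shape for illustration only.
interface LlmResponseSketch {
  text: string;
  partial?: boolean;
}

// With stream = false (the default) one aggregated response is yielded;
// with stream = true, partial chunks are yielded as they arrive.
async function* generateContentSketch(
  prompt: string,
  stream: boolean = false,
): AsyncGenerator<LlmResponseSketch> {
  const chunks = [`Echo: ${prompt}`, " ...", " done."]; // placeholder output
  if (!stream) {
    yield { text: chunks.join("") };
    return;
  }
  for (const chunk of chunks) {
    yield { text: chunk, partial: true };
  }
}

// Callers iterate the generator either way; streaming simply yields more items.
(async () => {
  for await (const response of generateContentSketch("Hi", true)) {
    console.log(response.text);
  }
})();
```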
Appends a user content so that the model can continue to output.
Parameter: LlmRequest, the request to send to the LLM.
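A sketch of the append-a-user-content step: if the request does not end with a user turn, a short continuation message is appended so the model has something to respond to. The content shape and the wording of the appended message are assumptions.

```typescript
// Simplified stand-in for the request type named above.
interface ContentSketch {
  role: "user" | "model";
  text: string;
}

interface RequestSketch {
  contents: ContentSketch[];
}

// If the conversation does not end with a user turn, append one so the
// model can continue producing output. The appended wording is assumed.
function maybeAppendUserContent(request: RequestSketch): void {
  const last = request.contents[request.contents.length - 1];
  if (!last || last.role !== "user") {
    request.contents.push({
      role: "user",
      text: "Continue processing the previous request as instructed.",
    });
  }
}

const request: RequestSketch = {
  contents: [{ role: "model", text: "Partial answer..." }],
};
maybeAppendUserContent(request);
console.log(request.contents.length); // 2
```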
Integration for Gemini models.