LlmResponse: LLM response class that provides the first candidate response from the model if available; otherwise, it carries the error code and message.

All properties are optional:

content: The content of the response.
customMetadata: Custom key-value metadata used to label the LlmResponse. NOTE: the entire object must be JSON-serializable.
errorCode: Error code if the response is an error. Codes vary by model.
errorMessage: Error message if the response is an error.
finishReason: The finish reason of the response.
groundingMetadata: The grounding metadata of the response.
inputTranscription: Audio transcription of the user's input.
interrupted: Flag indicating that the LLM was interrupted while generating the content, usually due to a user interruption during bidirectional (bidi) streaming.
liveSessionResumptionUpdate: The session resumption update of the LlmResponse.
outputTranscription: Audio transcription of the model's output.
partial: Indicates whether the text content is part of an unfinished text stream. Only used in streaming mode and when the content is plain text.
turnComplete: Indicates whether the response from the model is complete. Only used in streaming mode.
usageMetadata: The usage metadata of the LlmResponse.
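The shape above can be sketched as a TypeScript interface. This is a minimal, hypothetical sketch: the field names are reconstructed from the descriptions, the `Content` type is simplified to a bare parts list, and the actual SDK definition may differ.

```typescript
// Simplified stand-in for the SDK's Content type (assumption, not the real definition).
interface Content {
  parts: { text?: string }[];
}

// Hypothetical interface mirroring the documented fields; names are inferred.
interface LlmResponse {
  content?: Content;
  customMetadata?: Record<string, unknown>; // must be JSON-serializable
  errorCode?: string;
  errorMessage?: string;
  finishReason?: string;
  interrupted?: boolean;
  partial?: boolean;
  turnComplete?: boolean;
}

// Illustrative helper: prefer the first candidate's content if available,
// otherwise fall back to the error code and message, as described above.
function describe(resp: LlmResponse): string {
  if (resp.content) {
    return resp.content.parts.map((p) => p.text ?? "").join("");
  }
  return `error ${resp.errorCode ?? "?"}: ${resp.errorMessage ?? "unknown"}`;
}
```

In streaming mode, a consumer would typically accumulate chunks while `partial` is true and treat `turnComplete` as the signal that the model's turn has finished.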