capacitor-plugin-gemini-x

Capacitor plugin to invoke Google's Gemini AI models on Android, iOS and Web.

Install

npm install capacitor-plugin-gemini-x
npx cap sync

API

initModel(...)

initModel(args: { params: ModelParams; }) => Promise<void>

Initialize the model with the given parameters.

| Param | Type                     |
| ----- | ------------------------ |
| args  | { params: ModelParams; } |
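
For example, a minimal initialisation sketch (assuming the plugin object is exported as GeminiX from the package root, and using placeholder model name and API key values):

```typescript
import { GeminiX } from 'capacitor-plugin-gemini-x';

// Initialise the model once before calling any other plugin methods.
export async function setupModel(): Promise<void> {
  await GeminiX.initModel({
    params: {
      modelName: 'gemini-1.5-flash', // placeholder model name
      apiKey: 'YOUR_API_KEY',        // placeholder API key
      temperature: 0.7,
      maxOutputTokens: 1024,
    },
  });
}
```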

sendMessage(...)

sendMessage(args: { inputText: string; options?: PluginSendMessageOptions; }) => Promise<GeminiXResponseChunk>

Send a message to the model and return the response.

| Param | Type                                                        |
| ----- | ----------------------------------------------------------- |
| args  | { inputText: string; options?: PluginSendMessageOptions; }  |

Returns: Promise<GeminiXResponseChunk>
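
A simple, non-streaming call might look like the following sketch (same GeminiX export assumption as above):

```typescript
import { GeminiX } from 'capacitor-plugin-gemini-x';

// Send a single prompt and return the full response text.
export async function askModel(prompt: string): Promise<string> {
  const result = await GeminiX.sendMessage({ inputText: prompt });
  return result.response; // GeminiXResponseChunk.response holds the model's text
}
```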


countTokens(...)

countTokens(args: { inputText: string; options?: PluginCountTokensOptions; }) => Promise<GeminiXResponseCount>

Count the number of tokens in the given input text.

| Param | Type                                                        |
| ----- | ----------------------------------------------------------- |
| args  | { inputText: string; options?: PluginCountTokensOptions; }  |

Returns: Promise<GeminiXResponseCount>
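
For example, to check the size of a prompt before sending it (same assumptions as above):

```typescript
import { GeminiX } from 'capacitor-plugin-gemini-x';

// Check how many tokens a prompt would consume before sending it.
export async function promptTokenCount(prompt: string): Promise<number> {
  const result = await GeminiX.countTokens({ inputText: prompt });
  return result.count; // GeminiXResponseCount.count
}
```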


initChat(...)

initChat(args: { chatHistory?: PluginChatHistoryItem[]; }) => Promise<void>

Initialize a chat session with optional chat history.

| Param | Type                                       |
| ----- | ------------------------------------------ |
| args  | { chatHistory?: PluginChatHistoryItem[]; } |
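
A sketch of starting a chat session seeded with prior turns, using the PluginChatHistoryItem fields described below (GeminiX export assumed as above):

```typescript
import { GeminiX } from 'capacitor-plugin-gemini-x';

// Start a chat session, optionally replaying earlier user/model turns.
export async function startChat(): Promise<void> {
  await GeminiX.initChat({
    chatHistory: [
      { isUser: true, text: 'Hello!', images: [] },                     // no images for these turns
      { isUser: false, text: 'Hi! How can I help you today?', images: [] },
    ],
  });
}
```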

sendChatMessage(...)

sendChatMessage(args: { inputText: string; options?: PluginSendMessageOptions; }) => Promise<GeminiXResponseChunk>

Send a message to the model for the current chat session and return the response.

| Param | Type                                                        |
| ----- | ----------------------------------------------------------- |
| args  | { inputText: string; options?: PluginSendMessageOptions; }  |

Returns: Promise<GeminiXResponseChunk>
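
For example, continuing the session started with initChat (same assumptions as above):

```typescript
import { GeminiX } from 'capacitor-plugin-gemini-x';

// Send the next user turn in the current chat session.
export async function askInChat(prompt: string): Promise<string> {
  const result = await GeminiX.sendChatMessage({ inputText: prompt });
  return result.response; // isChat will be true for chat responses
}
```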


countChatTokens(...)

countChatTokens(args: { options?: PluginCountChatTokensOptions; }) => Promise<GeminiXResponseCount>

Count the number of tokens for the current chat session with optional input text and images.

| Param | Type                                         |
| ----- | -------------------------------------------- |
| args  | { options?: PluginCountChatTokensOptions; }  |

Returns: Promise<GeminiXResponseCount>
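
For example, to estimate the size of the current session plus a pending user message (same assumptions as above):

```typescript
import { GeminiX } from 'capacitor-plugin-gemini-x';

// Count tokens for the current chat history plus an optional pending message.
export async function chatTokenCount(pendingText: string): Promise<number> {
  const result = await GeminiX.countChatTokens({
    options: { inputText: pendingText },
  });
  return result.count;
}
```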


getChatHistory()

getChatHistory() => Promise<ModelChatHistoryItem[]>

Get the chat history for the current chat session.

Returns: Promise<ModelChatHistoryItem[]>
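
For example, to log the accumulated turns (ModelChatHistoryItem and ModelChatHistoryPart are described below; GeminiX export assumed as above):

```typescript
import { GeminiX } from 'capacitor-plugin-gemini-x';

// Log the accumulated turns of the current chat session.
export async function logChatHistory(): Promise<void> {
  const history = await GeminiX.getChatHistory();
  for (const item of history) {
    const who = item.isUser ? 'user' : 'model';
    const text = item.parts.map((part) => part.content).join(' ');
    console.log(`${who}: ${text}`);
  }
}
```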


Interfaces

ModelParams

Model parameters to be passed to the initModel function.

| Prop            | Type           | Description                                     |
| --------------- | -------------- | ----------------------------------------------- |
| modelName       | string         | The name of the model to be used.               |
| apiKey          | string         | The API key to be used.                         |
| temperature     | number         | The temperature to be used for generation.      |
| topK            | number         | The topK to be used for generation.             |
| topP            | number         | The topP to be used for generation.             |
| maxOutputTokens | number         | The maximum number of tokens to be generated.   |
| stopSequences   | string[]       | The stop sequences to be used for generation.   |
| safetySettings  | SafetySettings | The safety settings to be used for generation.  |

SafetySettings

Safety settings to be passed to the initModel function.

| Prop                                          | Type               |
| --------------------------------------------- | ------------------ |
| [SafetySettingHarmCategory.HARASSMENT]        | SafetySettingLevel |
| [SafetySettingHarmCategory.HATE_SPEECH]       | SafetySettingLevel |
| [SafetySettingHarmCategory.SEXUALLY_EXPLICIT] | SafetySettingLevel |
| [SafetySettingHarmCategory.DANGEROUS_CONTENT] | SafetySettingLevel |
| [SafetySettingHarmCategory.UNSPECIFIED]       | SafetySettingLevel |
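
For example, a SafetySettings object (this sketch assumes SafetySettingHarmCategory and SafetySettingLevel are exported from the package root):

```typescript
import { SafetySettingHarmCategory, SafetySettingLevel } from 'capacitor-plugin-gemini-x';

// Block harassment and hate speech at medium severity and above; other categories keep their defaults.
// Pass this object as params.safetySettings when calling initModel.
const safetySettings = {
  [SafetySettingHarmCategory.HARASSMENT]: SafetySettingLevel.MEDIUM_AND_ABOVE,
  [SafetySettingHarmCategory.HATE_SPEECH]: SafetySettingLevel.MEDIUM_AND_ABOVE,
};
```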

GeminiXResponseChunk

Model response data passed back by the sendMessage and sendChatMessage functions. Also passed to event handlers registered on the window object for the GeminiXResponseChunkEvent.

| Prop     | Type    |
| -------- | ------- |
| response | string  |
| isChat   | boolean |

PluginSendMessageOptions

| Prop           | Type           | Description |
| -------------- | -------------- | ----------- |
| images         | GeminiXImage[] | List of images to be given to the model. |
| streamResponse | boolean        | Whether to stream the response from the model before the final response is received. If true, then event listeners registered on the window object for the GeminiXResponseChunkEvent will be called with partial responses until the final response is received. The final response will be the full model response text. Default is false. |
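
A sketch of consuming a streamed response; this assumes the partial chunks are delivered as window events named GeminiXResponseChunkEvent carrying a GeminiXResponseChunk payload on the event detail (the exact event wiring may differ):

```typescript
import { GeminiX } from 'capacitor-plugin-gemini-x';

// Stream a response: log partial chunks as they arrive, then return the final full text.
export async function askWithStreaming(prompt: string): Promise<string> {
  const onChunk = (event: Event) => {
    // Assumed payload shape: a GeminiXResponseChunk carried on the event's detail.
    const chunk = (event as CustomEvent<{ response: string; isChat: boolean }>).detail;
    console.log('partial response:', chunk.response);
  };
  window.addEventListener('GeminiXResponseChunkEvent', onChunk); // assumed event name

  try {
    const finalResult = await GeminiX.sendMessage({
      inputText: prompt,
      options: { streamResponse: true },
    });
    return finalResult.response; // the final response is the full model response text
  } finally {
    window.removeEventListener('GeminiXResponseChunkEvent', onChunk);
  }
}
```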

GeminiXImage

An image to be given to the model, specified by a URI. The mimeType is optional; if not specified, the plugin will attempt to infer it from the URI.

| Prop     | Type   |
| -------- | ------ |
| uri      | string |
| mimeType | string |
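
For example, passing an image alongside a prompt (the URI shown is a placeholder; GeminiX export assumed as above):

```typescript
import { GeminiX } from 'capacitor-plugin-gemini-x';

// Pass an image alongside a prompt; setting mimeType avoids inference from the URI.
export async function describePhoto(): Promise<string> {
  const photo = {
    uri: 'file:///path/to/photo.jpg', // placeholder URI
    mimeType: 'image/jpeg',
  };
  const result = await GeminiX.sendMessage({
    inputText: 'Describe this photo.',
    options: { images: [photo] },
  });
  return result.response;
}
```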

GeminiXResponseCount

Model response data passed back by the countTokens and countChatTokens functions.

| Prop   | Type    |
| ------ | ------- |
| count  | number  |
| isChat | boolean |

PluginCountTokensOptions

| Prop   | Type           | Description                              |
| ------ | -------------- | ---------------------------------------- |
| images | GeminiXImage[] | List of images to be given to the model. |

PluginChatHistoryItem

| Prop   | Type           | Description                                         |
| ------ | -------------- | --------------------------------------------------- |
| isUser | boolean        | Whether the message is from the user or the model.  |
| text   | string         | The text of the message.                             |
| images | GeminiXImage[] | List of images to be given to the model.             |

PluginCountChatTokensOptions

| Prop      | Type           | Description                                |
| --------- | -------------- | ------------------------------------------ |
| inputText | string         | User input text to be given to the model.  |
| images    | GeminiXImage[] | List of images to be given to the model.   |

ModelChatHistoryItem

A chat history item to be passed to the initChat function.

| Prop   | Type                   | Description                                         |
| ------ | ---------------------- | --------------------------------------------------- |
| isUser | boolean                | Whether the message is from the user or the model.  |
| parts  | ModelChatHistoryPart[] | The parts of the message.                           |

ModelChatHistoryPart

A chat history content part to be passed to the initChat function.

| Prop    | Type   | Description              |
| ------- | ------ | ------------------------ |
| type    | string | The type of the part.    |
| content | string | The content of the part. |

Enums

SafetySettingLevel

| Members          | Value              |
| ---------------- | ------------------ |
| NONE             | "NONE"             |
| ONLY_HIGH        | "ONLY_HIGH"        |
| MEDIUM_AND_ABOVE | "MEDIUM_AND_ABOVE" |
| LOW_AND_ABOVE    | "LOW_AND_ABOVE"    |
| UNSPECIFIED      | "UNSPECIFIED"      |