# capacitor-plugin-gemini-x

Capacitor plugin to invoke Google's Gemini AI models on Android, iOS and Web.
## Install

```bash
npm install capacitor-plugin-gemini-x
npx cap sync
```
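Once installed, a one-shot prompt follows the `initModel` / `sendMessage` flow documented below. The sketch stubs the plugin object locally with the documented method shapes so it runs without the native layer; the `GeminiX` export name and the `"gemini-pro"` model name are assumptions, not taken from this README:

```typescript
// In a real app: import { GeminiX } from 'capacitor-plugin-gemini-x';
// Here GeminiX is a local stub (assumed export name) with the documented
// method shapes, so the snippet is self-contained and runnable.
interface GeminiXResponseChunk { response: string; isChat: boolean; }

const GeminiX = {
  async initModel(_args: { params: { modelName: string; apiKey: string } }): Promise<void> {
    // On device this invokes the native layer.
  },
  async sendMessage(args: { inputText: string }): Promise<GeminiXResponseChunk> {
    // Stubbed echo standing in for the model response.
    return { response: `echo: ${args.inputText}`, isChat: false };
  },
};

async function ask(question: string): Promise<string> {
  await GeminiX.initModel({ params: { modelName: "gemini-pro", apiKey: "YOUR_API_KEY" } });
  const chunk = await GeminiX.sendMessage({ inputText: question });
  return chunk.response;
}
```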
## API

- `initModel(...)`
- `sendMessage(...)`
- `countTokens(...)`
- `initChat(...)`
- `sendChatMessage(...)`
- `countChatTokens(...)`
- `getChatHistory()`
- Interfaces
- Enums
### initModel(...)

```typescript
initModel(args: { params: ModelParams; }) => Promise<void>
```

Initialize the model with the given parameters.

| Param | Type |
| --- | --- |
| `args` | `{ params: ModelParams; }` |
### sendMessage(...)

```typescript
sendMessage(args: { inputText: string; options?: PluginSendMessageOptions; }) => Promise<GeminiXResponseChunk>
```

Send a message to the model and return the response.

| Param | Type |
| --- | --- |
| `args` | `{ inputText: string; options?: PluginSendMessageOptions; }` |

**Returns:** `Promise<GeminiXResponseChunk>`
### countTokens(...)

```typescript
countTokens(args: { inputText: string; options?: PluginCountTokensOptions; }) => Promise<GeminiXResponseCount>
```

Count the number of tokens in the given input text.

| Param | Type |
| --- | --- |
| `args` | `{ inputText: string; options?: PluginCountTokensOptions; }` |

**Returns:** `Promise<GeminiXResponseCount>`
### initChat(...)

```typescript
initChat(args: { chatHistory?: PluginChatHistoryItem[]; }) => Promise<void>
```

Initialize a chat session with optional chat history.

| Param | Type |
| --- | --- |
| `args` | `{ chatHistory?: PluginChatHistoryItem[]; }` |
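For example, a chat session can be seeded with earlier turns before the first `sendChatMessage` call. The field names follow the `PluginChatHistoryItem` shape used by `initChat`; the conversation content is purely illustrative:

```typescript
// Sketch: seeding a chat session with prior turns (shapes from this
// plugin's initChat signature; the conversation text is illustrative).
interface GeminiXImage { uri: string; mimeType?: string; }
interface PluginChatHistoryItem { isUser: boolean; text: string; images?: GeminiXImage[]; }

const chatHistory: PluginChatHistoryItem[] = [
  { isUser: true, text: "What is the capital of France?" },
  { isUser: false, text: "The capital of France is Paris." },
];

// On device: await GeminiX.initChat({ chatHistory });
```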
### sendChatMessage(...)

```typescript
sendChatMessage(args: { inputText: string; options?: PluginSendMessageOptions; }) => Promise<GeminiXResponseChunk>
```

Send a message to the model for the current chat session and return the response.

| Param | Type |
| --- | --- |
| `args` | `{ inputText: string; options?: PluginSendMessageOptions; }` |

**Returns:** `Promise<GeminiXResponseChunk>`
### countChatTokens(...)

```typescript
countChatTokens(args: { options?: PluginCountChatTokensOptions; }) => Promise<GeminiXResponseCount>
```

Count the number of tokens for the current chat session, with optional input text and images.

| Param | Type |
| --- | --- |
| `args` | `{ options?: PluginCountChatTokensOptions; }` |

**Returns:** `Promise<GeminiXResponseCount>`
### getChatHistory()

```typescript
getChatHistory() => Promise<ModelChatHistoryItem[]>
```

Get the chat history for the current chat session.

**Returns:** `Promise<ModelChatHistoryItem[]>`
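Putting the chat methods together, a session might flow as in the sketch below. The plugin object is stubbed locally (with the documented method shapes) so the flow is runnable without the native layer; the `GeminiX` export name and the stub's history-tracking behaviour are assumptions for illustration:

```typescript
interface GeminiXResponseChunk { response: string; isChat: boolean; }
interface ModelChatHistoryPart { type: string; content: string; }
interface ModelChatHistoryItem { isUser: boolean; parts: ModelChatHistoryPart[]; }

// Local stub standing in for the native plugin (assumed export name GeminiX).
const history: ModelChatHistoryItem[] = [];
const GeminiX = {
  async initChat(_args: { chatHistory?: unknown[] }): Promise<void> {
    history.length = 0; // fresh session
  },
  async sendChatMessage(args: { inputText: string }): Promise<GeminiXResponseChunk> {
    history.push({ isUser: true, parts: [{ type: "text", content: args.inputText }] });
    const reply = `ok: ${args.inputText}`; // stubbed model reply
    history.push({ isUser: false, parts: [{ type: "text", content: reply }] });
    return { response: reply, isChat: true };
  },
  async getChatHistory(): Promise<ModelChatHistoryItem[]> {
    return history;
  },
};

async function demo(): Promise<number> {
  await GeminiX.initChat({});
  await GeminiX.sendChatMessage({ inputText: "hello" });
  const items = await GeminiX.getChatHistory();
  return items.length; // one user turn + one model turn
}
```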
## Interfaces

#### ModelParams

Model parameters to be passed to the `initModel` function.

| Prop | Type | Description |
| --- | --- | --- |
| `modelName` | `string` | The name of the model to be used. |
| `apiKey` | `string` | The API key to be used. |
| `temperature` | `number` | The temperature to be used for generation. |
| `topK` | `number` | The topK to be used for generation. |
| `topP` | `number` | The topP to be used for generation. |
| `maxOutputTokens` | `number` | The maximum number of tokens to be generated. |
| `stopSequences` | `string[]` | The stop sequences to be used for generation. |
| `safetySettings` | `SafetySettings` | The safety settings to be used for generation. |
#### SafetySettings

Safety settings to be passed to the `initModel` function.

| Prop | Type |
| --- | --- |
| `[SafetySettingHarmCategory.HARASSMENT]` | `SafetySettingLevel` |
| `[SafetySettingHarmCategory.HATE_SPEECH]` | `SafetySettingLevel` |
| `[SafetySettingHarmCategory.SEXUALLY_EXPLICIT]` | `SafetySettingLevel` |
| `[SafetySettingHarmCategory.DANGEROUS_CONTENT]` | `SafetySettingLevel` |
| `[SafetySettingHarmCategory.UNSPECIFIED]` | `SafetySettingLevel` |
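As an illustration, a fully populated `ModelParams` value might look like the sketch below. Note that the `SafetySettingHarmCategory` member values are not listed in this README, so the plain string keys used for `safetySettings` here are an assumption; the levels are `SafetySettingLevel` values, and the model name is illustrative:

```typescript
// Sketch of a ModelParams value. ASSUMPTIONS: the safetySettings keys
// presume SafetySettingHarmCategory members map to these strings (not
// documented here), and "gemini-pro" is an illustrative model name.
const params = {
  modelName: "gemini-pro",
  apiKey: "YOUR_API_KEY",
  temperature: 0.4,
  topK: 40,
  topP: 0.95,
  maxOutputTokens: 1024,
  stopSequences: ["END"],
  safetySettings: {
    HARASSMENT: "MEDIUM_AND_ABOVE",
    HATE_SPEECH: "MEDIUM_AND_ABOVE",
    SEXUALLY_EXPLICIT: "ONLY_HIGH",
    DANGEROUS_CONTENT: "ONLY_HIGH",
  },
};

// On device: await GeminiX.initModel({ params });
```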
#### GeminiXResponseChunk

Model response data passed back to the `sendMessage` and `sendChatMessage` functions. Also passed to event handlers registered on the `window` object for the `GeminiXResponseChunkEvent`.

| Prop | Type |
| --- | --- |
| `response` | `string` |
| `isChat` | `boolean` |
#### PluginSendMessageOptions

| Prop | Type | Description |
| --- | --- | --- |
| `images` | `GeminiXImage[]` | List of images to be given to the model. |
| `streamResponse` | `boolean` | Whether to stream the response from the model before the final response is received. If `true`, then event listeners registered on the `window` object for the `GeminiXResponseChunkEvent` will be called with partial responses until the final response is received. The final response will be the full model response text. Default is `false`. |
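When `streamResponse` is `true`, partial `GeminiXResponseChunk` payloads arrive via `window` events before the promise resolves with the full text. A small accumulator for those chunks might look like the sketch below; it assumes each event carries an incremental fragment (this README does not specify whether partials are incremental or cumulative), and the `window` wiring is shown as a comment because it only applies in a browser/WebView context:

```typescript
interface GeminiXResponseChunk { response: string; isChat: boolean; }

// Collects streamed partial responses into the full text.
// ASSUMPTION: each chunk is an incremental fragment, so joining them
// reconstructs the complete response.
function makeChunkCollector() {
  const parts: string[] = [];
  return {
    onChunk(chunk: GeminiXResponseChunk): void { parts.push(chunk.response); },
    text(): string { return parts.join(""); },
  };
}

// In a WebView (sketch; event name per this README):
// const collector = makeChunkCollector();
// window.addEventListener("GeminiXResponseChunkEvent", (ev) =>
//   collector.onChunk((ev as CustomEvent<GeminiXResponseChunk>).detail)
// );
```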
#### GeminiXImage

An image to be given to the model, specified by a URI. The `mimeType` is optional; if not specified, the plugin will attempt to infer it from the URI.

| Prop | Type |
| --- | --- |
| `uri` | `string` |
| `mimeType` | `string` |
#### GeminiXResponseCount

Model response data passed back to the `countTokens` and `countChatTokens` functions.

| Prop | Type |
| --- | --- |
| `count` | `number` |
| `isChat` | `boolean` |
#### PluginCountTokensOptions

| Prop | Type | Description |
| --- | --- | --- |
| `images` | `GeminiXImage[]` | List of images to be given to the model. |
#### PluginChatHistoryItem

| Prop | Type | Description |
| --- | --- | --- |
| `isUser` | `boolean` | Whether the message is from the user or the model. |
| `text` | `string` | The text of the message. |
| `images` | `GeminiXImage[]` | List of images to be given to the model. |
#### PluginCountChatTokensOptions

| Prop | Type | Description |
| --- | --- | --- |
| `inputText` | `string` | User input text to be given to the model. |
| `images` | `GeminiXImage[]` | List of images to be given to the model. |
#### ModelChatHistoryItem

A chat history item to be passed to the `initChat` function.

| Prop | Type | Description |
| --- | --- | --- |
| `isUser` | `boolean` | Whether the message is from the user or the model. |
| `parts` | `ModelChatHistoryPart[]` | The parts of the message. |
#### ModelChatHistoryPart

A chat history content part to be passed to the `initChat` function.

| Prop | Type | Description |
| --- | --- | --- |
| `type` | `string` | The type of the part. |
| `content` | `string` | The content of the part. |
## Enums

#### SafetySettingLevel

| Members | Value |
| --- | --- |
| `NONE` | `"NONE"` |
| `ONLY_HIGH` | `"ONLY_HIGH"` |
| `MEDIUM_AND_ABOVE` | `"MEDIUM_AND_ABOVE"` |
| `LOW_AND_ABOVE` | `"LOW_AND_ABOVE"` |
| `UNSPECIFIED` | `"UNSPECIFIED"` |