Vue Gemini Chat is a web application that lets users chat with an AI-powered bot using Google's Gemini API. It is built with Vue.js, Pinia for state management, and Tailwind CSS for styling. The backend is powered by a Cloudflare Worker, which handles user authentication and communication with the Gemini API.
Follow me @elz0xn
- User registration and login
- Chat interface for conversing with the Gemini AI bot
- Markdown support for chat messages (see the rendering sketch after this list)
- Responsive design using Tailwind CSS with DaisyUI
- Cloudflare Workers backend with Cloudflare D1 for storage
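As a rough illustration of the Markdown feature, a chat message could be rendered to HTML with a parser such as `markdown-it`; the library and options actually used by this project may differ:

```js
// Hedged sketch: the Markdown library used by this repo may be different.
// Converts a chat message's Markdown (e.g. a Gemini reply) into HTML for display.
import MarkdownIt from "markdown-it";

const md = new MarkdownIt({ linkify: true, breaks: true });

export function renderMessage(markdownText) {
  return md.render(markdownText);
}
```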
- Clone the repository:

  ```bash
  git clone git@github.com:elsodev/gemini-vue-chat-cloudflare.git
  ```

- Navigate to the project directory:

  ```bash
  cd gemini-vue-chat-cloudflare
  ```

- Install dependencies at the root directory (Vue app) and in `/cloudflare` (Cloudflare Worker):

  ```bash
  npm install
  # or
  yarn
  ```

- Set up the Cloudflare Worker:
  - `/cloudflare/src/index.js` contains the main Worker logic.
  - Create `/cloudflare/wrangler.toml`; this holds the configuration for your Worker, its environment variables, and the D1 database binding.
  - `/cloudflare/schema.sql` is the database schema to run for D1. To apply it to your local D1 database, run:

    ```bash
    yarn run wrangler d1 execute vue_chat --file=./schema.sql
    ```

- In `cloudflare/wrangler.toml`:

  ```toml
  #:schema node_modules/wrangler/config-schema.json
  name = "gemini-chat-worker"
  main = "src/index.js"
  compatibility_date = "2024-05-29"
  compatibility_flags = ["nodejs_compat"]

  [vars]
  GEMINI_API_KEY = "REPLACE_YOUR_GEMINI_KEY"

  [[d1_databases]]
  binding = "DB" # i.e. available in your Worker on env.DB
  database_name = "vue_chat"
  database_id = "REPLACE_YOUR_D1_KEY"
  ```
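For orientation, here is a minimal sketch of what a Worker entry point like `src/index.js` could look like, using the `GEMINI_API_KEY` var and `DB` binding from the `wrangler.toml` above. The route, table name, and model name are assumptions for illustration; the repository's actual code handles authentication and more.

```js
// Minimal sketch only; the real src/index.js handles auth, CORS, and more routes.
export default {
  async fetch(request, env) {
    const url = new URL(request.url);

    if (request.method === "POST" && url.pathname === "/chat") {
      const { message } = await request.json();

      // Example D1 query using the DB binding from wrangler.toml.
      // Table/column names are illustrative; see schema.sql for the real ones.
      await env.DB.prepare("INSERT INTO messages (content) VALUES (?)")
        .bind(message)
        .run();

      // Call the Gemini REST API with the key from [vars] in wrangler.toml.
      const geminiResponse = await fetch(
        `https://generativelanguage.googleapis.com/v1beta/models/gemini-pro:generateContent?key=${env.GEMINI_API_KEY}`,
        {
          method: "POST",
          headers: { "Content-Type": "application/json" },
          body: JSON.stringify({ contents: [{ parts: [{ text: message }] }] }),
        }
      );
      const data = await geminiResponse.json();
      const reply =
        data.candidates?.[0]?.content?.parts?.[0]?.text ?? "(no reply)";

      return Response.json({ reply });
    }

    return new Response("Not found", { status: 404 });
  },
};
```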
You need to start the Vue app and also run the Cloudflare Worker locally (or deploy it to your Cloudflare account).
- Start the Cloudflare Worker locally:

  ```bash
  yarn run wrangler dev --local
  ```

- You will be shown your Worker's local address; copy it and use it as the value of `VUE_APP_CLOUDFLARE_WORKER_URL` in the `.env` file at the root directory (your Vue env). A sketch of how the Vue app might use this value follows these steps.
- Start the development server for the Vue app at the root of the project:

  ```bash
  npm run serve
  ```

- You will be shown the local address where your Vue app is served, typically `http://localhost:8080`.
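On the Vue side, a minimal sketch of how the app might read that variable and call the Worker (the exact endpoint and payload in this repo may differ; the `/chat` route is assumed here):

```js
// Hedged sketch: the actual route and payload shape used by this app may differ.
// Vue CLI exposes variables prefixed with VUE_APP_ from the root .env file via
// process.env, e.g. VUE_APP_CLOUDFLARE_WORKER_URL=http://localhost:8787
// (8787 is wrangler's default local port).
const workerUrl = process.env.VUE_APP_CLOUDFLARE_WORKER_URL;

export async function sendChatMessage(message) {
  // Send the user's message to the Cloudflare Worker, which talks to Gemini.
  const res = await fetch(`${workerUrl}/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ message }),
  });
  const { reply } = await res.json();
  return reply;
}
```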
Contributions are welcome! If you find any issues or have suggestions for improvements, please open an issue or submit a pull request.
This project is licensed under the MIT License.