Welcome to com-chat, the AI suite for professionals who need function, form, simplicity, and speed. Powered by the latest models from 12 vendors and open-source servers, com-chat offers best-in-class Chats, Beams, and Calls with AI personas, visualizations, coding, drawing, side-by-side chatting, and more -- all wrapped in a polished UX.

Stay ahead of the curve with com-chat. 🚀 Pros & Devs love com-chat. 🤖
You can easily configure hundreds of AI models in com-chat:
| AI models | Supported vendors |
|---|---|
| Open-source servers | LocalAI (multimodal) · Ollama · Oobabooga |
| Local servers | LM Studio |
| Multimodal services | Azure · Google Gemini · OpenAI |
| Language services | Anthropic · Groq · Mistral · OpenRouter · Perplexity · Together AI |
| Image services | Prodia (SDXL) |
| Speech services | ElevenLabs (voice synthesis / cloning) |
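Cloud vendors are typically enabled by supplying an API key through environment variables, e.g. in a `.env.local` file, while local servers are pointed at by URL. As a minimal sketch, the variable names below follow common conventions and are assumptions; verify the exact names against the project's configuration docs:

```bash
# .env.local -- sketch only; variable names are assumptions,
# check the project's docs for the exact spelling.
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
GEMINI_API_KEY=...

# Local/open-source servers usually take a base URL instead of a key:
OLLAMA_API_HOST=http://127.0.0.1:11434
```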
Add extra functionality with these integrations:
| More | Integrations |
|---|---|
| Web browse | Browserless · Puppeteer-based |
| Web search | Google CSE |
| Code editors | CodePen · StackBlitz · JSFiddle |
| Sharing | Paste.gg (paste chats) |
| Tracking | Helicone (LLM observability) |
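Integrations are configured the same way, through environment variables. As a hedged sketch (the names below are assumptions, not confirmed by this README), enabling web search with a Google Programmable Search Engine might look like:

```bash
# .env.local -- sketch only; confirm the exact variable names in the project docs.
GOOGLE_CLOUD_API_KEY=...   # Google Cloud API key with the Custom Search API enabled
GOOGLE_CSE_ID=...          # Programmable Search Engine (CSE) ID
```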
To download and run this TypeScript/React/Next.js project locally, the only prerequisite is Node.js with the npm package manager.
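You can confirm both are installed and on your PATH before proceeding (Next.js 14 requires Node.js 18.17 or later):

```bash
node --version   # should print v18.17.0 or newer
npm --version
```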
Clone this repo, install the dependencies (all local), and run the development server (which auto-watches the
files for changes):
```bash
git clone https://github.com/smart-window/com-chat.git
cd com-chat
npm install
npm run dev

# You will see something like:
#
#  ▲ Next.js 14.1.0
#  - Local: http://localhost:3000
#  ✓ Ready in 2.6s
```
The development app will be running on http://localhost:3000. Development builds have the advantage of not requiring a build step, but they can be slower than production builds; they also don't enforce timeouts on edge functions.
The production build of the application is optimized for performance and is produced by the `npm run build` command, after installing the required dependencies.
```bash
# .. repeat the steps above up to `npm install`, then:
npm run build
npx next start --port 3000
```
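The port is only an example; pass a different value to serve elsewhere, e.g. `npx next start --port 8080`. If the project's `package.json` defines a `start` script, `npm run start` is an equivalent way to launch the production server.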
The app will be running on the specified port, e.g. http://localhost:3000.