An AI-powered search engine with a generative UI.
Please note that this repository differs from the official website, morphic.sh. The official website is a fork of this repository with additional features, such as authentication, that are necessary for providing the service online. The core source code of Morphic resides in this repository, and it's designed to be easy to build and deploy. When using Morphic, please keep in mind the different roles of the repository and the website.
- 🧱 Stack
- 🚀 Quickstart
- 🌐 Deploy
- ✅ Verified models
- Enable specifying the model to use (writer agent only)
- Implement chat history functionality
- Develop features for sharing results
- Add video support for search functionality
- Implement Retrieval-Augmented Generation (RAG) support
- Introduce tool support for enhanced productivity
- Expand Generative UI capabilities
- App framework: Next.js
- Text streaming / Generative UI: Vercel AI SDK (see the sketch after this list)
- Generative Model: OpenAI
- Search API: Tavily AI
- Component library: shadcn/ui
- Headless component primitives: Radix UI
- Styling: Tailwind CSS
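To give a feel for how the text-streaming and search pieces of this stack fit together, here is a minimal TypeScript sketch using the Vercel AI SDK and the Tavily REST API. It is an illustration only, not Morphic's actual agent code (which adds generative UI, tool calls, and more), and the Tavily request/response shape shown here is an assumption; check both projects' documentation.

// sketch.ts — minimal illustration of the stack, not Morphic's implementation
import { streamText } from 'ai'
import { openai } from '@ai-sdk/openai'

async function answer(query: string) {
  // 1. Fetch web results from Tavily (request body shape is an assumption; see their docs)
  const search = await fetch('https://api.tavily.com/search', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ api_key: process.env.TAVILY_API_KEY, query })
  }).then(res => res.json())

  // 2. Stream a model answer grounded in the search results
  const result = await streamText({
    model: openai(process.env.OPENAI_API_MODEL ?? 'gpt-4-turbo'),
    prompt: `Answer "${query}" using these results:\n${JSON.stringify(search.results)}`
  })

  for await (const delta of result.textStream) {
    process.stdout.write(delta)
  }
}

answer('What is Morphic?')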
Fork the repo to your GitHub account, then run the following commands to clone it, install dependencies, and create your environment file:
git clone git@github.com:[YOUR_GITHUB_ACCOUNT]/morphic.git
cd morphic
bun i
cp .env.local.example .env.local
Your .env.local file should look like this:
# Used to set the base URL path for OpenAI API requests.
# If you need to set a BASE URL, uncomment and set the following:
# OPENAI_API_BASE=
# Used to set the model for OpenAI API requests.
# If not set, the default is gpt-4-turbo.
# OPENAI_API_MODEL='gpt-4-turbo'
# OpenAI API key retrieved here: https://platform.openai.com/api-keys
OPENAI_API_KEY=[YOUR_OPENAI_API_KEY]
# Tavily API Key retrieved here: https://app.tavily.com/home
TAVILY_API_KEY=[YOUR_TAVILY_API_KEY]
# A specific model can be set for the writer agent only. It must be compatible with the OpenAI API.
# USE_SPECIFIC_API_FOR_WRITER=true
# SPECIFIC_API_BASE=
# SPECIFIC_API_KEY=
# SPECIFIC_API_MODEL=
Note: This project focuses on Generative UI and requires complex output from LLMs. Currently, it's assumed that the official OpenAI models will be used. Although it's possible to set up other OpenAI-compatible models, we don't guarantee that they'll work.
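For example, to route only the writer agent to one of the verified Groq models listed at the end of this README, the optional variables could be filled in roughly like this (the endpoint URL and model ID are assumptions based on Groq's OpenAI-compatible API; check their documentation for current values):

# Example: use a Groq model for the writer agent only (values are illustrative)
USE_SPECIFIC_API_FOR_WRITER=true
SPECIFIC_API_BASE=https://api.groq.com/openai/v1
SPECIFIC_API_KEY=[YOUR_GROQ_API_KEY]
SPECIFIC_API_MODEL=llama3-70b-8192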
bun dev
You can now visit http://localhost:3000.
Host your own live version of Morphic with Vercel.
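If you prefer the CLI to the one-click deploy, something along these lines should work (assuming the Vercel CLI is available via npx; remember to set OPENAI_API_KEY and TAVILY_API_KEY in your Vercel project settings):

npx vercel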
List of verified models that can be specified for the writer agent.
- Groq
  - LLaMA3 8b
  - LLaMA3 70b