
🔍 AI search engine - self-host with local or cloud LLMs

Primary language: TypeScript · License: Apache-2.0

Farfalle

Open-source AI-powered search engine. (Perplexity Clone)

Run your local LLM (llama3, gemma, mistral, phi3) or use cloud models (Groq/Llama3, OpenAI/gpt-4o)

Demo answering questions with llama3 on my M1 MacBook Pro:

(Video: local-demo.mp4)

Please feel free to contact me on Twitter or create an issue if you have any questions.

💻 Live Demo

farfalle.dev (Cloud models only)

📖 Overview

🛣️ Roadmap

  • Add support for local LLMs through Ollama
  • Docker deployment setup
  • Add support for Searxng, eliminating the need for an external search API
  • Integrate with LiteLLM

🛠️ Tech Stack

Features

  • Search with multiple search providers (Tavily, Searxng)
  • Answer questions with cloud models (OpenAI/gpt-4o, OpenAI/gpt-3.5-turbo, Groq/Llama3)
  • Answer questions with local models (llama3, mistral, gemma, phi3)

🏃🏿‍♂️ Getting Started Locally

Prerequisites

  • Docker
  • Ollama (if running local models)
    • Download any of the supported models: llama3, mistral, gemma, phi3
    • Start the Ollama server: ollama serve

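As a sketch, downloading a supported model and starting the Ollama server might look like this (llama3 is used as an example; any of the models listed above works):

```shell
# Pull one of the supported models (llama3 shown here)
ollama pull llama3

# Start the Ollama server (listens on http://localhost:11434 by default)
ollama serve
```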
Get API Keys (Tavily if using Tavily search; OpenAI/Groq if using cloud models)

1. Clone the Repo

git clone git@github.com:rashadphz/farfalle.git
cd farfalle

2. Add Environment Variables

touch .env

Add the following variables to the .env file:

Search Provider

You can use Tavily or Searxng as the search provider.

Tavily (Requires API Key)

TAVILY_API_KEY=...
SEARCH_PROVIDER=tavily

Searxng (No API Key Required)

SEARCH_PROVIDER=searxng

Optional

# Cloud Models
OPENAI_API_KEY=...
GROQ_API_KEY=...
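Putting the pieces together, a complete .env for a Searxng setup with optional cloud models might look like this (the key values are placeholders):

```shell
# .env — sketch assuming Searxng as the search provider
SEARCH_PROVIDER=searxng

# Optional: only needed for cloud-model answers (placeholder keys)
OPENAI_API_KEY=sk-...
GROQ_API_KEY=gsk_...
```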

3. Run Containers

This requires Docker Compose version 2.22.0 or later.

docker-compose -f docker-compose.dev.yaml up -d

Visit http://localhost:3000 to view the app.

For custom setup instructions, see custom-setup-instructions.md

🚀 Deploy

Backend

Deploy to Render

After the backend is deployed, copy the web service URL to your clipboard. It should look something like: https://some-service-name.onrender.com.

Frontend

Use the copied backend URL in the NEXT_PUBLIC_API_URL environment variable when deploying with Vercel.
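For example, assuming the backend URL from the previous step, the Vercel environment variable would look like this (the URL is a placeholder for your own Render service):

```shell
# Vercel project → Settings → Environment Variables (placeholder URL)
NEXT_PUBLIC_API_URL=https://some-service-name.onrender.com
```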

Deploy with Vercel

And you're done! 🥳

Use Farfalle as a Search Engine

To use Farfalle as your default search engine, follow these steps:

  1. Open your browser's settings
  2. Go to 'Search Engines'
  3. Create a new search engine entry using this URL: http://localhost:3000/?q=%s.
  4. Add the search engine.
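The %s in that URL is the standard placeholder the browser substitutes with your (URL-encoded) query. A quick sketch of the substitution, using an example query:

```shell
# Substitute an already URL-encoded query for %s, as the browser does
echo "http://localhost:3000/?q=%s" | sed "s/%s/open%20source%20llm/"
# → http://localhost:3000/?q=open%20source%20llm
```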