Tired of waiting on slow chat completions? Faster.chat is an open-source, blazingly fast chat UI powered by Groq and Llama 3.
*(Demo video: demo.mp4)*
- 🏃‍♂️ Lightning-fast response times
- 🔧 Easy setup and configuration
- 🌐 Powered by Groq & Llama 3
- Clone the repository:

  ```bash
  git clone https://github.com/lucasastorian/faster
  cd faster
  ```

- Copy the example environment file:

  ```bash
  cp src/environments/environment.example.ts src/environments/environment.ts
  ```

- Add your Groq API key to `src/environments/environment.ts`.

- Install the required dependencies:

  ```bash
  npm install
  ```

- Serve the application:

  ```bash
  ng serve -o
  ```

The application will be available at `http://localhost:4200`.
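After copying the example file, the edited `src/environments/environment.ts` might look like the sketch below. The exact shape is an assumption — the `groqApiKey` field name and the `production` flag are illustrative, so mirror whatever fields `environment.example.ts` actually exports:

```typescript
// src/environments/environment.ts — hypothetical shape; match the fields
// that environment.example.ts in this repo actually defines.
export const environment = {
  production: false,            // development build settings
  groqApiKey: 'gsk_your_key_here', // paste your Groq API key here
};
```

Keep this file out of version control if it ever holds a real key — the example/real split exists precisely so the real file can stay local.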
We welcome contributions from the community! If you'd like to contribute to faster.chat, please follow these steps:
- Fork the repository.
- Create a new branch for your feature or bug fix.
- Make your changes and commit them with descriptive messages.
- Push your changes to your forked repository.
- Submit a pull request to the main repository.
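The branch-and-commit steps above can be sketched with plain git commands. For illustration, the snippet below runs in a throwaway local repository (in practice you would clone your fork of `lucasastorian/faster` instead of `git init`; the branch name and file are hypothetical):

```shell
# Throwaway repo standing in for your fork of lucasastorian/faster.
demo=$(mktemp -d) && cd "$demo"
git init -q
git config user.name "Your Name"
git config user.email "you@example.com"

git checkout -qb feature/my-fix             # new branch for your feature or bug fix
echo "proposed fix" > notes.txt             # ...make your changes...
git add notes.txt
git commit -qm "Add notes on proposed fix"  # descriptive commit message

git branch --show-current                   # prints: feature/my-fix
# Next: `git push origin feature/my-fix` to your fork, then open a
# pull request against the main repository on GitHub.
```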
Please ensure that your code adheres to our coding standards and includes appropriate tests.
faster.chat is released under the Apache 2.0 License.
- Groq for providing the high-performance cloud infrastructure.
- Meta AI for developing the Llama 3 language model.
- Angular for the powerful web development framework.
If you have any questions, suggestions, or feedback, please feel free to reach out to us at lucasastorian@gmail.com.
Happy chatting! 😊