An open source Claude Artifacts – generate small apps with one prompt. Powered by Llama 3.1 405B & Together.ai.
- Llama 3.1 405B from Meta for the LLM
- Together AI for LLM inference
- Sandpack for the code sandbox
- Next.js app router with Tailwind
- Helicone for observability
- Plausible for website analytics
- Clone the repo: `git clone https://github.com/Nutlope/llamacoder`
- Create a `.env` file and add your Together AI API key: `TOGETHER_API_KEY=`
- Run `npm install` and `npm run dev` to install dependencies and run locally
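Once the key is set, the app uses Together AI for inference. The exact server code isn't shown here, but a minimal sketch of a chat completion call against Together's OpenAI-compatible endpoint could look roughly like this (the model id, system prompt, and endpoint are assumptions, not taken from the repo):

```ts
// Minimal sketch of a Together AI chat completion call (assumed endpoint and
// model id; the actual LlamaCoder server code may differ).
async function generateApp(prompt: string): Promise<string> {
  const res = await fetch("https://api.together.xyz/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.TOGETHER_API_KEY}`,
    },
    body: JSON.stringify({
      model: "meta-llama/Meta-Llama-3.1-405B-Instruct-Turbo",
      messages: [
        { role: "system", content: "You are an expert React engineer. Return a single self-contained React component." },
        { role: "user", content: prompt },
      ],
    }),
  });
  const data = await res.json();
  // The generated app code comes back as the assistant message content.
  return data.choices[0].message.content;
}
```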
- Look into a way to export/deploy the app in a single click. Could try to do it myself with a dynamic route + some hashing (see the sketch after this list), or use Replit/Vercel
- Save previous versions so people can go back and forth between the generated ones
- Could be nice to show a "featured apps" route on the site at `/featured`. Have an `/id/${prompt}` dynamic route that can display a bunch of nice example apps in the sandbox, ready to go
- Support more languages, starting with Python; check out E2B
- Try finetuning a smaller model on good prompts from 405b or GPT-4/Claude
- Add dark mode to the site overall, nice design change
- Think about how to have 405B correct itself (sometimes it makes up imports)
- New route for updateCode that only sends the latest generated code + the modify request
- Fix bug where if a user edits the code, then asks for a change, it doesn't use the edited code
- Add rate limiting with Upstash Redis if traffic gets too high (see the sketch after this list)
- Try to add a consistent component library like shadcn
- Apply code diffs directly instead of asking the model to generate the code from scratch
- Add the ability to upload things like a screenshot so the generation can start from it
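For the single-click export/share idea above, one possible shape is a route handler that hashes the generated code, stores it, and returns a shareable URL. This is purely a sketch: the route path, the in-memory store, and the `/share/[id]` page are hypothetical stand-ins.

```ts
// app/api/share/route.ts: hypothetical sketch of a hash-based share route.
// Storage is stubbed with an in-memory Map; a real version would use a DB or KV store.
import { createHash } from "crypto";
import { NextResponse } from "next/server";

const store = new Map<string, string>(); // stand-in for persistent storage

export async function POST(req: Request) {
  const { code } = await req.json();
  // Hash the generated code to get a short, stable id for the share URL.
  const id = createHash("sha256").update(code).digest("hex").slice(0, 10);
  store.set(id, code);
  // A /share/[id] page could then load the stored code and render it in Sandpack.
  return NextResponse.json({ url: `/share/${id}` });
}
```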
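For the rate-limiting item, a minimal sketch with `@upstash/ratelimit` and `@upstash/redis` might look like the following (the limits, the per-IP identifier, and how it wires into the generation route are placeholder choices; `Redis.fromEnv()` expects `UPSTASH_REDIS_REST_URL` and `UPSTASH_REDIS_REST_TOKEN` to be set):

```ts
// Hypothetical sketch of per-IP rate limiting for the generation endpoint.
import { Ratelimit } from "@upstash/ratelimit";
import { Redis } from "@upstash/redis";
import { NextResponse } from "next/server";

// Allow 10 generations per minute per IP (placeholder numbers).
const ratelimit = new Ratelimit({
  redis: Redis.fromEnv(),
  limiter: Ratelimit.slidingWindow(10, "1 m"),
});

export async function POST(req: Request) {
  const ip = req.headers.get("x-forwarded-for") ?? "anonymous";
  const { success } = await ratelimit.limit(ip);
  if (!success) {
    return NextResponse.json({ error: "Too many requests" }, { status: 429 });
  }
  // ...continue with the normal code-generation request handling
  return NextResponse.json({ ok: true });
}
```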