Issues
- React dependency installation error (#466, opened by Superskyyy)
- Typescript multiagent templates are missing code (#474, opened by leehuwuj)
- Add support for LlamaCloud in Reflex's template (#468, opened by leehuwuj)
- Bump e2b_code_interpreter to newer version (#388, opened by leehuwuj)
- Release reusable chat components (#382, opened by marcusschiesser)
- CORS is missing in the newly generated frontends (#444, opened by KenjiPcx; see the CORS sketch after this list)
- Use llama-report if LlamaCloud is selected (#465, opened by marcusschiesser)
- Update to the NEXT_QUESTION_PROMPT (#460, opened by slyapustin)
- Optimize generated code (#398, opened by marcusschiesser)
- [Use Case] Add resume matching (#437, opened by marcusschiesser)
- Adding Multi-Modal FunctionCallingAgent (#452, opened by marcusschiesser)
- Support Multimodal RAG from LlamaCloud (#371, opened by marcusschiesser)
- Use auto_route mode for query engine tool (#434, opened by marcusschiesser)
- Question: How to "Using your data" (#448, opened by asindl)
- Add option to deploy to Railway (#429, opened by marcusschiesser)
- Dependency problem when using Azure OpenAI models (#435, opened by kris7ian)
- Pydantic issue with new templates crashing backend (#441, opened by KenjiPcx)
- Make Python templates full-stack (#421, opened by marcusschiesser)
- duplicate messages in the query (#428, opened by mcavdar)
- Use reflex for Python full-stack (#375, opened by marcusschiesser)
- Add support for NextJS RSC (#345, opened by marcusschiesser)
- [Use-Case] Add Form Filling (#362, opened by marcusschiesser)
- LLMAgent doesn't support Ollama (needs LITS update) (#401, opened by joshuacox)
- Error: Maximum update depth exceeded (#397, opened by cognitiveRobot; see the React sketch after this list)
- extractText is not a function (#384, opened by Robing)
- Use chat-ui in NextJS template (#400, opened by marcusschiesser)
- Highlight text of source node in document (#380, opened by marcusschiesser)
- Use webcontainers for code artifacts (#374, opened by marcusschiesser)
- bump llamaindex to latest (#330, opened by marcusschiesser)
- Add simple mode (#360, opened by marcusschiesser)
- Node Sources Missing in UI and RAG Pipeline Unable to Read Vector Database in Release 0.2.11 (#326, opened by klei30)
- npm run generate fails with chromadb (#341, opened by tanmaybhardwaj)
- Show non-code artifacts in artifact view (#344, opened by marcusschiesser)
- Dependency Conflict: llama-index-callbacks-arize-phoenix (^0.1.6) incompatible with llama-index (0.11.6) (#338, opened by ChristenJacquottet)
- I had this issue as well when I deployed on Vercel. I had to update my Next config file to automatically trace the dependency for all API routes (#334, opened by mrsamirr; see the Next.js config sketch after this list)
- Add type-checks for generated TS code (#316, opened by marcusschiesser)
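
For #444, missing CORS is typically resolved on the backend by allowing the frontend's origin. Below is a minimal sketch, assuming an Express backend and a frontend served from http://localhost:3000; the generated project may use a different framework, so treat the route, port, and origin as placeholder assumptions rather than the actual fix shipped for the issue.

```ts
import express from "express";
import cors from "cors";

const app = express();

// Allow browser requests from the frontend origin; without this, the
// browser blocks cross-origin fetches from the generated frontend.
app.use(
  cors({
    origin: process.env.FRONTEND_ORIGIN ?? "http://localhost:3000",
    credentials: true,
  }),
);

// Placeholder route standing in for the generated API endpoints.
app.get("/api/health", (_req, res) => {
  res.json({ ok: true });
});

app.listen(8000, () => console.log("API listening on http://localhost:8000"));
```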
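For #397, the error named in the title has a well-known general cause, even if the specific trigger in that issue may differ: a state update scheduled on every render, usually from a useEffect with a missing or unstable dependency array. A hedged sketch of the pattern and the fix; the component and prop names are made up for illustration.

```tsx
import { useEffect, useState } from "react";

// Hypothetical component; names are illustrative only.
export function MessageList({ raw }: { raw: string }) {
  const [messages, setMessages] = useState<string[]>([]);

  // BAD: with no dependency array the effect runs after every render; each
  // setMessages stores a new array reference, which triggers another render
  // and another effect run, until React throws "Maximum update depth exceeded".
  // useEffect(() => {
  //   setMessages(raw.split("\n"));
  // });

  // OK: re-derive the state only when `raw` actually changes.
  useEffect(() => {
    setMessages(raw.split("\n"));
  }, [raw]);

  return (
    <ul>
      {messages.map((m, i) => (
        <li key={i}>{m}</li>
      ))}
    </ul>
  );
}
```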
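For #334, the reporter's fix (tracing extra dependencies for API routes when deploying to Vercel) maps to Next.js's output file tracing configuration. A minimal sketch, assuming a Next.js version where `outputFileTracingIncludes` is a top-level option (on older versions it lives under `experimental`); the glob paths are placeholders, not the actual files from the issue.

```ts
// next.config.ts
import type { NextConfig } from "next";

const nextConfig: NextConfig = {
  // Bundle these extra files with every /api/* route when Vercel traces
  // the serverless function output, so they exist at runtime.
  outputFileTracingIncludes: {
    "/api/**/*": ["./cache/**/*", "./config/**/*"],
  },
};

export default nextConfig;
```

With a mapping like this in place, `next build` includes the matched files in the traced output for the API routes instead of leaving them behind on the deployed function.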