Issues
Add preproc_fn to Extractor.apply (#74, opened by amaiya, 0 comments)
PDF OCR support (#75, opened by amaiya, 0 comments)
Remove BOS token from default prompt (#67, opened by amaiya, 1 comment)
Add clean function to Extractor.apply (#69, opened by amaiya, 0 comments)
Remove call to persist (#68, opened by amaiya, 0 comments)
Few-Shot classification pipeline (#66, opened by amaiya, 0 comments)
add check for partially downloaded files (#49, opened by amaiya, 0 comments)
change default model to Mistral (#65, opened by amaiya, 0 comments)
information extraction pipeline (#64, opened by amaiya, 0 comments)
experimental support for Azure OpenAI (#63, opened by amaiya, 0 comments)
allow installation of onprem without llama-cpp-python for easier use with LLMs served through REST APIs (#62, opened by amaiya, 0 comments)
Use OnPrem.LLM with OpenAI-compatible REST APIs (#61, opened by amaiya, 0 comments)
Issue warning instead of halting if encountering an error when loading files during ingest (#60, opened by amaiya, 0 comments)
Add ignore_fn argument to LLM.ingest (#58, opened by amaiya, 2 comments; usage sketch after this list)
Curious (#57, opened by agentsimon, 1 comment)
offload_kqv not properly set (#50, opened by amaiya, 0 comments)
support OpenAI models (#55, opened by amaiya, 0 comments)
Accept extra **kwargs in prompt and pass to model (#54, opened by amaiya, 0 comments)
Add `stop` parameter to `LLM.prompt` method (#53, opened by amaiya, 0 comments; usage sketch after this list)
Use Zephyr-7B as default model in Web app (#52, opened by amaiya, 1 comment)
Add prompt_template argument to LLM (#51, opened by amaiya, 0 comments)
add pipeline module (#35, opened by amaiya, 0 comments)
add guardrails module (#34, opened by amaiya, 1 comment)
CPU limits (#48, opened by lystrata, 0 comments)
add python-pptx as dependency (#44, opened by amaiya, 0 comments)
have ingest skip ~$ files created by Windows (#45, opened by amaiya, 0 comments)
add prompt template from YAML to ask in Web app (#47, opened by amaiya, 0 comments)
add progress bar for ingest (#46, opened by amaiya, 0 comments)
"No module named 'docx' error
#43 opened by amaiya - 2
Support for LlamaIndex (#39, opened by nawagner, 2 comments)
Core dumped / segmentation fault (#41, opened by lysa324, 2 comments)
Support for Mistral 7B Model (#40, opened by rabilrbl, 3 comments)
Number of tokens (#38, opened by lysa324, 3 comments)
support for custom metadata in vectorstore (#36, opened by amaiya, 0 comments)
include `prompt_template` in YAML for Web app (#32, opened by amaiya, 0 comments)
Change `LLM.ask` to return dictionary with keys: `answer`, `source_documents`, and `question` (#31, opened by amaiya, 0 comments; usage sketch after this list)
load llm in constructor (#30, opened by amaiya, 0 comments)
round scores in web app (#29, opened by amaiya, 0 comments)
include hyperlinks to sources (#28, opened by amaiya, 0 comments)
use `CallbackManager` (#24, opened by amaiya, 0 comments)
support for custom callbacks (#21, opened by amaiya)
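
Issue #58 above added an `ignore_fn` argument to `LLM.ingest`. A minimal sketch of how such a filter might be wired up, assuming `ignore_fn` is called with each candidate file path and returns `True` for files that should be skipped; the `skip_unwanted` helper and the `./sample_data` path are illustrative, not part of the library:

```python
import os

from onprem import LLM

llm = LLM()  # default model; downloaded on first use

def skip_unwanted(path: str) -> bool:
    """Hypothetical filter: True means the file is excluded from ingestion."""
    name = os.path.basename(path)
    # also covers the Windows Office temp files (~$...) mentioned in issue #45
    return name.startswith("~$") or name.endswith(".tmp")

# Assumes ignore_fn receives each candidate file path (issue #58).
llm.ingest("./sample_data", ignore_fn=skip_unwanted)
```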
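
Issue #53 above added a `stop` parameter to `LLM.prompt`. A hedged sketch, assuming `stop` accepts a list of strings at which generation is cut off:

```python
from onprem import LLM

llm = LLM()

# Assumption: generation halts as soon as any string in `stop` is produced.
answer = llm.prompt(
    "List three open-source LLMs, one per line:",
    stop=["\n\n"],
)
print(answer)
```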
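
Issue #31 above changed `LLM.ask` to return a dictionary with `answer`, `source_documents`, and `question` keys. A sketch of consuming that result, assuming the source documents follow the LangChain `Document` convention of `page_content` and `metadata` (the `./sample_data` path is again illustrative):

```python
from onprem import LLM

llm = LLM()
llm.ingest("./sample_data")  # build the vector store before asking

result = llm.ask("What topics do these documents cover?")
print(result["question"])
print(result["answer"])
for doc in result["source_documents"]:
    # assumed LangChain-style Document: metadata dict plus page_content text
    print(doc.metadata.get("source"), doc.page_content[:80])
```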