Issues
promptlib is broken
#105 opened by renatolfc - 2
Support for phi3 - llama.cpp update
#99 opened by superchargez - 0
pyctrl backtracking is non-idempotent
#93 opened by matthai - 0
token length limit
#98 opened by mmoskal - 0
graceful mid_process() timeout handling
#94 opened by mmoskal - 0
add logLevel and setLogLevel in jsctrl
#77 opened by mmoskal - 0
guidance controller
#91 opened by mmoskal - 0
support native controllers
#90 opened by mmoskal - 1
Exception when no token generated
#88 opened by matthai - 2
confusing error for missing CUDA compute cap 8.0
#87 opened by vargonis - 2
JSON output doesn't conform to pattern
#74 opened by samrat - 0
better error on prompt len exceeded in rllm
#81 opened by mmoskal - 1
can't build pyctrl on WSL (windows 11)
#79 opened by superchargez - 1
duplicate tokens in tokenizers
#78 opened by mmoskal - 1
vLLM integration
#63 opened by emrekiciman - 2
Server crashes with `index out of bounds`
#75 opened by samrat - 3
add TokenSet.num_set/repr in jsctrl
#64 opened by mmoskal - 0
Question about FixedToken (insertion)?
#71 opened by JohnPeng47 - 1
Remove unused parameter `_byte: u8` of `append(&self, state: usize, _byte: u8)` in `controllers/uppercase/src/main.rs`
#69 opened by Raboro - 0
[RFC] drop pre/post callback, only leave mid
#67 opened by mmoskal - 1
add samples to jsctrl readme
#33 opened by mmoskal - 0
have bump.sh update the version field in Cargo.toml
#54 opened by mmoskal - 2
file-format-aware mounted "file directory" virtualized REPL agent / pseudomodel (VM/Docker instance that exposes an OpenAI interface and has access to all mounted storage, including cloud / GitHub / Bitbucket / GitLab)
#60 opened by darvin - 0
update comms.py for new interface
#43 opened by mmoskal - 1
split out common rllm from rllm-cuda
#48 opened by mmoskal - 2
better folder structure
#56 opened by mmoskal - 0
GPU acceleration on Mac M1
#46 opened by mmoskal - 1
add more LR grammars
#58 opened by darvin - 0
allow passing binary data to controllers
#57 opened by mmoskal - 1
aicirt native Windows support
#42 opened by mmoskal - 0
return non 500 on "instantiate" timeout
#52 opened by mmoskal - 1
Warning(s) when starting rLLM-cpp
#50 opened by dluc - 0
llama CPU phi2 model wrong generation
#47 opened by mmoskal - 1
document interface between aicirt and LLM engine
#38 opened by mmoskal - 1
join post+pre into one RPC call to aicirt
#40 opened by mmoskal - 0
support for Python 3.8 in aici command line
#36 opened by mmoskal - 0
introduce new REST API for controller running
#39 opened by mmoskal - 0
allow JSON-encoded "files" to pyctrl/jsctrl
#41 opened by mmoskal - 0
integrate with llama.cpp
#34 opened by mmoskal - 0
rewrite promptlib to use pyctrl not declctrl
#37 opened by mmoskal - 0
don't import numpy in aici cmd line
#32 opened by mmoskal - 0
Allow tagging wasm modules
#29 opened by mmoskal - 0