Issues
Unsupported: '^' from `interegular` when string has `pattern` in JSON schema
#931 opened by ekagra-ranjan - 4
When structuring JSON, generated numbers (floats/integers) are problematic, with a consistent pattern of repeated zeros
#847 opened by timothylimyl - 3
`llamacpp` tests failing in `main`
#922 opened by lapp0 - 0
Historical Benchmark Performance Dashboard
#928 opened by lapp0 - 0
Add link to TGI in documentation
#927 opened by rlouf - 1
Translate Pydantic models into a regular expression that accepts the corresponding YAML
#923 opened by rlouf - 7
Are we able to structure JSON output into a single line with just one whitespace?
#908 opened by timothylimyl - 11
`outlines.generate.choice` generates tokens other than provided choices - special tokens being added to tokenizer incorrectly?
#893 opened by aaronsnoswell - 0
Support Windows 7
#924 opened by castdrink - 0
`json_schema.py`: If schema has an array with no `items` key, the pattern is invalid
#913 opened by lapp0 - 1
`mlx` library integration (via `mlx-lm`)
#918 opened by pngwn - 4
Syntax error when using Optional[Tuple[int, int]]
#905 opened by hugocool - 0
Add a progress bar for compilation
#810 opened by rlouf - 1
"additionalProperties" should not be a required field
#844 opened by sgsdxzy - 0
Rename json.py in outlines/generate
#879 opened by sorgfresser - 2
Pydantic example from README crashes
#841 opened by crclark - 0
Cannot generate parentheses in JSON strings
#838 opened by posionus - 2
Check benchmarks in CI
#883 opened by brandonwillard - 9
Encountering `RuntimeError: Cannot convert token ` �` (29333) to bytes: �` for some model vocabularies when using llama.cpp
#820 opened by abetlen - 1
Outlines generate succeeds but stream fails to follow llama 3 `stop_at` constraint
#896 opened by isamu-isozaki - 2
KeyError in `BetterFSM::FSMInfo` when input FSM `alphabet` contains UTF-8 characters that end with `\xb8\x80`
#833 opened by m0g1cian - 0
pip uninstall doesn't flush the FSM cache
#894 opened by chrsbats - 0
Automate tweeting new releases
#891 opened by rlouf - 3
Logits Processors `Guide` integration will be buggy when `len(tokens) > 1` in a `Write` instruction
#855 opened by br3no - 5
Endless generation bug popped up during migration to `Guide` in vLLM integration
#856 opened by br3no - 3
Disable regex caching in tests
#853 opened by brandonwillard - 0
Stop using `is_final_state` and `final_states`
#885 opened by lapp0 - 8
generate.json() gives ValidationError when run with mistral-7b-instruct-v0.2.Q6_K.gguf
#837 opened by DorotaBjoorn - 0
YAML grammar
#871 opened by eitanturok - 1
Update logits array in-place
#859 opened by brandonwillard - 1
outlines.fsm.guide circular reference
#843 opened by pj-ml - 2
Can't load a local model with llama.cpp: "repo id must be a string" / "model path does not exist"
#834 opened by julian-passebecq - 1
Remove `vocabulary` attribute of `RegexGuide`
#828 opened by rlouf - 0
[Bug]: Disk I/O Error when using tools due to shared outlines cache database
#827 opened by AaronFriel - 1
Make `torch` dependency optional
#813 opened by rlouf - 1
Add a cookbook on how to send batch requests with vLLM
#811 opened by rlouf - 0
Documentation of interleaving support in generation / infilling / transformers support?
#812 opened by randomcodelookup - 0
Add an integration with HF's pipelines
#809 opened by rlouf - 0
Update the `mamba` integration
#808 opened by rlouf - 0
Update the `exllamav2` integration
#807 opened by rlouf