smheidrich/py-json-stream-rs-tokenizer

Integer type limitations should be fixed or documented

smheidrich opened this issue

As pointed out in daggaz/json-stream#24, a ValueError is raised if integers larger than 2**63 - 1 are encountered:

>>> from json_stream_rs_tokenizer import load
>>> from io import StringIO
>>> list(load(StringIO(f"[{2**63-1}]"))) # works
>>> list(load(StringIO(f"[{2**63}]"))) # raises:
...
ValueError: Error while parsing at index 20: PyErr { type: <class 'ValueError'>, value: ValueError('number too large to fit in target type'), traceback: None }

This should either be fixed (if it's relatively simple to fix and doesn't impact performance too much) or documented as a limitation.
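For reference, the failing value sits exactly at the upper bound of a signed 64-bit integer (Rust's i64), which matches the "number too large to fit in target type" error. A small sketch (the helper name is hypothetical, not part of the library) that checks whether a Python int would hit this limit:

```python
# i64 range: -2**63 .. 2**63 - 1 (signed 64-bit, as used by Rust's i64)
I64_MIN = -(2**63)
I64_MAX = 2**63 - 1  # 9223372036854775807

def fits_in_i64(n: int) -> bool:
    """Hypothetical helper: True if n is representable as a signed 64-bit int."""
    return I64_MIN <= n <= I64_MAX

print(fits_in_i64(2**63 - 1))  # True  -> parses fine
print(fits_in_i64(2**63))      # False -> triggers the ValueError above
```

Python ints are arbitrary-precision, so the limitation comes purely from the Rust-side integer representation, not from JSON or Python itself.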