smheidrich/py-json-stream-rs-tokenizer

Crashes when iterating over JSON array without external stop condition

smheidrich opened this issue · 1 comment

Like in #17, but with [1] instead of 1:

>>> from io import StringIO
>>> from json_stream_rs_tokenizer import RustTokenizer
>>> r = RustTokenizer(StringIO('[1]'))
>>> [token for token in r]
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 1, in <listcomp>
ValueError: Unexpected end of stream

As with #17, this doesn't seem to happen in practical usage through json-stream, but it should be fixed so that we can e.g. run tests on the tokenizer in isolation.
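For reference, a minimal isolation test along those lines might look like this (a sketch: the test name is mine, and the assumption that iterating to exhaustion should terminate cleanly after the fix is inferred from the issue, not stated in it):

from io import StringIO

from json_stream_rs_tokenizer import RustTokenizer

def test_array_iteration_terminates_without_external_stop_condition():
    # Exhausting the iterator must not raise once the stream is fully
    # consumed; before the fix in #19 this fails with
    # "ValueError: Unexpected end of stream".
    tokens = [token for token in RustTokenizer(StringIO('[1]'))]
    assert tokens  # at minimum the '[', 1, and ']' tokens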

Fixed in #19