Aedial/novelai-api

generate_image returns "novelai_api.NovelAIError.NovelAIError: 200 - OK"

sifue opened this issue · 8 comments

sifue commented

I called generate_image with the following code and got a strange error novelai_api.NovelAIError.NovelAIError: 200 - OK.

Traceback:

Traceback (most recent call last):
  File "/root/opt/app.py", line 82, in message_img
    run(genarate(prompt))
  File "/usr/local/lib/python3.9/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.9/asyncio/base_events.py", line 647, in run_until_complete
    return future.result()
  File "/root/opt/app.py", line 55, in genarate
    async for img in api.high_level.generate_image(prompt, ImageModel.Anime_Curated, preset):
  File "/usr/local/lib/python3.9/site-packages/novelai_api/_high_level.py", line 355, in generate_image
    async for e in self._parent.low_level.generate_image(prompt, model, settings):
  File "/usr/local/lib/python3.9/site-packages/novelai_api/_low_level.py", line 657, in generate_image
    self._treat_response_object(rsp, content, 201)
  File "/usr/local/lib/python3.9/site-packages/novelai_api/_low_level.py", line 48, in _treat_response_object
    raise NovelAIError(rsp.status, str(rsp.reason))
novelai_api.NovelAIError.NovelAIError: 200 - OK

Code:

async with API() as api_handler:
    api = api_handler.api
    preset = ImagePreset()
    preset["n_samples"] = 1
    preset["resolution"] = (512, 512)
    preset["quality_toggle"] = False
    async for img in api.high_level.generate_image(prompt, ImageModel.Anime_Curated, preset):
        with open(GENERATED_FILEPATH, "wb") as f:
            f.write(img)

The entire code is at the following URL:
https://github.com/sifue/novelai-slackbot/tree/02419d7d735257294478483753486669620cc384

Same here. This is definitely new since yesterday afternoon (UK time), so it is presumably the result of a change on the NovelAI end.

The code which raises the exception is itself called by this bit in _low_level.py:

            # FIXME: check back when normalized
            if "api2" in self._parent.BASE_ADDRESS:
                self._treat_response_object(rsp, content, 200)
            else:
                self._treat_response_object(rsp, content, 201)

I guess from the "FIXME" comment that some instability here was foreseen!
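As a sketch (not the library's actual code), the branch above effectively derives the expected success status from the host, assuming "api2" hosts return 200 on image generation while the legacy host returns 201:

```python
# Illustrative sketch only, assuming the behaviour described above:
# api2 hosts answer 200 (OK), the legacy host answers 201 (Created).
def expected_status(base_address: str) -> int:
    """Return the status code the client should expect for this host."""
    return 200 if "api2" in base_address else 201
```

The error in this issue suggests the server's actual status no longer matches whichever branch the installed version takes.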

I too am experiencing an issue with generate_image. Mine comes back with "'utf-8' codec can't decode byte 0xb4 in position 10: invalid start byte", and I traced the problem to _treat_response_stream in _low_level.py, where it is called by _request. I should also add that I am on an older version of the code, from before the recent updates. I have been updating ImagePreset and _high_level on my own to keep up with changes to the NovelAI API, but I have not touched _low_level, and my issue started around the same time as the one above.
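That decode error is consistent with the server now sending raw PNG bytes where the client still expects text: PNG files begin with the byte 0x89, which is not a valid UTF-8 start byte. A minimal sketch (illustrative only, using the standard PNG signature rather than the actual response body):

```python
# The standard eight-byte PNG file signature. Its first byte, 0x89,
# cannot begin a UTF-8 sequence, so decoding raw PNG data as text
# fails immediately with "invalid start byte".
png_header = b"\x89PNG\r\n\x1a\n"

try:
    png_header.decode("utf-8")
except UnicodeDecodeError as exc:
    error_message = str(exc)

print(error_message)
```

The reported error mentions byte 0xb4 at position 10 rather than 0x89 at position 0, but the mechanism is plausibly the same: binary image data being run through a text decoder.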

I actually had the same issue as clharper42 originally. Then I updated to the latest version of this repository, and got the same error as sifue (OP).

Thanks for the info. Once this issue gets fixed I will update to the newest version.

I have a simple fix. I'm not saying I fully understand the issue, but it looks like NovelAI have changed the format in which they return the PNG file, and also the status code they return.

In _high_level.py, change line 360 from:

yield base64.b64decode(e)

to:

yield e

Then in _low_level.py, change line 39 from:

if rsp.status == status:

to:

if rsp.status == status or rsp.status == 200 or rsp.status == 201:

(This is a bit of a hack, as I'm not quite sure how to determine which status to expect.)
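The status-code part of that workaround can be sketched as a small helper (hedged: this is not the library's actual code, just the logic of the proposed change, assuming 200 and 201 are the only success codes the API has been observed to return):

```python
# Accept the expected status, or either success code the API has been
# observed to return (200 on api2 hosts, 201 elsewhere), since the
# expected code changed server-side without notice.
OBSERVED_SUCCESS_CODES = {200, 201}

def is_success(status: int, expected: int) -> bool:
    """Return True if the response should be treated as successful."""
    return status == expected or status in OBSERVED_SUCCESS_CODES
```

This keeps the existing per-endpoint expectation while tolerating the server switching between the two codes.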

Please note though the other Issue I've just created. "Undesired content" is no longer set properly, so any images you make are likely to be a bit rubbish.

I've identified a fix for the Undesired Content issue as well; please see the other Issue I've opened. It's a one-line fix.

Aedial commented

Fixed by 1b2c7ba
It had indeed been foreseen, but the normalization was not communicated.