Cannot cache examples if the Plot output is large (Gradio 4.26+)

Describe the bug

I'd like to cache a plot result, but when the plot is large (~5 MB), loading the cached example fails with an error.

It appears that the cached result is written to log.csv, and reading that file back fails with _csv.Error: field larger than field limit (131072). A minimal standalone demonstration of that limit is included below.

The error occurs in 4.26, 4.29, and 4.31, but not in 3.41, where the plot result was saved to a JSON file instead. So this looks like a regression.
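
As a minimal sketch of the underlying stdlib behaviour (my assumption about what flagging.py runs into, not Gradio code): Python's csv reader rejects any single field longer than the default field_size_limit of 131072 characters, which a ~5 MB serialized plot easily exceeds, even though writing such a field succeeds.

import csv
import io

# Assumption: this mirrors the read that fails in flagging.py. The csv
# module's default field size limit is 131072, so writing an oversized field
# works, but reading it back raises the error shown in the traceback below.
big_field = "x" * 200_000  # well over the 131072-character default limit
buf = io.StringIO()
csv.writer(buf).writerow([big_field])
buf.seek(0)
try:
    list(csv.reader(buf))
except csv.Error as err:
    print(err)  # field larger than field limit (131072)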

Have you searched existing issues? 🔎

  • I have searched and found no existing issues

Reproduction

import plotly.graph_objects as go
import numpy as np

import gradio as gr

def plot_forecast(final_year):
    fig = go.Figure()

    xv = np.random.rand(100000)
    yv = np.random.rand(100000)
    zv = np.random.rand(100000)

    fig.add_trace(go.Scatter3d(
        x=xv, y=yv, z=zv, showlegend=False,
        mode='text', text=str(final_year)
    ))

    fig.update_layout(height=720)
    return fig


demo = gr.Blocks(title='Plot Issue')

with demo:
    vis_output = gr.Plot(label="forecast")
    year = gr.Slider(2020, 2100, value=2021, step=1)

    gr.Examples(
        examples=[
            [2040],
            [2050]
        ],
        inputs=[year],
        fn=plot_forecast,
        outputs=[vis_output],
        cache_examples='lazy',
        label='Examples'
    )

if __name__ == "__main__":
    demo.launch()

Screenshot

No response

Logs

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/gradio/queueing.py", line 566, in process_events
    response = await route_utils.call_process_api(
  File "/usr/local/lib/python3.10/site-packages/gradio/route_utils.py", line 261, in call_process_api
    output = await app.get_blocks().process_api(
  File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1786, in process_api
    result = await self.call_function(
  File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1350, in call_function
    prediction = await utils.async_iteration(iterator)
  File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 583, in async_iteration
    return await iterator.__anext__()
  File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 576, in __anext__
    return await anyio.to_thread.run_sync(
  File "/usr/local/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
  File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2144, in run_sync_in_worker_thread
    return await future
  File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 851, in run
    result = context.run(func, *args)
  File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 559, in run_sync_iterator_async
    return next(iterator)
  File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 742, in gen_wrapper
    response = next(iterator)
  File "/usr/local/lib/python3.10/site-packages/gradio/helpers.py", line 416, in sync_lazy_cache
    self.cache_logger.flag(output)
  File "/usr/local/lib/python3.10/site-packages/gradio/flagging.py", line 189, in flag
    line_count = len(list(csv.reader(csvfile))) - 1
_csv.Error: field larger than field limit (131072)
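
A possible workaround in user code (an untested sketch; it only lifts the stdlib limit that flagging.py hits and does not address the switch from JSON to CSV caching) is to raise csv.field_size_limit before launching the app:

import csv
import sys

# Raise the csv field size limit as high as the platform allows.
# sys.maxsize can overflow the underlying C long on some platforms,
# hence the shrinking loop.
limit = sys.maxsize
while True:
    try:
        csv.field_size_limit(limit)
        break
    except OverflowError:
        limit //= 10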

System Info

Issue found in 4.26.0, 4.29.0, 4.31.0
Issue not found in 3.41.2

Severity

Blocking usage of gradio