Lightning-Universe/lightning-flash

ImportError: cannot import name 'WarningCache' from 'pytorch_lightning.utilities.warnings'

r-matsuzaka opened this issue · 2 comments

๐Ÿ› Bug

I just want to run the tutorial here.

Environment

  • Ubuntu Desktop 22.04 LTS
  • GPU RTX2080
  • Driver nvidia-driver-510

To Reproduce

  • Dockerfile
FROM nvidia/cuda:11.6.0-cudnn8-devel-ubuntu20.04

# https://serverfault.com/questions/683605/docker-container-time-timezone-will-not-reflect-changes
ENV TZ=Asia/Tokyo
RUN ln -snf /usr/share/zoneinfo/$TZ /etc/localtime && echo $TZ > /etc/timezone

RUN apt-get update && apt-get install -y python3 python3-pip \
&& apt-get install ffmpeg libsm6 libxext6  -y

RUN pip3 install --no-cache-dir \
   jupyterlab \
   lightning-flash[image] \
   icedata \
   icevision
  • docker-compose.yaml
version: "3"
services:
  notebook:
    build: .
    volumes:
      - ".:/home/work"
      - ".jupyter:/root/.jupyter"
    ports:
      - "7777:7777"
    tty: true
    environment:
      - JUPYTER_ENABLE_LAB=yes
      - NVIDIA_VISIBLE_DEVICES=all
      - NVIDIA_DRIVER_CAPABILITIES=all
    deploy:
       resources:
         reservations:
           devices:
             - capabilities:
               - gpu
    command: jupyter lab --ip=0.0.0.0 --port=7777 --allow-root --no-browser --NotebookApp.token=''

Then, run

docker compose up

Connect to localhost:7777 and open up jupyter.

Then, run

from functools import partial

import flash
from flash.core.utilities.imports import example_requires
from flash.image import InstanceSegmentation, InstanceSegmentationData

example_requires("image")

import icedata  # noqa: E402

I got the error

ImportErrorTraceback (most recent call last)
Cell In [2], line 5
3 import flash
4 from flash.core.utilities.imports import example_requires
----> 5 from flash.image import InstanceSegmentation, InstanceSegmentationData
7 example_requires("image")
9 import icedata # noqa: E402

File /usr/local/lib/python3.8/dist-packages/flash/image/__init__.py:1
----> 1 from flash.image.classification import ( # noqa: F401
2 ImageClassificationData,
3 ImageClassificationInputTransform,
4 ImageClassifier,
5 )
6 from flash.image.classification.backbones import IMAGE_CLASSIFIER_BACKBONES # noqa: F401
7 from flash.image.detection.data import ObjectDetectionData # noqa: F401

File /usr/local/lib/python3.8/dist-packages/flash/image/classification/__init__.py:2
1 from flash.image.classification.data import ImageClassificationData, ImageClassificationInputTransform # noqa: F401
----> 2 from flash.image.classification.model import ImageClassifier # noqa: F401

File /usr/local/lib/python3.8/dist-packages/flash/image/classification/model.py:33
25 from flash.core.utilities.imports import requires
26 from flash.core.utilities.types import (
27 INPUT_TRANSFORM_TYPE,
28 LOSS_FN_TYPE,
(...)
31 OPTIMIZER_TYPE,
32 )
---> 33 from flash.image.classification.adapters import TRAINING_STRATEGIES
34 from flash.image.classification.backbones import IMAGE_CLASSIFIER_BACKBONES
35 from flash.image.classification.heads import IMAGE_CLASSIFIER_HEADS

File /usr/local/lib/python3.8/dist-packages/flash/image/classification/adapters.py:26
24 from pytorch_lightning.trainer.states import TrainerFn
25 from pytorch_lightning.utilities.exceptions import MisconfigurationException
---> 26 from pytorch_lightning.utilities.warnings import WarningCache
27 from torch import nn, Tensor
28 from torch.utils.data import DataLoader, IterableDataset, Sampler

ImportError: cannot import name 'WarningCache' from 'pytorch_lightning.utilities.warnings' (/usr/local/lib/python3.8/dist-packages/pytorch_lightning/utilities/warnings.py)
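This kind of `ImportError` usually means the installed `lightning-flash` was built against an older `pytorch_lightning` API than the one pip resolved at install time. A minimal diagnostic sketch (pure standard library; the package list is just the packages mentioned in this report) to gather the versions before filing or debugging:

```python
from importlib.metadata import version, PackageNotFoundError

# Collect the installed versions of the packages involved in the traceback;
# an unpinned "pip install lightning-flash[image]" can pull a newer
# pytorch-lightning than the flash release was written for.
versions = {}
for pkg in ("lightning-flash", "pytorch-lightning", "icevision", "icedata"):
    try:
        versions[pkg] = version(pkg)
    except PackageNotFoundError:
        versions[pkg] = None

for pkg, ver in versions.items():
    print(f"{pkg}: {ver or 'not installed'}")
```

Including this output in the issue answers the "what PL version are you using?" question below directly.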

Borda commented

Can you please list what PL version you are using?
Also, consider using https://hub.docker.com/r/pytorchlightning/pytorch_lightning as the base image.

@Borda
Thank you very much for suggesting the pytorchlightning/pytorch_lightning image! I did not know about it.
I could run the tutorial with no errors using that Docker image.
It worked for me with a single GPU and on CPU, but I got stuck with multiple GPUs... That is a different topic, so I might ask about it in the future.
Anyway, thank you very much!
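The root cause here, a symbol (`WarningCache`) moving between releases of a dependency, is commonly guarded against with a fallback-import pattern. A minimal sketch of that pattern (the helper name `import_from_first` is hypothetical, and the candidate module paths a caller would pass for `WarningCache` are assumptions, not verified `pytorch_lightning` locations):

```python
import importlib


def import_from_first(name, module_paths):
    """Return attribute `name` from the first module path that provides it.

    Tries each candidate module in order, so downstream code keeps working
    when a dependency relocates a symbol between releases.
    """
    for path in module_paths:
        try:
            module = importlib.import_module(path)
            return getattr(module, name)
        except (ImportError, AttributeError):
            continue
    raise ImportError(f"cannot import {name!r} from any of {module_paths}")
```

For example, `import_from_first("OrderedDict", ["some_old_location", "collections"])` falls through the missing module and returns `collections.OrderedDict`. Pinning compatible `lightning-flash` / `pytorch-lightning` versions (or using the suggested base image) remains the simpler fix.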