codeslake/IFAN

Run Inference with CPU

ikhwan12 opened this issue · 2 comments

Hi @codeslake
Thank you for sharing this great and interesting project.
I am testing your code with some blurred input images on Google Colab and it produces great results.
One quick question: it is currently running on a GPU and is fast, around 30–40 ms per image. Just curious, is it possible to run your model inference on a CPU?
Thank you very much.

Hi, @ikhwan12.

I've just updated the code for running on a CPU. Please pull the repo.

To test with CPU, try the following:

python run.py \
    --mode IFAN \
    --network IFAN \
    --config config_IFAN \
    --data DPDD \
    --ckpt_abs_name ckpt/IFAN.pytorch \
    --cpu \
    --data_offset /data_offset \
    --output_offset ./output
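For context, a `--cpu` flag like this usually just switches which device the model and tensors are placed on. Below is a minimal, hypothetical sketch of that pattern; the `select_device` helper and flag wiring are illustrative assumptions, not code from the IFAN repository.

```python
import argparse

def select_device(argv):
    """Return the device string a script would pass to torch.device(...).

    Hypothetical sketch: parses a --cpu flag and falls back to CUDA
    otherwise. A real script would also check torch.cuda.is_available().
    """
    parser = argparse.ArgumentParser()
    parser.add_argument('--cpu', action='store_true',
                        help='force inference on CPU')
    args, _ = parser.parse_known_args(argv)
    # In a PyTorch script, this string would be used as
    # device = torch.device(select_device(sys.argv[1:]))
    # followed by model.to(device) and input.to(device).
    return 'cpu' if args.cpu else 'cuda'

print(select_device(['--cpu']))  # cpu
print(select_device([]))         # cuda
```

Note that CPU inference will be much slower than the 30–40 ms GPU timings mentioned above, since the network runs without CUDA acceleration.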

For training:

python -B run.py \
    --is_train \
    --mode IFAN \
    --config config_IFAN \
    --trainer trainer \
    --network IFAN \
    -b 2 \
    -th 8 \
    -dl \
    -ss \
    -cpu

@codeslake
Thank you for your prompt reply.
I'll pull the repo and try it.

Thank you very much.