roatienza/Deep-Learning-Experiments

How long does it take to run 10K iterations?

speedyray opened this issue · 10 comments

How long does it take to run 10K iterations on a MacBook with a 2.7 GHz Intel Core i5 processor?

I was only at 150 iterations after several hours. Is there a problem with the code or my system, or do I need a GPU?

Did you get your answer?

NVIDIA GeForce GTX 1060 GPU with 6 GB; I am able to get to 1,000 iterations in about 2 minutes.
You will want to make sure you install tensorflow-gpu once you do get your GPU; otherwise, it will still use the CPU.
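A quick way to confirm the GPU build is actually being picked up before training starts (a minimal sketch, assuming a TensorFlow 2.x install; on the older tensorflow-gpu 1.x packages the equivalent check is `tf.test.is_gpu_available()`):

```python
# Minimal check that TensorFlow can see the GPU before training starts.
import tensorflow as tf

gpus = tf.config.list_physical_devices('GPU')
if gpus:
    print("GPU(s) visible to TensorFlow:", gpus)
else:
    print("No GPU found -- training will fall back to the CPU.")
```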

Is it the NVIDIA GeForce GTX 1060, which costs $299? https://www.nvidia.com/en-us/geforce/products/10series/geforce-gtx-1060/
Please provide more details on how you set it up, or a link to the project if you can.
Thanks

Yeah, with 6 GB of memory. The page mentions it's sold out at the moment; Amazon shows only a used option. I am not sure if they don't make it anymore or what.

Also, if it's helpful, I am using an AMD Ryzen 7 2700X 8-core processor.
While running the GAN training, I am at about 56% usage on the GPU and 10% on the CPU, with about 8 GB of RAM in use.

Total Time: 52 minutes
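For what it's worth, one way to watch those utilization numbers while training runs is to poll nvidia-smi from a separate script or terminal (a sketch; nvidia-smi ships with the NVIDIA driver, and the query flags below are standard ones):

```python
# Sketch: print GPU utilization and memory use every 5 seconds while
# training runs in another process. Stop with Ctrl+C.
import subprocess
import time

while True:
    result = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu,memory.used",
         "--format=csv,noheader"],
        capture_output=True, text=True,
    )
    print(result.stdout.strip())
    time.sleep(5)
```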

I would prefer to be able to run 10,000 iterations in about 2-5 minutes. Are there any processors out there that can do this in 2019? Please update. Thanks

Hi daryabiparva,
Please kindly run the experiment again now so we all know.
Google Colab is free to use; there's no need to say sorry when you can use something for free and help others.
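If you do run it again (on Colab or elsewhere), one quick way to project the full 10,000-iteration time is to time a short run and extrapolate (a sketch; `train_step` is a hypothetical stand-in for one training iteration in the notebook):

```python
# Sketch: time a small number of iterations and project the total runtime.
# `train_step` is a placeholder for one GAN training iteration.
import time

def estimate_minutes(train_step, sample_iters=100, target_iters=10_000):
    start = time.perf_counter()
    for _ in range(sample_iters):
        train_step()
    seconds_per_iter = (time.perf_counter() - start) / sample_iters
    return seconds_per_iter * target_iters / 60.0
```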

Best regards,
Thanks