Graphics Card Usage zero
arunabh1904 opened this issue · 0 comments
arunabh1904 commented
Hi. Thank you for this implementation. I am trying to train it on my own dataset for a 10-class semantic segmentation problem. I have managed to train for 10k iterations, reaching a mean IoU of 91% and a pixel-wise accuracy of about 97%, but it took over two days to get there. Training doesn't seem to use the graphics card at all. I have an Nvidia 1080 Ti that I want to train on; since the model doesn't have multi-GPU support yet, I am using just the one card. I am not sure why the GPU isn't being used. I am using TensorFlow 1.8. Could you help me out? Thank you.
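
For reference, here is a quick diagnostic I can run (a minimal sketch using standard TensorFlow 1.x calls, not code from this repo) to check whether TensorFlow actually sees the GPU. If only a CPU device shows up, the CPU-only `tensorflow` package may be installed instead of `tensorflow-gpu`:

```python
# Minimal GPU visibility check for TensorFlow 1.x (diagnostic sketch, not repo code).
import tensorflow as tf
from tensorflow.python.client import device_lib

print(tf.__version__)
# Lists all devices TensorFlow can use; a working setup should include
# an entry like "/device:GPU:0" for the 1080 Ti.
print(device_lib.list_local_devices())
# True only if a CUDA-enabled build is installed and a GPU kernel can be created.
print(tf.test.is_gpu_available())
```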