aws-samples/aws-lambda-docker-serverless-inference

pytorch-inference-docker-lambda return AssertionError using GPU

nelsontseng0704 opened this issue · 1 comments

Hi,

When I play around with the pytorch-inference-docker-lambda example, it runs smoothly with the CPU version of PyTorch. But when I add `.to('cuda')` to move the model and the image to the GPU, I receive the following error message.

{"errorMessage": "\nFound no NVIDIA driver on your system. Please check that you\nhave an NVIDIA GPU and installed a driver from\nhttp://www.nvidia.com/Download/index.aspx", "errorType": "AssertionError", "stackTrace": [" File \"/var/task/app.py\", line 25, in handler\n torch_image = scaled_img.unsqueeze(0).to('cuda')\n", " File \"/var/lang/lib/python3.8/site-packages/torch/cuda/__init__.py\", line 196, in _lazy_init\n _check_driver()\n", " File \"/var/lang/lib/python3.8/site-packages/torch/cuda/__init__.py\", line 98, in _check_driver\n raise AssertionError(\"\"\"\n"]}

Hello @nelsontseng0704, AWS Lambda does not support GPUs, so you can run this example only on the CPU.
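One way to keep a single handler working both locally (where a GPU may exist) and on Lambda (CPU only) is to select the device at runtime instead of hard-coding `'cuda'`. The sketch below is a minimal illustration, not the repository's actual `app.py`: the model and image tensors are stand-ins, and only the `torch.cuda.is_available()` guard is the point being shown.

```python
import torch

# Fall back to CPU when no NVIDIA driver/GPU is present, as is always
# the case inside an AWS Lambda execution environment.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = torch.nn.Linear(4, 2)       # stand-in for the real model
model = model.to(device).eval()

scaled_img = torch.rand(3, 4)       # stand-in for the preprocessed image
# .to(device) resolves to CPU on Lambda, so no AssertionError is raised.
torch_image = scaled_img.unsqueeze(0).to(device)

with torch.no_grad():
    output = model(torch_image)
```

With this guard, the same container image can be tested on a GPU machine and still deploy to Lambda without code changes.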