bweigel/ml_at_awslambda_pydatabln2018

Model deployment on AWS Lambda using GPU?


Hey Benjamin,

I was just watching your PyData tutorial video. Great tutorial, thanks! My question: on my own machine, my model uses a GPU at inference time. Is that also possible on AWS Lambda, and if so, how?

Thanks

GPU-based inference on Lambda is not supported, as far as I know; Lambda only offers CPU-backed execution environments.
If you need GPU-based inference, I suggest using AWS SageMaker instead, which lets you deploy models to GPU instance types.
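
For what it's worth, here is a minimal sketch of deploying a GPU-backed endpoint with the SageMaker Python SDK, assuming a trained PyTorch model packaged to S3. The bucket path, IAM role ARN, and `inference.py` entry point are placeholders, not anything from the tutorial.

```python
# Minimal sketch: deploy a trained PyTorch model to a GPU-backed
# SageMaker real-time endpoint using the SageMaker Python SDK.
# The S3 path, IAM role ARN, and inference.py entry point are placeholders.
import sagemaker
from sagemaker.pytorch import PyTorchModel

session = sagemaker.Session()

model = PyTorchModel(
    model_data="s3://my-bucket/model/model.tar.gz",          # packaged model artifact
    role="arn:aws:iam::123456789012:role/MySageMakerRole",   # SageMaker execution role
    entry_point="inference.py",    # script defining model_fn / predict_fn
    framework_version="1.13",      # PyTorch version of the serving container
    py_version="py39",
    sagemaker_session=session,
)

# ml.g4dn.xlarge is a GPU instance type; Lambda has no GPU option.
predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.g4dn.xlarge",
)

# Invoke the endpoint like a regular HTTPS inference API
# (the default PyTorch predictor sends NumPy-compatible data).
result = predictor.predict([[0.1, 0.2, 0.3]])
print(result)
```

You can still keep the Lambda setup from the tutorial for lightweight CPU inference, and reserve a SageMaker endpoint for models that actually need a GPU.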