aws/sagemaker-huggingface-inference-toolkit

How do I use my own config.properties

aj2622 opened this issue · 2 comments

How can I use my own config.properties with the AWS deep learning inference images (e.g. 763104351884.dkr.ecr.us-east-1.amazonaws.com/huggingface-pytorch-inference:1.10.2-transformers4.17.0-gpu-py38-cu113-ubuntu20.04) that are built on top of this toolkit?

I was trying to do something like this: https://pytorch.org/serve/configuration.html. But I am realizing that the Hugging Face images don't use TorchServe 🤦, so those options might not be available.

The toolkit is based on MMS (Multi Model Server), not TorchServe. To provide your own config.properties you have to derive your own container from the existing image and overwrite the default config. You can find the currently used config here: https://github.com/aws/deep-learning-containers/blob/master/huggingface/build_artifacts/inference/config.properties
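
For reference, a minimal sketch of what such a derived image could look like, assuming the image expects the MMS config at /home/model-server/config.properties (the exact destination path is an assumption; check the Dockerfile of the image tag you use to confirm where it places config.properties):

```dockerfile
# Base image: the Hugging Face inference DLC tag mentioned in the question.
FROM 763104351884.dkr.ecr.us-east-1.amazonaws.com/huggingface-pytorch-inference:1.10.2-transformers4.17.0-gpu-py38-cu113-ubuntu20.04

# Overwrite the default MMS config with a customized copy.
# Destination path is an assumption; verify it against the DLC's own Dockerfile.
COPY config.properties /home/model-server/config.properties
```

Start from a copy of the linked config.properties, adjust the MMS options you need (for example default_workers_per_model or job_queue_size), build and push the derived image to your own ECR repository, and reference that image URI when creating the SageMaker model/endpoint.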