This README highlights the dependencies and flow for building a DeepSparse image for the DigitalOcean Marketplace.
The repo contains the following files:

- `Intro.md`: the DigitalOcean README currently found on NM's Marketplace profile.
- `template.json`: config file for configuring the Droplet used to build the image, the Ubuntu installs/dependencies, and the scripts to be pushed into the image.
- `deepsparse.sh`: the script for installing DeepSparse.
- `99-one-click`: the script for populating text when the Droplet boots up.
- `90-cleanup.sh` | `99-img-check.sh`: DigitalOcean scripts used for image checks and compliance.
To install make and Packer on Ubuntu, you can follow the steps below:
Installing make:
- Open a terminal on your Ubuntu system.
- Run the following commands to install `make`:

```bash
sudo apt update
sudo apt install make
```
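To confirm `make` installed correctly, checking the version is a quick sanity check (not part of the original steps):

```bash
# Prints the installed GNU Make version if the install succeeded
make --version
```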
Installing Packer:
- Open a terminal on your Ubuntu system.
- Download the latest version of Packer for Linux by executing the following command:

```bash
wget https://releases.hashicorp.com/packer/<VERSION>/packer_<VERSION>_linux_amd64.zip
```

Replace `<VERSION>` with the specific version number you want to install. You can check for the latest version by visiting the Packer releases page: https://releases.hashicorp.com/packer/

- Install the unzip utility if it's not already installed:

```bash
sudo apt install unzip
```

- Extract the Packer binary from the downloaded ZIP file:

```bash
unzip packer_<VERSION>_linux_amd64.zip
```

- Move the extracted binary to the `/usr/local/bin/` directory:

```bash
sudo mv packer /usr/local/bin/
```

- Verify that Packer is installed correctly by running the following command:

```bash
packer version
```
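If you prefer to run the Packer installation as a single step, the commands above can be collected into one script. This is just a sketch, with `<VERSION>` left as a placeholder for the release you pick from the page above:

```bash
#!/usr/bin/env bash
set -euo pipefail

# Placeholder: set to the Packer release you want from
# https://releases.hashicorp.com/packer/
PACKER_VERSION="<VERSION>"

sudo apt update
sudo apt install -y wget unzip

# Download, extract, and install the Packer binary
wget "https://releases.hashicorp.com/packer/${PACKER_VERSION}/packer_${PACKER_VERSION}_linux_amd64.zip"
unzip "packer_${PACKER_VERSION}_linux_amd64.zip"
sudo mv packer /usr/local/bin/

# Confirm the install
packer version
```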
To create a DigitalOcean personal access token and set it to the DIGITALOCEAN_API_TOKEN environment variable, you can follow these steps:
- Log in to your DigitalOcean account at https://cloud.digitalocean.com/login.
- After logging in, click on your account avatar in the top right corner of the control panel and select "API" from the dropdown menu.
- In the API section, click on the "Generate New Token" button.
- Enter a name for your token in the "Token Name" field. You can choose any name that helps you identify the purpose of the token.
- Choose the appropriate permissions based on the tasks you plan to perform. For working with Packer, you will need the "Write" and "Read" permissions for Droplets, Images, and Snapshots.
- Once you have selected the desired permissions, click on the "Generate Token" button at the bottom of the page.
- DigitalOcean will generate a new personal access token for you. Make sure to copy the token as it will not be displayed again for security reasons.
- Open a terminal or command prompt on your local machine.
- Set the `DIGITALOCEAN_API_TOKEN` environment variable by running the following command:

```bash
export DIGITALOCEAN_API_TOKEN=your-access-token
```

- The `DIGITALOCEAN_API_TOKEN` environment variable is now set, and you can use it in your Packer configuration or any other scripts that interact with the DigitalOcean API.
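If you want to confirm the token is valid before running Packer, one option is a quick call to the DigitalOcean API's `/v2/account` endpoint; a minimal sketch using curl:

```bash
# Returns your account details as JSON when the token is valid;
# a 401 response means the token is not being accepted.
curl -s -H "Authorization: Bearer ${DIGITALOCEAN_API_TOKEN}" \
  "https://api.digitalocean.com/v2/account"
```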
This flow installs Doctl for WSL Ubuntu. For Ubuntu not installed on WSL, use this flow.
Afterwards, connect with DigitalOcean by passing in your Personal Access Token:
```bash
doctl auth init --access-token YOUR_API_TOKEN
```
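To confirm doctl is authenticated, `doctl account get` (a standard doctl command) prints your account details; shown here as an optional check:

```bash
# Prints account email, Droplet limit, and status if authentication succeeded
doctl account get
```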
Packer is a tool for creating images from a single source configuration. Using this Packer template reduces the entire process of creating, configuring, validating, and snapshotting a build Droplet to a single command.

Install the repo:

```bash
git clone https://github.com/neuralmagic/deepsparse-digitalocean-image.git
cd deepsparse-digitalocean-image
```

Initialize DigitalOcean as a builder:

```bash
packer init ./config.pkr.hcl
```

Now build the image:

```bash
packer build template.json
```

This command will build an image with DeepSparse, run a few health checks for Marketplace integration (found in the scripts directory), and save the image as a snapshot on your DO account.
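If you want to catch template mistakes before a build Droplet is launched, Packer's standard `validate` subcommand can be run against the template first; an optional sketch:

```bash
# Optional: syntax/config check of the template before building
packer validate template.json
```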
Before starting, make sure you have an SSH key added to your DO account.
TIP: to find a list of SSH key fingerprints, run:

```bash
doctl compute ssh-key list
```

Run the following command to get a list of snapshots in order to obtain the ID of the newly built image:
```bash
doctl compute snapshot list
```

Finally, pass the snapshot ID of the DeepSparse image and the SSH fingerprint into the following command to create a Droplet using a compute-optimized instance:
(Note: the region where the snapshot is saved needs to be the same as the region where the Droplet is created, which in this example is nyc3.)
```bash
doctl compute droplet create deepsparse-droplet --image <SNAPSHOT-ID> --region nyc3 --size c-4-intel --ssh-keys <FINGERPRINT>
```

After the Droplet has been provisioned, SSH into it.
TIP: To find the IP address of the Droplet, run the following command:

```bash
doctl compute droplet list
```
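If you only need the address itself, doctl's output-formatting flags can trim the listing; a sketch assuming the Droplet name used above (`deepsparse-droplet`):

```bash
# Print just the name and public IPv4 of the Droplet created above
doctl compute droplet list --format Name,PublicIPv4 --no-header | grep deepsparse-droplet
```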
Now, pass the IP address into the following command:

```bash
ssh root@<IP-ADDRESS>
```

NLP Benchmark example:
```bash
deepsparse.benchmark zoo:nlp/question_answering/bert-base/pytorch/huggingface/squad/pruned95_obs_quant-none -i [64,128] -b 64 -nstreams 1 -s sync
```

CV Server example:
```bash
deepsparse.server \
  task image_classification \
  --model_path "zoo:cv/classification/resnet_v1-50/pytorch/sparseml/imagenet/pruned95-none"
```

After the server is up and running, pass the Droplet's IP address and the default port number (5543) into the URL below to check out the Swagger UI:
http://<IP-ADDRESS>:5543/docs
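You can also check the same endpoint from a terminal; if the server is reachable, a request to `/docs` should return the Swagger UI page:

```bash
# Expect an HTTP 200 and the Swagger UI HTML if the server is up
curl -i http://<IP-ADDRESS>:5543/docs
```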
NLP Inline Python example:
Open Python in the shell:

```bash
python3
```

Paste the following code snippet:
```python
from deepsparse import Pipeline

qa_pipeline = Pipeline.create(task="question-answering")
inference = qa_pipeline(question="What's my name?", context="My name is Snorlax")
print(inference)
```

CV Inline Python example:
Get an example image:
```bash
wget -O basilica.jpg https://raw.githubusercontent.com/neuralmagic/deepsparse/main/src/deepsparse/yolo/sample_images/basilica.jpg
```

Run inference:
```python
from deepsparse import Pipeline

model_path = "zoo:cv/detection/yolov8-s/pytorch/ultralytics/coco/pruned50_quant-none"
images = ["basilica.jpg"]

yolo_pipeline = Pipeline.create(
    task="yolov8",
    model_path=model_path,
)
pipeline_outputs = yolo_pipeline(images=images)
print(pipeline_outputs)
```