The World's Leading Cross-Platform AI Engine for Edge Devices, with over 3.2 million
installs on Docker Hub.
- Website: https://deepstack.cc
- Documentation: https://docs.deepstack.cc
- Forum: https://forum.deepstack.cc
- Dev Center: https://dev.deepstack.cc
DeepStack is owned and maintained by DeepQuest AI.
DeepStack is an AI API engine that serves pre-built models and custom models on multiple edge devices, locally or on your private cloud. Supported platforms are:
- Linux OS via Docker (CPU and NVIDIA GPU support)
- Mac OS via Docker
- Windows 10 (native application)
- NVIDIA Jetson via Docker
DeepStack runs completely offline, independent of the cloud. You can also install and run DeepStack on any cloud VM with Docker installed to serve as your private, state-of-the-art, real-time AI server.
Visit https://docs.deepstack.cc/getting-started for installation instructions. The documentation provides example code for the following programming languages, with more to be added soon:
- Python
- C#
- NodeJS
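
Whatever the client language, each DeepStack API is a plain HTTP endpoint that accepts a posted image. A minimal Python sketch of calling the object detection endpoint (the `detect_objects` helper name and `test.jpg` path are illustrative; a DeepStack server with detection enabled must already be running on the default port 5000):

```python
import requests  # third-party HTTP client used in the DeepStack docs

DEEPSTACK_URL = "http://localhost:5000/v1/vision/detection"

def detect_objects(image_path: str, url: str = DEEPSTACK_URL) -> list:
    """POST an image to DeepStack and return its list of predictions."""
    with open(image_path, "rb") as f:
        response = requests.post(url, files={"image": f.read()})
    response.raise_for_status()
    # Each prediction is a dict with a label, a confidence score,
    # and a bounding box.
    return response.json()["predictions"]
```

Per the documentation, each returned prediction carries `label`, `confidence`, and `x_min`/`y_min`/`x_max`/`y_max` fields, e.g. `detect_objects("test.jpg")` with a running server.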
Build from Source

- Install prerequisites
- Clone the DeepStack repo:
  `git clone https://github.com/johnolafenwa/DeepStack.git`
- CD into the DeepStack repo directory:
  `cd DeepStack`
- Fetch the repo files:
  `git lfs pull`
- Build the DeepStack server:
  `cd server && go build`
- Build the DeepStack CPU version:
  `cd .. && sudo docker build -t deepquestai/deepstack:cpu . -f Dockerfile.cpu`
- Build the DeepStack GPU version:
  `sudo docker build -t deepquestai/deepstack:gpu . -f Dockerfile.gpu`
- Build the DeepStack Jetson version:
  `sudo docker build -t deepquestai/deepstack:jetpack . -f Dockerfile.gpu-jetpack`
Running and Testing Locally Without Building

- Unless you wish to install the requirements system-wide, create a virtual environment:
  `python3.7 -m venv venv`
  and activate it with:
  `source venv/bin/activate`
- Install the requirements:
  `pip3 install -r requirements.txt`
- For the CPU version, install PyTorch:
  `pip3 install torch==1.6.0+cpu torchvision==0.7.0+cpu -f https://download.pytorch.org/whl/torch_stable.html`
- For the GPU version, install PyTorch:
  `pip3 install torch==1.6.0+cu101 torchvision==0.7.0+cu101 -f https://download.pytorch.org/whl/torch_stable.html`
- Start PowerShell:
  `pwsh`
- For the CPU version, run:
  `.\setup_docker_cpu.ps1`
- For the GPU version, run:
  `.\setup_docker_gpu.ps1`
- CD to the server directory:
  `cd server`
- Build the DeepStack server:
  `go build`
- Set any of the APIs to enable:
  `$env:VISION_DETECTION = "True"`
  `$env:VISION_FACE = "True"`
  `$env:VISION_SCENE = "True"`
- Run DeepStack:
  `.\server`

You can find all logs in a directory in the repo root. Note that DeepStack runs on the default port, 5000.
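
Once the server is up, the enabled endpoints can be smoke-tested over HTTP. A minimal sketch in Python, assuming `VISION_FACE` was enabled above and that an image exists at the illustrative path `test-image.jpg` (`find_faces` is a hypothetical helper, not part of DeepStack):

```python
import requests  # third-party HTTP client

def find_faces(image_path: str, host: str = "http://localhost:5000") -> list:
    """Send an image to a running DeepStack server's face endpoint."""
    with open(image_path, "rb") as f:
        response = requests.post(
            host + "/v1/vision/face",
            files={"image": f.read()},
        )
    data = response.json()
    # "success" is False when the request could not be processed.
    if not data.get("success"):
        raise RuntimeError(data.get("error", "face request failed"))
    return data["predictions"]
```

Each prediction holds the bounding box (`x_min`, `y_min`, `x_max`, `y_max`) of one detected face; the other APIs enabled via the environment variables follow the same request pattern at their own `/v1/vision/...` paths.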
The DeepStack ecosystem includes a number of popular integrations and libraries built to extend the functionality of the AI engine to IoT, industrial, monitoring, and research applications. A number of them are listed below:
- HASS-DeepStack-Object
- HASS-DeepStack-Face
- HASS-DeepStack-Scene
- DeepStack with Blue Iris - YouTube video
- DeepStack with Blue Iris - Forum Discussion
- DeepStack on Home Assistant
- DeepStack-UI
- DeepStack-Python Helper
- DeepStack-Analytics
- DeepStackAI Trigger
(more coming soon)