DeepStack

The World's Leading Cross Platform AI Engine for Edge Devices, with over 10 million installs on Docker Hub.

Website: https://deepstack.cc

Documentation: https://docs.deepstack.cc

Forum: https://forum.deepstack.cc

Dev Center: https://dev.deepstack.cc

DeepStack is owned and maintained by DeepQuest AI.

Introduction

DeepStack is an AI API engine that serves pre-built and custom models on multiple edge devices, locally or on your private cloud. Supported platforms are:

  • Linux OS via Docker (CPU and NVIDIA GPU support)
  • macOS via Docker
  • Windows 10 (native application, CPU and GPU)
  • NVIDIA Jetson via Docker
  • Raspberry Pi & ARM64 devices via Docker

DeepStack runs completely offline, independent of the cloud. You can also install and run DeepStack on any cloud VM with Docker installed to serve as your private, state-of-the-art, real-time AI server.

Features

  • Face APIs: Face detection, recognition and matching.

  • Common Objects APIs: Object detection for 80 common objects

  • Custom Models: Train and deploy new models to detect any custom object(s)

  • Image Enhance: 4X image super-resolution

  • Scene Recognition: Image scene recognition

  • SSL Support

  • API Key support: Security options to protect your DeepStack endpoints

Installation and Usage

Visit https://docs.deepstack.cc/getting-started for installation instructions. The documentation provides example code for the following programming languages, with more to be added soon.

  • Python
  • C#
  • NodeJS
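As a taste of what such client code looks like, here is a minimal Python sketch of calling the object detection endpoint, assuming a DeepStack instance is running on localhost port 80 with the detection API enabled. The endpoint path and response shape follow the DeepStack documentation; `detect_objects` and `summarize` are illustrative helper names, not part of any official client.

```python
def detect_objects(image_path, host="http://localhost:80"):
    """POST an image to DeepStack's detection endpoint and return its predictions."""
    import requests  # third-party; install with: pip3 install requests
    with open(image_path, "rb") as f:
        response = requests.post(host + "/v1/vision/detection",
                                 files={"image": f})
    # The documented response carries a "predictions" list of detected objects.
    return response.json().get("predictions", [])

def summarize(predictions):
    """Reduce raw predictions to (label, confidence) pairs."""
    return [(p["label"], p["confidence"]) for p in predictions]
```

Per the documentation, each prediction also carries bounding-box coordinates (`x_min`, `y_min`, `x_max`, `y_max`) alongside the label and confidence.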

Build from Source (For Docker Version)

  • Install Prerequisites

  • Clone DeepStack Repo

    git clone https://github.com/johnolafenwa/DeepStack.git

  • CD to DeepStack Repo Dir

    cd DeepStack

  • Fetch Repo Files

    git lfs pull

  • Download Binary Dependencies with PowerShell

    .\download_dependencies.ps1

  • Build DeepStack CPU Version

    cd .. && sudo docker build -t deepquestai/deepstack:cpu . -f Dockerfile.cpu

  • Build DeepStack GPU Version

    sudo docker build -t deepquestai/deepstack:gpu . -f Dockerfile.gpu

  • Build DeepStack Jetson Version

    sudo docker build -t deepquestai/deepstack:jetpack . -f Dockerfile.gpu-jetpack

  • Running and Testing Locally Without Building

    • Unless you wish to install requirements system-wide, create a virtual environment with python3.7 -m venv venv and activate it with source venv/bin/activate

    • Install Requirements with pip3 install -r requirements.txt

    • For CPU Version, Install PyTorch with pip3 install torch==1.6.0+cpu torchvision==0.7.0+cpu -f https://download.pytorch.org/whl/torch_stable.html

    • For GPU Version, Install PyTorch with pip3 install torch==1.6.0+cu101 torchvision==0.7.0+cu101 -f https://download.pytorch.org/whl/torch_stable.html

    • Start Powershell pwsh

    • For CPU Version, Run .\setup_docker_cpu.ps1

    • For GPU Version, Run .\setup_docker_gpu.ps1

    • CD To Server Dir cd server

    • Build DeepStack Server go build

    • Enable any of the APIs by setting the corresponding environment variable: $env:VISION_DETECTION = "True", $env:VISION_FACE = "True", $env:VISION_SCENE = "True"

    • Run DeepStack .\server

    You can find all logs in the logs directory in the repo root. Note that DeepStack will be running on the default port 5000.
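Each flag enabled above exposes a corresponding endpoint on the running server. As a hedged sketch of that mapping (the endpoint paths follow the DeepStack documentation; `enabled_endpoints` is an illustrative helper, not part of DeepStack itself):

```python
# Map the API toggles above to the endpoints the local server exposes on port 5000.
ENDPOINTS = {
    "VISION_DETECTION": "/v1/vision/detection",
    "VISION_FACE": "/v1/vision/face",
    "VISION_SCENE": "/v1/vision/scene",
}

def enabled_endpoints(env, base="http://localhost:5000"):
    """Return the full URL for every API flag set to "True" in env."""
    return [base + path for flag, path in ENDPOINTS.items()
            if env.get(flag) == "True"]
```

For example, with only detection enabled, the server would answer POST requests at http://localhost:5000/v1/vision/detection.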

Integrations and Community

The DeepStack ecosystem includes a number of popular integrations and libraries that extend the AI engine to IoT, industrial, monitoring, and research applications. A number of them are listed below.

Contributors Guide

(coming soon)