
onnx-web-presentation

Click the link to view the presentation material.


Quick Start - Create an ONNX model using ML.NET

Open the notebooks folder in Visual Studio Code.
Install the .NET Interactive Notebooks extension.
Open ExportToOnnx.ipynb - this is a C# notebook that trains an ML.NET model and converts it to ONNX.
Click the "Run All" button to see the results. It should generate a model.onnx file.


Quick Start - Node.js Binding

This example demonstrates basic usage of the ONNX Runtime Node.js binding.

This example contains a package.json file that lists "onnxruntime-node" as a dependency. In your own project, run npm install onnxruntime-node to install the ONNX Runtime Node.js binding.
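For reference, a minimal package.json of that shape would look roughly like the following (the package name and version range here are illustrative, not copied from this repository):

{
  "name": "onnx-node-example",
  "version": "1.0.0",
  "dependencies": {
    "onnxruntime-node": "^1.14.0"
  }
}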

In this example, we load onnxruntime, load an image, create an inference session with the model generated from the .NET notebook, feed the input, get the output, and print the result to the console. All functions are called in their basic form.
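The core API calls look roughly like the sketch below. The input name, tensor shape, and dummy data are placeholders (image loading and preprocessing are omitted); check the model generated by the notebook for the real names and dimensions.

// Minimal sketch of the ONNX Runtime Node.js binding (illustrative only).
const ort = require('onnxruntime-node');

async function main() {
  // Create an inference session from the model exported by the notebook.
  const session = await ort.InferenceSession.create('./model.onnx');

  // Build an input tensor. The data, shape, and input name below are
  // placeholders; a real run would decode and preprocess the image here.
  const data = Float32Array.from([0.1, 0.2, 0.3, 0.4]);
  const feeds = { input: new ort.Tensor('float32', data, [1, 4]) };

  // Run inference and print the first output to the console.
  const results = await session.run(feeds);
  const output = results[session.outputNames[0]];
  console.log(output.data);
}

main().catch(console.error);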

Usage

cd node
npm install
node .

Quick Start - Web Browser Binding

This example demonstrates basic usage of the ONNX Runtime Web binding.

This example contains a package.json file that uses live-server as a dependency. The index.html links to the ONNX Runtime Web (ort) library at https://cdn.jsdelivr.net/npm/onnxruntime-web/dist/ort.min.js

In this example, we load onnxruntime in the browser, load the model locally, create an inference session that uses the FER+ Emotion Recognition ONNX model, load an image, convert it to the expected input format, run inference, and display the result.
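The in-page flow looks roughly like the sketch below. The model file name, input name, and 64x64 grayscale input shape are assumptions based on the public FER+ model, not copied from this example; adjust them to match the model actually served.

// index.html loads the library from the CDN:
// <script src="https://cdn.jsdelivr.net/npm/onnxruntime-web/dist/ort.min.js"></script>
// which exposes a global `ort` object to page scripts.

async function classifyEmotion(imageElement) {
  // Create a session from the locally served model file (file name is an assumption).
  const session = await ort.InferenceSession.create('./emotion-ferplus-8.onnx');

  // Draw the image onto a 64x64 canvas and convert it to grayscale floats,
  // matching the FER+ model's expected 1x1x64x64 input (assumed shape).
  const canvas = document.createElement('canvas');
  canvas.width = 64;
  canvas.height = 64;
  const ctx = canvas.getContext('2d');
  ctx.drawImage(imageElement, 0, 0, 64, 64);
  const { data } = ctx.getImageData(0, 0, 64, 64);
  const gray = new Float32Array(64 * 64);
  for (let i = 0; i < 64 * 64; i++) {
    // Average the RGB channels of each pixel.
    gray[i] = (data[i * 4] + data[i * 4 + 1] + data[i * 4 + 2]) / 3;
  }

  // Feed the tensor under the model's input name ("Input3" is an assumption).
  const input = new ort.Tensor('float32', gray, [1, 1, 64, 64]);
  const results = await session.run({ Input3: input });

  // The first output holds the emotion scores.
  const output = results[session.outputNames[0]];
  console.log(output.data);
}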

Usage

cd browser
npm install
npx light-server -s . -p 8081

go to http://localhost:8081/

load an image of a face showing an emotion

the result will be displayed