Note

This repository is archived. This polyfill demonstrated early feasibility of the WebNN API. Now that native implementations across multiple backends closely track the latest specification, web developers are advised to use the native implementations for their development and experimentation needs.

WebNN Polyfill

A JavaScript implementation of the Web Neural Network API.

Backends

The implementation of this webnn-polyfill is based on TensorFlow.js, which supports the following three backends:

  • TensorFlow.js CPU backend
  • TensorFlow.js WebGL backend
  • TensorFlow.js WASM backend

Notes

  • The CPU backend is the only supported backend for Node.js.
  • The WASM backend does not support all ops, so some test failures are expected.

Usage

Import the packages

Via NPM

import '@webmachinelearning/webnn-polyfill';

Via a script tag

<script src="https://cdn.jsdelivr.net/npm/@webmachinelearning/webnn-polyfill/dist/webnn-polyfill.js"></script>
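
Once the polyfill is imported by either method, navigator.ml is defined and can be used in place of a native WebNN implementation. The snippet below is a minimal sketch of checking that the polyfill is ready; the console logging is illustrative only.

// Minimal sketch: after the polyfill is loaded, navigator.ml is available.
(async () => {
  const context = await navigator.ml.createContext();
  console.log('WebNN context created:', context);
})();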

Set backend

The WebNN Polyfill requires setting a TensorFlow.js backend before it can be used.

  • When running in Node.js, the CPU backend is recommended for its higher numerical precision.
    const backend = 'cpu';
    const context = await navigator.ml.createContext();
    const tf = context.tf;
    await tf.setBackend(backend);
    await tf.ready();
  • When running in browsers, the WebGL backend is recommended for better performance.
    const backend = 'webgl'; // 'cpu' or 'wasm'
    const context = await navigator.ml.createContext();
    const tf = context.tf;
    await tf.setBackend(backend);
    await tf.ready();
  • When running in browsers with the WASM backend:
    const backend = 'wasm';
    const context = await navigator.ml.createContext();
    const wasm = context.wasm;

    // 1. Force use of the Wasm SIMD binary.
    wasm.setWasmPath(`${path}/tfjs-backend-wasm-simd.wasm`);

    // 2. Use the Wasm SIMD + Threads binary if both SIMD and Threads are supported.
    // 2.1. Configure the path to the directory where the WASM binaries are located:
    //        wasm.setWasmPaths(`https://unpkg.com/@tensorflow/tfjs-backend-wasm@${tf.version_core}/dist/`);
    //      or a mapping from the names of the WASM binaries to custom full paths specifying their locations:
    //        wasm.setWasmPaths({
    //          'tfjs-backend-wasm.wasm': 'renamed.wasm',
    //          'tfjs-backend-wasm-simd.wasm': 'renamed-simd.wasm',
    //          'tfjs-backend-wasm-threaded-simd.wasm': 'renamed-threaded-simd.wasm'
    //        });
    wasm.setWasmPaths(prefixOrFileMap);
    // 2.2. Configure the number of threads manually; by default the number of
    //      logical CPU cores is used as the thread count.
    wasm.setThreadsCount(n); // n can be 1, 2, ...

    const tf = context.tf;
    await tf.setBackend(backend);
    await tf.ready();

Please refer to the setPolyfillBackend() usage in tests for concrete examples on how to best implement backend switching for your project.
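
As an illustration, a backend-switching helper along those lines might look like the sketch below; the function name, default backend, and fallback behavior are assumptions, and the actual setPolyfillBackend() in the test suite may differ.

// Hypothetical helper sketch; the real setPolyfillBackend() in the test
// suite may differ in naming and behavior.
async function setPolyfillBackend(backend = 'cpu') {
  const context = await navigator.ml.createContext();
  const tf = context.tf;
  if (!tf) return;                     // native WebNN implementation, nothing to configure
  if (await tf.setBackend(backend)) {  // tf.setBackend() resolves to a boolean
    await tf.ready();
  } else {
    console.warn(`Failed to set TensorFlow.js backend '${backend}'.`);
  }
}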

Samples

The Web Machine Learning Community Group provides various Samples (GitHub repo) that make use of the WebNN API. These samples fall back to the webnn-polyfill if the browser does not provide a native implementation of the WebNN API.
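
A typical fallback pattern, sketched below under the assumption that the presence of navigator.ml is used to detect a native implementation, loads the polyfill only when no native WebNN API is available; the exact detection logic used by the samples may differ.

// Sketch: load the polyfill only if no native WebNN implementation exists.
// The feature check and dynamic import path are illustrative assumptions.
if (!('ml' in navigator)) {
  await import('@webmachinelearning/webnn-polyfill');
}
const context = await navigator.ml.createContext();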

Build and Test

Setup

> git clone --recurse-submodules https://github.com/webmachinelearning/webnn-polyfill
> cd webnn-polyfill && npm install

Build

Development build

> npm run build

Production build

> npm run build-production

Test

Run tests in Node.js.

> npm test

Run tests in a web browser.

> npm start

Open a web browser and navigate to http://localhost:8080/test.

The default backend is the CPU backend. To use the WebGL backend instead, navigate to http://localhost:8080/test?backend=webgl; to use the Wasm backend, navigate to http://localhost:8080/test?backend=wasm.

Run only CTS tests in Node.js.

> npm run test-cts

Run only CTS tests in a web browser.

> npm start

Open a web browser and navigate to http://localhost:8080/test/cts.html.

Other scripts

Build docs

> npm run build-docs

Lint

> npm run lint

Format

> npm run format

Start dev server

> npm run dev

Watch files

> npm run watch

License

This project is licensed under the Apache License Version 2.0.