
Multithreaded inference with TensorFlow Estimators. This is a ServiceNow Research project that was started at Element AI.


ServiceNow completed its acquisition of Element AI on January 8, 2021. All references to Element AI in the materials that are part of this project should refer to ServiceNow.

Multithreaded-estimators

Code demonstrating how to use multithreading to speed up inference for TensorFlow Estimators. A sketch of the general idea is shown below.
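The sketch below illustrates the general queue-fed pattern for multithreaded Estimator inference, not necessarily the exact implementation in this repository. It assumes the TF 1.x Estimator API, and all names (model_fn, input_queue, client, etc.) are illustrative. A single estimator.predict() call is kept alive by feeding it from a queue, so several client threads can submit examples without paying the graph-reload cost on every request.

```python
# Minimal sketch: keep one estimator.predict() generator alive and feed it
# from a queue so multiple threads can submit inference requests concurrently.
# Assumes the TF 1.x Estimator API; all names here are illustrative.
import queue
import threading

import numpy as np
import tensorflow as tf

input_queue = queue.Queue()
output_queue = queue.Queue()
STOP = object()  # sentinel that ends the prediction loop


def model_fn(features, labels, mode):
    # Toy model: predict the sum of the input features.
    predictions = tf.reduce_sum(features["x"], axis=-1)
    return tf.estimator.EstimatorSpec(mode, predictions=predictions)


def request_generator():
    # Blocking on the queue keeps the predict() generator (and its session) open.
    while True:
        features = input_queue.get()
        if features is STOP:
            return
        yield features


def input_fn():
    dataset = tf.data.Dataset.from_generator(
        request_generator,
        output_types={"x": tf.float32},
        output_shapes={"x": tf.TensorShape([2])},
    )
    return dataset.batch(1)


def prediction_loop(estimator):
    # Runs in its own thread; drains predictions as they become available.
    for prediction in estimator.predict(input_fn):
        output_queue.put(prediction)


def client(num_requests):
    # Any number of threads can enqueue requests concurrently.
    for _ in range(num_requests):
        input_queue.put({"x": np.random.rand(2).astype(np.float32)})


if __name__ == "__main__":
    estimator = tf.estimator.Estimator(model_fn=model_fn)
    consumer = threading.Thread(target=prediction_loop, args=(estimator,))
    consumer.start()

    clients = [threading.Thread(target=client, args=(5,)) for _ in range(4)]
    for t in clients:
        t.start()
    for t in clients:
        t.join()

    for _ in range(4 * 5):
        print(output_queue.get())

    input_queue.put(STOP)  # shut down the prediction loop
    consumer.join()
```

The key design point is that estimator.predict() is entered exactly once; the generator-backed input_fn blocks on the queue between requests instead of exhausting, which avoids rebuilding the graph and restarting the session for every prediction.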

Installation

A Dockerfile is provided. First build the image from the root directory:

docker build . -t threaded

Then run the tests:

docker run threaded

License

This code is released under the Apache 2.0 license. See the license in full.