Intel® AI Reference Models

This repository contains links to pre-trained models, sample scripts, best practices, and step-by-step tutorials for many popular open-source machine learning models optimized by Intel to run on Intel® Xeon® Scalable processors and Intel® Data Center GPUs.

Containers for running the workloads can be found at the Intel® Developer Catalog.

Intel® AI Reference Models in a Jupyter Notebook is also available for the listed workloads.

Purpose of Intel® AI Reference Models

Intel optimizes popular deep learning frameworks such as TensorFlow* and PyTorch* by contributing to the upstream projects. Additional optimizations are built into plugins/extensions such as the Intel® Extension for PyTorch* and the Intel® Extension for TensorFlow*. Popular neural network models running against common datasets are the target workloads that drive these optimizations.

The purpose of the Intel® AI Reference Models repository (and associated containers) is to quickly replicate the complete software environment that demonstrates the best-known performance of each of these target model/dataset combinations. When executed in optimally-configured hardware environments, these software environments showcase the AI capabilities of Intel platforms.

DISCLAIMER: These scripts are not intended for benchmarking Intel platforms. For any performance and/or benchmarking information on specific Intel platforms, visit https://www.intel.ai/blog.

Intel is committed to the respect of human rights and avoiding complicity in human rights abuses, a policy reflected in the Intel Global Human Rights Principles. Accordingly, by accessing the Intel material on this platform you agree that you will not use the material in a product or application that causes or contributes to a violation of an internationally recognized human right.

License

The Intel® AI Reference Models repository is licensed under the Apache License, Version 2.0.

Datasets

To the extent that any public datasets are referenced by Intel or accessed using tools or code on this site, those datasets are provided by the third party indicated as the data source. Intel does not create the data or datasets, and does not warrant their accuracy or quality. By accessing the public dataset(s) you agree to the terms associated with those datasets and that your use complies with the applicable license.

Please check the list of datasets used in Intel® AI Reference Models in the datasets directory.

Intel expressly disclaims the accuracy, adequacy, or completeness of any public datasets, and is not liable for any errors, omissions, or defects in the data, or for any reliance on the data. Intel is not liable for any liability or damages relating to your use of public datasets.

Use cases

The model documentation in the tables below has information on the prerequisites to run each model. The model scripts run on Linux. Certain models can also run on bare metal on Windows. For more information and a list of models that are supported on Windows, see the documentation here.

Instructions are also available for running on Sapphire Rapids.

For best performance on Intel® Data Center GPU Flex and Max Series, please check the list of supported workloads. It provides instructions to run inference and training using Intel® Extension for PyTorch* or Intel® Extension for TensorFlow*.

Image Recognition

| Model | Framework | Mode | Model Documentation | Benchmark/Test Dataset |
| --- | --- | --- | --- | --- |
| DenseNet169 | TensorFlow | Inference | FP32 | ImageNet 2012 |
| Inception V3 | TensorFlow | Inference | Int8 FP32 | ImageNet 2012 |
| Inception V4 | TensorFlow | Inference | Int8 FP32 | ImageNet 2012 |
| MobileNet V1* | TensorFlow | Inference | Int8 FP32 BFloat16 | ImageNet 2012 |
| MobileNet V1* Sapphire Rapids | TensorFlow | Inference | Int8 FP32 BFloat16 BFloat32 | ImageNet 2012 |
| MobileNet V2 | TensorFlow | Inference | FP32 BFloat16 Int8 | ImageNet 2012 |
| ResNet 101 | TensorFlow | Inference | Int8 FP32 | ImageNet 2012 |
| ResNet 50 | TensorFlow | Inference | Int8 FP32 | ImageNet 2012 |
| ResNet 50v1.5 | TensorFlow | Inference | Int8 FP32 BFloat16 FP16 | ImageNet 2012 |
| ResNet 50v1.5 Sapphire Rapids | TensorFlow | Inference | Int8 FP32 BFloat16 BFloat32 | ImageNet 2012 |
| ResNet 50v1.5 | TensorFlow | Training | FP32 BFloat16 FP16 | ImageNet 2012 |
| ResNet 50v1.5 Sapphire Rapids | TensorFlow | Training | FP32 BFloat16 BFloat32 | ImageNet 2012 |
| Inception V3 | TensorFlow Serving | Inference | FP32 | Synthetic Data |
| ResNet 50v1.5 | TensorFlow Serving | Inference | FP32 | Synthetic Data |
| GoogLeNet | PyTorch | Inference | FP32 BFloat16 | ImageNet 2012 |
| Inception v3 | PyTorch | Inference | FP32 BFloat16 | ImageNet 2012 |
| MNASNet 0.5 | PyTorch | Inference | FP32 BFloat16 | ImageNet 2012 |
| MNASNet 1.0 | PyTorch | Inference | FP32 BFloat16 | ImageNet 2012 |
| ResNet 50 | PyTorch | Inference | FP32 BFloat16 BFloat32 | ImageNet 2012 |
| ResNet 50 | PyTorch | Training | FP32 BFloat16 BFloat32 | ImageNet 2012 |
| ResNet 101 | PyTorch | Inference | FP32 BFloat16 | ImageNet 2012 |
| ResNet 152 | PyTorch | Inference | FP32 BFloat16 | ImageNet 2012 |
| ResNext 32x4d | PyTorch | Inference | FP32 BFloat16 | ImageNet 2012 |
| ResNext 32x16d | PyTorch | Inference | FP32 BFloat16 BFloat32 | ImageNet 2012 |
| VGG-11 | PyTorch | Inference | FP32 BFloat16 | ImageNet 2012 |
| VGG-11 with batch normalization | PyTorch | Inference | FP32 BFloat16 | ImageNet 2012 |
| Wide ResNet-50-2 | PyTorch | Inference | FP32 BFloat16 | ImageNet 2012 |
| Wide ResNet-101-2 | PyTorch | Inference | FP32 BFloat16 | ImageNet 2012 |

Image Segmentation

| Model | Framework | Mode | Model Documentation | Benchmark/Test Dataset |
| --- | --- | --- | --- | --- |
| 3D U-Net MLPerf* | TensorFlow | Inference | FP32 BFloat16 Int8 | BRATS 2019 |
| 3D U-Net MLPerf* Sapphire Rapids | TensorFlow | Inference | FP32 BFloat16 Int8 BFloat32 | BRATS 2019 |
| MaskRCNN | TensorFlow | Inference | FP32 | MS COCO 2014 |
| UNet | TensorFlow | Inference | FP32 | |

Language Modeling

| Model | Framework | Mode | Model Documentation | Benchmark/Test Dataset |
| --- | --- | --- | --- | --- |
| BERT large | TensorFlow | Inference | FP32 BFloat16 FP16 | SQuAD |
| BERT large | TensorFlow | Training | FP32 BFloat16 FP16 | SQuAD and MRPC |
| BERT large Sapphire Rapids | TensorFlow | Inference | FP32 BFloat16 Int8 BFloat32 | SQuAD |
| BERT large Sapphire Rapids | TensorFlow | Training | FP32 BFloat16 BFloat32 | SQuAD |
| DistilBERT base | TensorFlow | Inference | FP32 BFloat16 Int8 FP16 | SST-2 |
| BERT base | PyTorch | Inference | FP32 BFloat16 | BERT Base SQuAD1.1 |
| BERT large | PyTorch | Inference | FP32 Int8 BFloat16 BFloat32 | BERT Large SQuAD1.1 |
| BERT large | PyTorch | Training | FP32 BFloat16 BFloat32 | preprocessed text dataset |
| DistilBERT base | PyTorch | Inference | FP32 Int8 BFloat16 BFloat32 | DistilBERT Base SQuAD1.1 |
| RNN-T | PyTorch | Inference | FP32 BFloat16 BFloat32 | RNN-T dataset |
| RNN-T | PyTorch | Training | FP32 BFloat16 BFloat32 | RNN-T dataset |
| RoBERTa base | PyTorch | Inference | FP32 BFloat16 | RoBERTa Base SQuAD 2.0 |
| T5 | PyTorch | Inference | FP32 Int8 | |

Language Translation

| Model | Framework | Mode | Model Documentation | Benchmark/Test Dataset |
| --- | --- | --- | --- | --- |
| BERT | TensorFlow | Inference | FP32 | MRPC |
| GNMT* | TensorFlow | Inference | FP32 | MLPerf GNMT model benchmarking dataset |
| Transformer_LT_mlperf* | TensorFlow | Inference | FP32 BFloat16 Int8 | WMT English-German dataset |
| Transformer_LT_mlperf* Sapphire Rapids | TensorFlow | Inference | FP32 BFloat16 Int8 BFloat32 | WMT English-German dataset |
| Transformer_LT_mlperf* | TensorFlow | Training | FP32 BFloat16 | WMT English-German dataset |
| Transformer_LT_mlperf* Sapphire Rapids | TensorFlow | Training | FP32 BFloat16 BFloat32 | WMT English-German dataset |
| Transformer_LT_Official | TensorFlow | Inference | FP32 | WMT English-German dataset |
| Transformer_LT_Official | TensorFlow Serving | Inference | FP32 | |

Object Detection

| Model | Framework | Mode | Model Documentation | Benchmark/Test Dataset |
| --- | --- | --- | --- | --- |
| R-FCN | TensorFlow | Inference | Int8 FP32 | COCO 2017 validation dataset |
| SSD-MobileNet* | TensorFlow | Inference | Int8 FP32 BFloat16 | COCO 2017 validation dataset |
| SSD-MobileNet* Sapphire Rapids | TensorFlow | Inference | Int8 FP32 BFloat16 BFloat32 | COCO 2017 validation dataset |
| SSD-ResNet34* | TensorFlow | Inference | Int8 FP32 BFloat16 | COCO 2017 validation dataset |
| SSD-ResNet34* Sapphire Rapids | TensorFlow | Inference | Int8 FP32 BFloat16 BFloat32 | COCO 2017 validation dataset |
| SSD-ResNet34 | TensorFlow | Training | FP32 BFloat16 | COCO 2017 training dataset |
| SSD-ResNet34 Sapphire Rapids | TensorFlow | Training | FP32 BFloat16 BFloat32 | COCO 2017 training dataset |
| SSD-MobileNet | TensorFlow Serving | Inference | FP32 | |
| Faster R-CNN ResNet50 FPN | PyTorch | Inference | FP32 BFloat16 | COCO 2017 |
| Mask R-CNN | PyTorch | Inference | FP32 BFloat16 BFloat32 | COCO 2017 |
| Mask R-CNN | PyTorch | Training | FP32 BFloat16 BFloat32 | COCO 2017 |
| Mask R-CNN ResNet50 FPN | PyTorch | Inference | FP32 BFloat16 | COCO 2017 |
| RetinaNet ResNet-50 FPN | PyTorch | Inference | FP32 BFloat16 | COCO 2017 |
| SSD-ResNet34 | PyTorch | Inference | FP32 Int8 BFloat16 BFloat32 | COCO 2017 |
| SSD-ResNet34 | PyTorch | Training | FP32 BFloat16 BFloat32 | COCO 2017 |

Recommendation

| Model | Framework | Mode | Model Documentation | Benchmark/Test Dataset |
| --- | --- | --- | --- | --- |
| DIEN | TensorFlow | Inference | FP32 BFloat16 | DIEN dataset |
| DIEN Sapphire Rapids | TensorFlow | Inference | FP32 BFloat16 BFloat32 | DIEN dataset |
| DIEN | TensorFlow | Training | FP32 | DIEN dataset |
| DIEN Sapphire Rapids | TensorFlow | Training | FP32 BFloat16 BFloat32 | DIEN dataset |
| NCF | TensorFlow | Inference | FP32 | MovieLens 1M |
| Wide & Deep | TensorFlow | Inference | FP32 | Census Income dataset |
| Wide & Deep Large Dataset | TensorFlow | Inference | Int8 FP32 | Large Kaggle Display Advertising Challenge dataset |
| Wide & Deep Large Dataset | TensorFlow | Training | FP32 | Large Kaggle Display Advertising Challenge dataset |
| DLRM | PyTorch | Inference | FP32 Int8 BFloat16 BFloat32 | Criteo Terabyte |
| DLRM | PyTorch | Training | FP32 BFloat16 BFloat32 | Criteo Terabyte |
| DLRM v2 | PyTorch | Inference | FP32 FP16 BFloat16 BFloat32 Int8 | Criteo 1TB Click Logs dataset |
| DLRM v2 | PyTorch | Training | FP32 FP16 BFloat16 BFloat32 | Random dataset |
| MEMREC-DLRM | PyTorch | Inference | FP32 | Criteo Terabyte |

Text-to-Speech

| Model | Framework | Mode | Model Documentation | Benchmark/Test Dataset |
| --- | --- | --- | --- | --- |
| WaveNet | TensorFlow | Inference | FP32 | |

Shot Boundary Detection

| Model | Framework | Mode | Model Documentation | Benchmark/Test Dataset |
| --- | --- | --- | --- | --- |
| TransNetV2 | PyTorch | Inference | FP32 BFloat16 | Synthetic Data |

AI Drug Design (AIDD)

| Model | Framework | Mode | Model Documentation | Benchmark/Test Dataset |
| --- | --- | --- | --- | --- |
| AlphaFold2 | PyTorch | Inference | FP32 | AF2Dataset |

\* Indicates that the model belongs to the MLPerf suite and will be supported long-term.

Intel® Data Center GPU Workloads

| Model | Framework | Mode | GPU Type | Model Documentation |
| --- | --- | --- | --- | --- |
| ResNet 50v1.5 | TensorFlow | Inference | Flex Series | Int8 |
| ResNet 50 v1.5 | PyTorch | Inference | Flex Series | Int8 |
| SSD-MobileNet* | TensorFlow | Inference | Flex Series | Int8 |
| SSD-MobileNet | PyTorch | Inference | Flex Series | Int8 |
| Yolo V4 | PyTorch | Inference | Flex Series | Int8 |
| EfficientNet | TensorFlow | Inference | Flex Series | FP16 |
| MaskRCNN | TensorFlow | Inference | Flex Series | FP16 |
| Stable Diffusion | TensorFlow | Inference | Flex Series | FP16 FP32 |
| Stable Diffusion | PyTorch | Inference | Flex Series | FP16 FP32 |
| Yolo V5 | PyTorch | Inference | Flex Series | FP16 |
| ResNet 50v1.5 | TensorFlow | Inference | Max Series | Int8 FP32 FP16 |
| ResNet 50 v1.5 | TensorFlow | Training | Max Series | BFloat16 |
| ResNet 50 v1.5 | PyTorch | Inference | Max Series | Int8 |
| ResNet 50 v1.5 | PyTorch | Training | Max Series | BFloat16 |
| BERT large | PyTorch | Inference | Max Series | FP16 |
| BERT large | PyTorch | Training | Max Series | BFloat16 |
| BERT large | TensorFlow | Inference | Max Series | FP32 FP16 |
| BERT large | TensorFlow | Training | Max Series | BFloat16 |
| DLRM | TensorFlow | Inference | Max Series | FP16 |
| DLRM | TensorFlow | Training | Max Series | BFloat16 |

How to Contribute

If you would like to add a new benchmarking script, please use this guide.