Flashlight is a fast, flexible machine learning library written entirely in C++ from the Facebook AI Research Speech team and the creators of Torch and Deep Speech. Its core features include:
- Just-in-time kernel compilation with modern C++, via the ArrayFire tensor library.
- CUDA and CPU backends for GPU and CPU training.
- An emphasis on efficiency and scale.
Native support in C++ and simple extensibility makes Flashlight a powerful research framework that's hackable to its core and enables fast iteration on new experimental setups and algorithms without sacrificing performance. In a single repository, Flashlight provides apps for research across multiple domains:
- Automatic speech recognition (the wav2letter project) — Documentation | Tutorial
- Image classification
- Language modeling
Flashlight is broken down into a few parts:
- `flashlight/lib` contains kernels and standalone utilities for sequence losses, beam search decoding, text processing, and more.
- `flashlight/fl` is the core neural network library using the ArrayFire tensor library.
- `flashlight/app` contains applications of the core library to machine learning across domains.
- `flashlight/ext` contains extensions on top of Flashlight and ArrayFire that are useful across apps.
First, build and install Flashlight and link it to your own project.
`Sequential` forms a sequence of Flashlight `Module`s for chaining computation.
Implementing a simple convnet is easy.
#include <flashlight/fl/flashlight.h>
Sequential model;
model.add(View(af::dim4(IM_DIM, IM_DIM, 1, -1)));
model.add(Conv2D(
1 /* input channels */,
32 /* output channels */,
5 /* kernel width */,
5 /* kernel height */,
1 /* stride x */,
1 /* stride y */,
PaddingMode::SAME /* padding mode */,
PaddingMode::SAME /* padding mode */));
model.add(ReLU());
model.add(Pool2D(
2 /* kernel width */,
2 /* kernel height */,
2 /* stride x */,
2 /* stride y */));
model.add(Conv2D(32, 64, 5, 5, 1, 1, PaddingMode::SAME, PaddingMode::SAME));
model.add(ReLU());
model.add(Pool2D(2, 2, 2, 2));
model.add(View(af::dim4(7 * 7 * 64, -1)));
model.add(Linear(7 * 7 * 64, 1024));
model.add(ReLU());
model.add(Dropout(0.5));
model.add(Linear(1024, 10));
model.add(LogSoftmax());
Performing forward and backward computation is straightforward:
auto output = model.forward(input);
auto loss = categoricalCrossEntropy(output, target);
loss.backward();
See the MNIST example for a full tutorial including a training loop and dataset abstractions.
`Variable` is the base Flashlight tensor type and operates on ArrayFire `array`s. Tape-based automatic differentiation in Flashlight is simple and works as you'd expect.
Autograd Example
auto A = Variable(af::randu(1000, 1000), true /* calcGrad */);
auto B = 2.0 * A;
auto C = 1.0 + B;
auto D = log(C);
D.backward(); // populates A.grad() along with gradients for B, C, and D.
Install with vcpkg | With Docker | From Source | From Source with vcpkg | Build Your Project with Flashlight
At minimum, compilation requires:
- A C++ compiler with good C++14 support (e.g. gcc/g++ >= 5)
- CMake version 3.10 or later, and `make`
- A Linux-based operating system.
See the full dependency list for more details if building from source.
Instructions for building/installing Python bindings can be found here.
Flashlight can be broken down into several components as described above. Each component can be incrementally built by specifying the correct build options.
There are two ways to work with Flashlight:
- As an installed library that you link to with your own project. This is best for building standalone applications dependent on Flashlight.
- With in-source development where the Flashlight project source is changed and rebuilt. This is best if customizing/hacking the core framework or the Flashlight-provided app binaries.
Flashlight can be built in one of two ways:
- With `vcpkg`, a C++ package manager.
- From source, by installing dependencies as needed.
Flashlight is most easily built and installed with `vcpkg`. Both the CUDA and CPU backends are supported. For either backend, first install Intel MKL. For the CUDA backend, also install CUDA >= 9.2, cuDNN, and NCCL. Then, after installing `vcpkg`, install the libraries and core with:
./vcpkg install flashlight-cuda # CUDA backend, OR
./vcpkg install flashlight-cpu # CPU backend
To install Flashlight apps, check the features available for installation by running `./vcpkg search flashlight-cuda` or `./vcpkg search flashlight-cpu`. Each app is a "feature": for example, `./vcpkg install flashlight-cuda[asr]` installs the ASR app with the CUDA backend.
Below is the currently-supported list of features (for each of `flashlight-cuda` and `flashlight-cpu`):
flashlight-{cuda/cpu}[lib] # Flashlight libraries
flashlight-{cuda/cpu}[nn] # Flashlight neural net library
flashlight-{cuda/cpu}[asr] # Flashlight speech recognition app
flashlight-{cuda/cpu}[lm] # Flashlight language modeling app
flashlight-{cuda/cpu}[imgclass] # Flashlight image classification app
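Multiple features can also be combined in a single install command; for example (assuming both features are available for your backend, per `./vcpkg search`):

```shell
# Install the ASR and LM apps together with the CUDA backend.
# Quotes keep the shell from glob-expanding the brackets.
./vcpkg install "flashlight-cuda[asr,lm]"
```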
Flashlight app binaries are also built for the selected features and are installed into the `vcpkg` install tree's `tools` directory.
Integrating Flashlight into your own project is simple using `vcpkg`'s CMake toolchain integration.
First, install the dependencies for your backend of choice using `vcpkg`:
Installing CUDA Backend Dependencies with vcpkg
To build the Flashlight CUDA backend from source using dependencies installed with `vcpkg`, install CUDA >= 9.2, cuDNN, NCCL, and Intel MKL, then build the rest of the dependencies for the CUDA backend based on which Flashlight features you'd like to build:
./vcpkg install \
cuda intel-mkl fftw3 cub kenlm \ # if building flashlight libraries
arrayfire[cuda] cudnn nccl openmpi cereal stb \ # if building the flashlight neural net library
gflags glog \ # if building any flashlight apps
libsndfile \ # if building the flashlight asr app
gtest # optional, if building tests
Installing CPU Backend Dependencies with vcpkg
To build the Flashlight CPU backend from source using dependencies installed with `vcpkg`, install Intel MKL, then build the rest of the dependencies for the CPU backend based on which Flashlight features you'd like to build:
./vcpkg install \
intel-mkl fftw3 kenlm \ # for flashlight libraries
arrayfire[cpu] gloo[mpi] openmpi onednn cereal stb \ # for the flashlight neural net library
gflags glog \ # for any flashlight apps
libsndfile \ # for the flashlight asr app
gtest # optional, for tests
To build Flashlight from source with these dependencies, clone the repository:
git clone https://github.com/facebookresearch/flashlight.git && cd flashlight
mkdir -p build && cd build
Then, build from source using `vcpkg`'s CMake toolchain:
cmake .. \
    -DCMAKE_BUILD_TYPE=Release \
    -DFL_BACKEND=CUDA \
    -DCMAKE_TOOLCHAIN_FILE=[path to your vcpkg clone]/scripts/buildsystems/vcpkg.cmake
make -j$(nproc)
make install -j$(nproc) # only if you want to install Flashlight for external use
To build a subset of Flashlight's features, see the build options below.
To build from source, first install the below dependencies. Most are available with your system's local package manager.
Some dependencies marked below are downloaded and installed automatically if not found on the local system. `FL_BUILD_STANDALONE` determines this behavior; if disabled, dependencies won't be downloaded and built when building Flashlight.
Once all dependencies are installed, clone the repository:
git clone https://github.com/facebookresearch/flashlight.git && cd flashlight
mkdir -p build && cd build
Then build all Flashlight components with:
cmake .. -DCMAKE_BUILD_TYPE=Release -DFL_BACKEND=[backend] [...build options]
make -j$(nproc)
make install
Setting the `MKLROOT` environment variable (`export MKLROOT=/opt/intel/mkl` on most Linux-based systems) can help CMake find Intel MKL if it isn't found initially.
To build a smaller subset of Flashlight features/apps, see the build options below for a complete list of options.
To install Flashlight in a custom directory, use CMake's `CMAKE_INSTALL_PREFIX` argument. Flashlight libraries can be built as shared libraries using CMake's `BUILD_SHARED_LIBS` argument.
Flashlight uses modern CMake and `IMPORTED` targets for most dependencies. If a dependency isn't found, passing `-D<package>_DIR` to your `cmake` command, or exporting `<package>_DIR` as an environment variable set to the path to `<package>Config.cmake`, can help locate dependencies on your system. See the documentation for more details. If CMake is failing to locate a package, check to see if a corresponding issue has already been created before creating your own.
Dependencies marked with * are automatically downloaded and built from source if not found on the system. Setting `FL_BUILD_STANDALONE` to `OFF` disables this behavior.
Dependencies marked with ^ are required if building with distributed training enabled (`FL_BUILD_DISTRIBUTED` — see the build options below). Distributed training is required for all apps.
Dependencies marked with † are installable via `vcpkg`. See the instructions above for installing those dependencies when doing a from-source build of Flashlight.
| Component | Backend | Dependencies |
|---|---|---|
| libraries | CUDA | CUDA >= 9.2, CUB*† (if CUDA < 11) |
| libraries | CPU | a BLAS library (Intel MKL >= 2018, OpenBLAS†, etc.) |
| core | Any | ArrayFire >= 3.7.3†, an MPI library^ (OpenMPI†, etc.), cereal*† >= 1.3.0, stb*† |
| core | CUDA | CUDA >= 9.2, NCCL^, cuDNN |
| core | CPU | oneDNN† >= 2.0, gloo (with MPI)*^† |
| app: all | Any | Google Glog†, Gflags† |
| app: asr | Any | libsndfile*† >= 10.0.28, a BLAS library (Intel MKL >= 2018, OpenBLAS†, etc.) |
| app: imgclass | Any | - |
| app: lm | Any | - |
| tests | Any | Google Test (gtest, with gmock)*† >= 1.10.0 |
The Flashlight CMake build accepts the following build options (prefixed with `-D` when running CMake from the command line):
Name | Options | Default Value | Description |
---|---|---|---|
FL_BACKEND | CUDA, CPU, OPENCL | CUDA | Backend with which to build all components. |
FL_BUILD_STANDALONE | ON, OFF | ON | Downloads/builds some dependencies if not found. |
FL_BUILD_LIBRARIES | ON, OFF | ON | Build the Flashlight libraries. |
FL_BUILD_CORE | ON, OFF | ON | Build the Flashlight neural net library. |
FL_BUILD_DISTRIBUTED | ON, OFF | ON | Build with distributed training; required for apps. |
FL_BUILD_CONTRIB | ON, OFF | ON | Build contrib APIs subject to breaking changes. |
FL_BUILD_APPS | ON, OFF | ON | Build apps (see below). |
FL_BUILD_APP_ASR | ON, OFF | ON | Build the automatic speech recognition app. |
FL_BUILD_APP_IMGCLASS | ON, OFF | ON | Build the image classification app. |
FL_BUILD_APP_LM | ON, OFF | ON | Build the language modeling app. |
FL_BUILD_APP_ASR_TOOLS | ON, OFF | ON | Build automatic speech recognition app tools. |
FL_BUILD_TESTS | ON, OFF | ON | Build tests. |
FL_BUILD_EXAMPLES | ON, OFF | ON | Build examples. |
FL_BUILD_EXPERIMENTAL | ON, OFF | OFF | Build experimental components. |
CMAKE_BUILD_TYPE | See docs. | Debug | See the CMake documentation. |
CMAKE_INSTALL_PREFIX | [Directory] | See docs. | See the CMake documentation. |
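For example, combining these options, a from-source build of only the Flashlight libraries and core neural net library (skipping the apps, tests, and examples) might look like the following sketch; the flag names come from the table above:

```shell
cmake .. \
    -DCMAKE_BUILD_TYPE=Release \
    -DFL_BACKEND=CUDA \
    -DFL_BUILD_APPS=OFF \
    -DFL_BUILD_TESTS=OFF \
    -DFL_BUILD_EXAMPLES=OFF
make -j$(nproc)
```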
Flashlight is most easily linked to using CMake. Flashlight exports the following CMake targets when installed:
- `flashlight::fl-libraries` — contains flashlight libraries headers and symbols.
- `flashlight::flashlight` — contains flashlight libraries as well as the flashlight core autograd and neural network library.
- `flashlight::flashlight-app-asr` — contains the automatic speech recognition app along with the flashlight core and flashlight libraries.
- `flashlight::flashlight-app-imgclass` — contains the image classification app along with the flashlight core and flashlight libraries.
- `flashlight::flashlight-app-lm` — contains the language modeling app along with the flashlight core and flashlight libraries.
Given a simple `project.cpp` file that includes and links to Flashlight:
#include <iostream>
#include <arrayfire.h>
#include <flashlight/fl/flashlight.h>
int main() {
  fl::Variable v(af::constant(1, 1), true);
  auto result = v + 10;
  std::cout << "Hello World!" << std::endl;
  af::print("Array value is ", result.array()); // 11.000
  return 0;
}
The following CMake configuration links Flashlight and sets include directories:
cmake_minimum_required(VERSION 3.10)
project(myProject)
set(CMAKE_CXX_STANDARD 14)
set(CMAKE_CXX_STANDARD_REQUIRED ON)
add_executable(myProject project.cpp)
find_package(flashlight CONFIG REQUIRED)
target_link_libraries(myProject PRIVATE flashlight::flashlight)
If you installed Flashlight with `vcpkg`, the above CMake configuration for `myProject` can be built by running:
cd project && mkdir build && cd build
cmake .. \
-DCMAKE_TOOLCHAIN_FILE=[path to vcpkg clone]/scripts/buildsystems/vcpkg.cmake \
-DCMAKE_BUILD_TYPE=Release
make -j$(nproc)
If using a from-source installation of Flashlight, Flashlight will be found automatically by CMake:
cd project && mkdir build && cd build
cmake .. -DCMAKE_BUILD_TYPE=Release
make -j$(nproc)
If Flashlight is installed in a custom location using a `CMAKE_INSTALL_PREFIX`, passing `-Dflashlight_DIR=[install prefix]/share/flashlight/cmake` as an argument to your `cmake` command can help CMake find Flashlight.
Flashlight and its dependencies can also be built with the provided Dockerfiles — see the accompanying Docker documentation for more information.
Contact: vineelkpratap@fb.com, awni@fb.com, jacobkahn@fb.com, qiantong@fb.com, antares@fb.com, padentomasello@fb.com, jcai@fb.com, gab@fb.com, vitaliy888@fb.com, locronan@fb.com
Flashlight is being very actively developed. See CONTRIBUTING for more on how to help out.
Some of Flashlight's code is derived from arrayfire-ml.
Flashlight is under a BSD license. See LICENSE for more information.