...
Iteration: 1400 | Loss: Tensor Float [] 6.5355e-2
Iteration: 1450 | Loss: Tensor Float [] 0.1497
Iteration: 1500 | Loss: Tensor Float [] 0.1755
Iteration: 1550 | Loss: Tensor Float [] 0.1105
Iteration: 1600 | Loss: Tensor Float [] 7.6827e-2
Iteration: 1650 | Loss: Tensor Float [] 0.3088
Iteration: 1700 | Loss: Tensor Float [] 2.0256e-2
Iteration: 1750 | Loss: Tensor Float [] 0.1529
Iteration: 1800 | Loss: Tensor Float [] 9.8801e-2
Iteration: 1850 | Loss: Tensor Float [] 0.1003
[ASCII renderings of eleven sample test digits omitted; in every case the model's prediction matched the ground-truth label: 7, 2, 1, 0, 4, 1, 4, 9, 5, 9, and 0.]
Done
Hasktorch is a library for tensors and neural networks in Haskell. It is an independent open source community project which leverages the core C++ libraries shared by PyTorch.
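For a first taste of the API, here is a minimal sketch of tensor usage (an illustration rather than code from this repository; it assumes the untyped API re-exported by the top-level Torch module in Hasktorch 0.2, including asTensor, ones', matmul, and sumAll):

import Torch

main :: IO ()
main = do
  let x = asTensor ([[1, 2], [3, 4]] :: [[Float]])  -- build a 2x2 tensor from a nested Haskell list
      y = ones' [2, 2]                              -- a 2x2 tensor of ones with default options
  print (x + y)       -- elementwise addition via the Num instance for Tensor
  print (matmul x y)  -- matrix multiplication
  print (sumAll x)    -- reduce all elements to a scalar tensor

Printed tensors use the Tensor <dtype> <shape> <values> form seen in the example output above.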
This project is in active development, so expect changes to the library API as it evolves. We invite new users to join our Hasktorch Slack space for questions and discussions. Contributions/PRs are encouraged.
We are currently developing the second major release of Hasktorch (0.2). Note that the first release, Hasktorch 0.1, on Hackage is outdated and should not be used.
The documentation is divided into several sections. For introductory talks about the project, see:
- High-level MuniHac talk by @austinvhuang
- Hands-on live-coding demo by @tscholak
- Low-level FFI talk by @junjihashimoto
The following steps will get you started. They assume the hasktorch repository has just been cloned. Once setup is done, read the online tutorials and API documentation.
- linux+cabal+cpu
- linux+cabal+cuda11
- macos+cabal+cpu
- linux+stack+cpu
- macos+stack+cpu
- nixos+cabal+cpu
- nixos+cabal+cuda11
- docker+jupyterlab+cuda11
linux+cabal+cpu
Starting from the top-level directory of the project, run:
$ pushd deps # Change to the deps directory and save the current directory.
$ ./get-deps.sh # Run the shell script to retrieve the libtorch dependencies.
$ popd # Go back to the root directory of the project.
$ source setenv # Set the shell environment to reference the shared library locations.
$ ./setup-cabal.sh # Create a cabal project file.
To build and test the Hasktorch library, run:
$ cabal build hasktorch # Build the Hasktorch library.
$ cabal test hasktorch # Build and run the Hasktorch library test suite.
To build and test the example executables shipped with hasktorch, run:
$ cabal build examples # Build the Hasktorch examples.
$ cabal test examples # Build and run the Hasktorch example test suites.
To run the MNIST CNN example, run:
$ cd examples # Change to the examples directory.
$ ./datasets/download-mnist.sh # Download the MNIST dataset.
$ mv mnist data # Move the MNIST dataset to the data directory.
$ export DEVICE=cpu # Set device to CPU for the MNIST CNN example.
$ cabal run static-mnist-cnn # Run the MNIST CNN example.
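The DEVICE variable above selects the device for the MNIST example. Inside Hasktorch code, device placement is expressed with the Device type and toDevice; the sketch below is a hypothetical illustration of that API, not the example's actual device handling:

import Torch

main :: IO ()
main = do
  let cpu  = Device { deviceType = CPU,  deviceIndex = 0 }
      gpu0 = Device { deviceType = CUDA, deviceIndex = 0 }  -- requires a CUDA-enabled libtorch
      x    = ones' [2, 2]          -- tensors are created on the CPU by default
  print (device x)                 -- inspect which device a tensor lives on
  print (toDevice cpu x)           -- stays on the CPU
  -- print (toDevice gpu0 x)       -- uncomment on a machine with a working CUDA setup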
linux+cabal+cuda11
Starting from the top-level directory of the project, run:
$ pushd deps # Change to the deps directory and save the current directory.
$ ./get-deps.sh -a cu118 # Run the shell script to retrieve the libtorch dependencies.
$ popd # Go back to the root directory of the project.
$ source setenv # Set the shell environment to reference the shared library locations.
$ ./setup-cabal.sh # Create a cabal project file.
To build and test the Hasktorch library, run:
$ cabal build hasktorch # Build the Hasktorch library.
$ cabal test hasktorch # Build and run the Hasktorch library test suite.
To build and test the example executables shipped with hasktorch, run:
$ cabal build examples # Build the Hasktorch examples.
$ cabal test examples # Build and run the Hasktorch example test suites.
To run the MNIST CNN example, run:
$ cd examples # Change to the examples directory.
$ ./datasets/download-mnist.sh # Download the MNIST dataset.
$ mv mnist data # Move the MNIST dataset to the data directory.
$ export DEVICE="cuda:0" # Set device to CUDA for the MNIST CNN example.
$ cabal run static-mnist-cnn # Run the MNIST CNN example.
macos+cabal+cpu
Starting from the top-level directory of the project, run:
$ pushd deps # Change to the deps directory and save the current directory.
$ ./get-deps.sh # Run the shell script to retrieve the libtorch dependencies.
$ popd # Go back to the root directory of the project.
$ source setenv # Set the shell environment to reference the shared library locations.
$ ./setup-cabal.sh # Create a cabal project file.
To build and test the Hasktorch library, run:
$ cabal build hasktorch # Build the Hasktorch library.
$ cabal test hasktorch # Build and run the Hasktorch library test suite.
To build and test the example executables shipped with hasktorch, run:
$ cabal build examples # Build the Hasktorch examples.
$ cabal test examples # Build and run the Hasktorch example test suites.
To run the MNIST CNN example, run:
$ cd examples # Change to the examples directory.
$ ./datasets/download-mnist.sh # Download the MNIST dataset.
$ mv mnist data # Move the MNIST dataset to the data directory.
$ export DEVICE=cpu # Set device to CPU for the MNIST CNN example.
$ cabal run static-mnist-cnn # Run the MNIST CNN example.
linux+stack+cpu
Install the Haskell Tool Stack if you haven't already, following the instructions here.
Starting from the top-level directory of the project, run:
$ pushd deps # Change to the deps directory and save the current directory.
$ ./get-deps.sh # Run the shell script to retrieve the libtorch dependencies.
$ popd # Go back to the root directory of the project.
$ source setenv # Set the shell environment to reference the shared library locations.
To build and test the Hasktorch library, run:
$ stack build hasktorch # Build the Hasktorch library.
$ stack test hasktorch # Build and run the Hasktorch library test suite.
To build and test the example executables shipped with hasktorch, run:
$ stack build examples # Build the Hasktorch examples.
$ stack test examples # Build and run the Hasktorch example test suites.
To run the MNIST CNN example, run:
$ cd examples # Change to the examples directory.
$ ./datasets/download-mnist.sh # Download the MNIST dataset.
$ mv mnist data # Move the MNIST dataset to the data directory.
$ export DEVICE=cpu # Set device to CPU for the MNIST CNN example.
$ stack run static-mnist-cnn # Run the MNIST CNN example.
macos+stack+cpu
Install the Haskell Tool Stack if you haven't already, following the instructions here.
Starting from the top-level directory of the project, run:
$ pushd deps # Change to the deps directory and save the current directory.
$ ./get-deps.sh # Run the shell script to retrieve the libtorch dependencies.
$ popd # Go back to the root directory of the project.
$ source setenv # Set the shell environment to reference the shared library locations.
To build and test the Hasktorch library, run:
$ stack build hasktorch # Build the Hasktorch library.
$ stack test hasktorch # Build and run the Hasktorch library test suite.
To build and test the example executables shipped with hasktorch, run:
$ stack build examples # Build the Hasktorch examples.
$ stack test examples # Build and run the Hasktorch example test suites.
To run the MNIST CNN example, run:
$ cd examples # Change to the examples directory.
$ ./datasets/download-mnist.sh # Download the MNIST dataset.
$ mv mnist data # Move the MNIST dataset to the data directory.
$ export DEVICE=cpu # Set device to CPU for the MNIST CNN example.
$ stack run static-mnist-cnn # Run the MNIST CNN example.
nixos+cabal+cpu
(Optional) Install and set up Cachix:
$ nix-env -iA cachix -f https://cachix.org/api/v1/install # (Optional) Install Cachix.
# (Optional) Use IOHK's cache. See https://input-output-hk.github.io/haskell.nix/tutorials/getting-started/#setting-up-the-binary-cache
$ cachix use hasktorch # (Optional) Use hasktorch's cache.
Starting from the top-level directory of the project, run:
$ nix develop # Enter the nix shell environment for Hasktorch.
To build and test the Hasktorch library, run:
$ cabal build hasktorch # Build the Hasktorch library.
$ cabal test hasktorch # Build and run the Hasktorch library test suite.
To build and test the example executables shipped with hasktorch, run:
$ cabal build examples # Build the Hasktorch examples.
$ cabal test examples # Build and run the Hasktorch example test suites.
To run the MNIST CNN example, run:
$ cd examples # Change to the examples directory.
$ ./datasets/download-mnist.sh # Download the MNIST dataset.
$ mv mnist data # Move the MNIST dataset to the data directory.
$ export DEVICE=cpu # Set device to CPU for the MNIST CNN example.
$ cabal run static-mnist-cnn # Run the MNIST CNN example.
nixos+cabal+cuda11
(Optional) Install and set up Cachix:
$ nix-env -iA cachix -f https://cachix.org/api/v1/install # (Optional) Install Cachix.
# (Optional) Use IOHK's cache. See https://input-output-hk.github.io/haskell.nix/tutorials/getting-started/#setting-up-the-binary-cache
$ cachix use hasktorch # (Optional) Use hasktorch's cache.
Starting from the top-level directory of the project, run:
$ cat > nix/dev-config.nix <<EOF # Create nix/dev-config.nix with profiling and CUDA 11 support enabled.
{
  profiling = true;
  cudaSupport = true;
  cudaMajorVersion = "11";
}
EOF
$ nix develop # Enter the nix shell environment for Hasktorch.
To build and test the Hasktorch library, run:
$ cabal build hasktorch # Build the Hasktorch library.
$ cabal test hasktorch # Build and run the Hasktorch library test suite.
To build and test the example executables shipped with hasktorch, run:
$ cabal build examples # Build the Hasktorch examples.
$ cabal test examples # Build and run the Hasktorch example test suites.
To run the MNIST CNN example, run:
$ cd examples # Change to the examples directory.
$ ./datasets/download-mnist.sh # Download the MNIST dataset.
$ mv mnist data # Move the MNIST dataset to the data directory.
$ export DEVICE="cuda:0" # Set device to CUDA for the MNIST CNN example.
$ cabal run static-mnist-cnn # Run the MNIST CNN example.
docker+jupyterlab+cuda11
This Docker Hub repository provides a Docker image with JupyterLab. It supports CUDA 11, CUDA 10, and CPU-only setups. To use JupyterLab with Hasktorch, run one of the following commands, then click the URL printed in the console.
$ docker run --gpus all -it --rm -p 8888:8888 htorch/hasktorch-jupyter
or
$ docker run --gpus all -it --rm -p 8888:8888 htorch/hasktorch-jupyter:latest-cu11
In rare cases, you may see errors like
cannot move tensor to "CUDA:0"
although you have CUDA-capable hardware in your machine and have followed the getting-started instructions for CUDA support.
If that happens, check whether /run/opengl-driver/lib exists. If it does not, make sure your CUDA drivers are installed correctly.
If you have run cabal in a CPU-only Hasktorch Nix shell before, you may need to:
- Clean the dist-newstyle folder using cabal clean.
- Delete the .ghc.environment* file in the Hasktorch root folder.
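Concretely, assuming you are in the Hasktorch root folder, the check and cleanup could look like this:
$ ls /run/opengl-driver/lib # Verify that the NixOS driver library path exists.
$ cabal clean # Remove the dist-newstyle folder.
$ rm .ghc.environment* # Delete the stale GHC environment file(s) in the repo root.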
Otherwise, at best, you will not be able to move tensors to CUDA, and, at worst, you will see weird linker errors like
gcc: error: hasktorch/dist-newstyle/build/x86_64-linux/ghc-8.8.3/libtorch-ffi-1.5.0.0/build/Torch/Internal/Unmanaged/Autograd.dyn_o: No such file or directory
`cc' failed in phase `Linker'. (Exit code: 1)
We welcome new contributors.
Contact us for access to the Hasktorch Slack channel. You can send an email to hasktorch@gmail.com or reach us on Twitter at @austinvhuang, @SamStites, @tscholak, or @junjihashimoto3.
See the wiki for developer notes.
Basic functionality:
- deps/ - submodules and downloads for build dependencies (libtorch, mklml, pytorch) -- you can ignore this if you are on Nix
- examples/ - high level example models (xor mlp, typed cnn, etc.)
- experimental/ - experimental projects or tips
- hasktorch/ - higher level user-facing library, calls into ffi/, used by examples/
Internals (for contributing developers):
- codegen/ - code generation, parses the Declarations.yaml spec from pytorch and produces ffi/ contents
- inline-c/ - submodule to inline-cpp fork used for C++ FFI
- libtorch-ffi/ - low level FFI bindings to libtorch
- spec/ - specification files used for codegen/