
Portable Deep Learning Library for .NET


Merkurius

This repository contains a portable deep learning (deep neural network) library for the .NET platform. The library supports both inference and training, and all code is written in C#.

Installation

You can install the Merkurius NuGet package with the .NET Core CLI:

> dotnet add package Merkurius

or from the NuGet Package Manager console:

PM> Install-Package Merkurius

Build

To build Merkurius, run the .NET Core CLI command:

> dotnet build Merkurius.csproj

Examples

A convolutional neural network (CNN):

// ch: input channels; iw/ih: input width/height; f: filter count; fw/fh: filter size;
// mw/mh and pw/ph: pooling window parameters; ow/oh: pooled output size (names inferred from the example).
var model = new Model(
  new Convolution(ch, iw, ih, f, fw, fh, (fanIn, fanOut) => Initializers.HeNormal(fanIn),
  new Activation(new ReLU(),
  new MaxPooling(f, mw, mh, pw, ph,
  new FullyConnected(f * ow * oh, (fanIn, fanOut) => Initializers.HeNormal(fanIn),
  new Activation(new ReLU(),
  new FullyConnected(100, 10, (fanIn, fanOut) => Initializers.GlorotNormal(fanIn, fanOut))))))));

model.Fit(trainingList, 50, 100, new Adam(), new SoftmaxCrossEntropy());
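The Convolution and MaxPooling layers above perform the standard operations of a CNN. As a language-agnostic illustration (sketched here in Python; this is not Merkurius code, and the helper names are invented for the example), a valid 2-D convolution followed by non-overlapping 2×2 max pooling looks like this:

```python
# Illustrative sketch of the operations a Convolution and MaxPooling
# layer perform; pure Python, independent of the Merkurius API.
def conv2d_valid(image, kernel):
    """Valid (no padding) 2-D cross-correlation of image with kernel."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for y in range(ih - kh + 1):
        row = []
        for x in range(iw - kw + 1):
            row.append(sum(image[y + i][x + j] * kernel[i][j]
                           for i in range(kh) for j in range(kw)))
        out.append(row)
    return out

def max_pool(feature_map, size):
    """Non-overlapping size x size max pooling over a 2-D feature map."""
    fh, fw = len(feature_map), len(feature_map[0])
    return [[max(feature_map[y + i][x + j]
                 for i in range(size) for j in range(size))
             for x in range(0, fw - size + 1, size)]
            for y in range(0, fh - size + 1, size)]

image = [[1, 2, 3, 0],
         [4, 5, 6, 0],
         [7, 8, 9, 0],
         [0, 0, 0, 0]]
kernel = [[1, 0],
          [0, 1]]                    # diagonal 2x2 filter
fmap = conv2d_valid(image, kernel)   # 3x3 feature map
pooled = max_pool(fmap, 2)           # 1x1 after 2x2 pooling
```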

A recurrent neural network (RNN):

var model = new Model(
  new Recurrent(1, 128, 10, true, false, (fanIn, fanOut) => Initializers.LeCunNormal(fanIn),
  new FullyConnected(128, 10, (fanIn, fanOut) => Initializers.LeCunNormal(fanIn),
  new Activation(10, new Identity()))));

model.Fit(trainingList, 50, 10, new SGD(), new MeanSquaredError());
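A Recurrent layer computes the classic hidden-state recurrence h_t = tanh(W·x_t + U·h_{t-1} + b). A minimal scalar sketch of that recurrence (illustrative Python, not the library's implementation):

```python
import math

def rnn_forward(xs, w, u, b, h0=0.0):
    """Scalar simple-RNN: h_t = tanh(w*x_t + u*h_{t-1} + b).
    Returns the list of hidden states for the input sequence xs."""
    h, states = h0, []
    for x in xs:
        h = math.tanh(w * x + u * h + b)
        states.append(h)
    return states

states = rnn_forward([1.0, 0.0, -1.0], w=0.5, u=0.9, b=0.0)
```

Because tanh squashes its input, every hidden state stays in (-1, 1), which is what keeps the recurrence from blowing up over long sequences.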

Features

  • Inference
  • Training
  • Code first modeling
  • .NET Standard 2.1 library
  • Dependency-free

Activation Functions

  • ELU (Exponential linear unit)
  • Hyperbolic tangent
  • Identity
  • ReLU (Rectified linear unit)
  • SELU (Scaled exponential linear unit)
  • Sigmoid
  • Softmax
  • SoftPlus
  • Softsign
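For reference, the less common functions in this list have these standard definitions (illustrative Python, not the library's implementation; the SELU constants are the standard published values):

```python
import math

# Standard definitions of a few of the listed activation functions.
def elu(x, alpha=1.0):
    """ELU: identity for x > 0, alpha*(e^x - 1) otherwise."""
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
    """SELU: scaled ELU with fixed self-normalizing constants."""
    return scale * (x if x > 0 else alpha * (math.exp(x) - 1.0))

def softplus(x):
    """SoftPlus: smooth approximation of ReLU, ln(1 + e^x)."""
    return math.log(1.0 + math.exp(x))

def softsign(x):
    """Softsign: x / (1 + |x|), a cheaper tanh-like squashing."""
    return x / (1.0 + abs(x))
```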

Layers

  • Batch normalization
  • Convolution
  • Dropout
  • Embedding
  • GRU (Gated recurrent unit)
  • Fully connected
  • LSTM (Long short-term memory)
  • Max pooling
  • Recurrent

Loss Functions

  • Cross-entropy
  • Mean squared error (MSE)
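The two losses have the standard textbook definitions, sketched here in Python for illustration (not library code):

```python
import math

def mean_squared_error(predictions, targets):
    """MSE: mean of squared differences, used for regression."""
    n = len(predictions)
    return sum((p - t) ** 2 for p, t in zip(predictions, targets)) / n

def cross_entropy(probabilities, one_hot_targets, eps=1e-12):
    """Cross-entropy: -sum t*log(p), used with softmax outputs.
    eps guards against log(0)."""
    return -sum(t * math.log(p + eps)
                for p, t in zip(probabilities, one_hot_targets))
```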

Optimizers

  • AdaDelta
  • AdaGrad
  • Adam
  • Momentum
  • Nesterov
  • RMSprop
  • SGD
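To show how two of these differ, here are the textbook single-parameter update rules for Momentum and Adam (illustrative Python; parameter names are not the Merkurius API):

```python
import math

def momentum_step(param, grad, velocity, lr=0.01, mu=0.9):
    """Momentum SGD: accumulate a decaying velocity, then move along it."""
    velocity = mu * velocity - lr * grad
    return param + velocity, velocity

def adam_step(param, grad, m, v, t, lr=0.001,
              beta1=0.9, beta2=0.999, eps=1e-8):
    """Adam: per-parameter step scaled by bias-corrected first and
    second moment estimates of the gradient (t is the step count, from 1)."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad * grad
    m_hat = m / (1 - beta1 ** t)           # bias-corrected mean
    v_hat = v / (1 - beta2 ** t)           # bias-corrected variance
    return param - lr * m_hat / (math.sqrt(v_hat) + eps), m, v

p, m, v = 1.0, 0.0, 0.0
p, m, v = adam_step(p, grad=2.0, m=m, v=v, t=1)
```

On the first step Adam's bias correction makes the update size approximately the learning rate regardless of the gradient's magnitude, which is why it needs little tuning compared with plain SGD.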