We're on a mission to unify all ML frameworks + automate code conversions. pip install ivy-core, join our growing community, and lets-unify.ai! 🦾
Ivy is an ML framework that currently supports JAX, TensorFlow, PyTorch, MXNet and NumPy. We're very excited for you to try it out!
Next on our road-map is to support automatic code conversions between any pair of frameworks, and to add instant multi-framework support for all open-source libraries with only a few lines of code changed! Read on to learn more!
The docs are split into a number of sub-pages explaining different aspects of why we created Ivy, how to use it, what we've got planned on our road-map, and how to contribute! Click on the sub-headings to check out these pages!
We use 🚧 to indicate that the feature being discussed is in development. We use ✅ to indicate that it is already implemented!
Check out the docs for more info, and our Google Colabs for some interactive demos!
🚨 Ivy is still at a relatively early stage of development. Expect breaking changes and sharp edges until we release version 2.0 in the next few weeks!
Ivy can be installed like so: pip install ivy-core
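As a quick sanity check after installing (a minimal sketch, not part of the install instructions), you can set a backend and create an array:

import ivy

ivy.set_framework('numpy')  # any installed backend works here
x = ivy.array([1., 2., 3.])
print(ivy.to_numpy(x))  # -> [1. 2. 3.]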
You can immediately use Ivy to train a neural network, using your favourite framework in the background, like so:
import ivy

class MyModel(ivy.Module):
    def __init__(self):
        self.linear0 = ivy.Linear(3, 64)
        self.linear1 = ivy.Linear(64, 1)
        ivy.Module.__init__(self)

    def _forward(self, x):
        x = ivy.relu(self.linear0(x))
        return ivy.sigmoid(self.linear1(x))

ivy.set_framework('torch')  # change to any framework!
model = MyModel()
optimizer = ivy.Adam(1e-4)
x_in = ivy.array([1., 2., 3.])
target = ivy.array([0.])

def loss_fn(v):
    out = model(x_in, v=v)
    return ivy.reduce_mean((out - target)**2)[0]

for step in range(100):
    loss, grads = ivy.execute_with_gradients(loss_fn, model.v)
    model.v = optimizer.step(model.v, grads)
    print('step {} loss {}'.format(step, ivy.to_numpy(loss).item()))

print('Finished training!')
This example uses PyTorch as a backend framework, but the backend can easily be changed to your favourite framework, such as TensorFlow, JAX or MXNet.
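For example, here is a minimal sketch of running the same model under several backends, reusing the MyModel class defined above. It assumes each listed framework is installed, and that ivy.unset_framework is available to clear the backend between runs:

import ivy

for fw in ['torch', 'tensorflow', 'jax']:
    ivy.set_framework(fw)  # select the backend for subsequent ivy calls
    model = MyModel()  # identical Ivy code, different framework underneath
    out = model(ivy.array([1., 2., 3.]))
    print(fw, ivy.to_numpy(out))
    ivy.unset_framework()  # clear the backend before the next iteration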
Framework-Agnostic Functions
In the example below we show how Ivy's concatenation function is compatible with tensors from different frameworks. This is the same for ALL Ivy functions. They can accept tensors from any framework and return the correct result.
import jax.numpy as jnp
import tensorflow as tf
import numpy as np
import mxnet as mx
import torch
import ivy
jax_concatted = ivy.concat((jnp.ones((1,)), jnp.ones((1,))), -1)
tf_concatted = ivy.concat((tf.ones((1,)), tf.ones((1,))), -1)
np_concatted = ivy.concat((np.ones((1,)), np.ones((1,))), -1)
mx_concatted = ivy.concat((mx.nd.ones((1,)), mx.nd.ones((1,))), -1)
torch_concatted = ivy.concat((torch.ones((1,)), torch.ones((1,))), -1)
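Each result remains a native tensor of whichever framework produced the inputs. As a small illustrative addition (not from the original example), you can confirm this by inspecting the returned types:

print(type(jax_concatted))    # a JAX array type (e.g. DeviceArray)
print(type(tf_concatted))     # tf.Tensor
print(type(np_concatted))     # numpy.ndarray
print(type(mx_concatted))     # mxnet.nd.NDArray
print(type(torch_concatted))  # torch.Tensor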
To see a list of all Ivy methods, type ivy. into a Python command prompt and press tab. You should then see output like the following:
ivy.Container(                ivy.general                   ivy.reduce_min(
ivy.abs(                      ivy.get_device(               ivy.reduce_prod(
ivy.acos(                     ivy.get_num_dims(             ivy.reduce_sum(
ivy.acosh(                    ivy.gradient_descent_update(  ivy.reductions
ivy.activations               ivy.gradient_image(           ivy.relu(
ivy.arange(                   ivy.gradients                 ivy.reshape(
ivy.argmax(                   ivy.identity(                 ivy.round(
ivy.argmin(                   ivy.image                     ivy.scatter_nd(
ivy.array(                    ivy.indices_where(            ivy.seed(
ivy.asin(                     ivy.inv(                      ivy.shape(
ivy.asinh(                    ivy.layers                    ivy.shuffle(
ivy.atan(                     ivy.leaky_relu(               ivy.sigmoid(
ivy.atan2(                    ivy.linalg                    ivy.sin(
ivy.atanh(                    ivy.linear(                   ivy.sinh(
ivy.bilinear_resample(        ivy.linspace(                 ivy.softmax(
ivy.cast(                     ivy.log(                      ivy.softplus(
ivy.ceil(                     ivy.logic                     ivy.split(
ivy.clip(                     ivy.logical_and(              ivy.squeeze(
ivy.concatenate(              ivy.logical_not(              ivy.stack(
ivy.container                 ivy.logical_or(               ivy.stack_images(
ivy.conv2d(                   ivy.math                      ivy.stop_gradient(
ivy.core                      ivy.matmul(                   ivy.svd(
ivy.cos(                      ivy.maximum(                  ivy.tan(
ivy.cosh(                     ivy.minimum(                  ivy.tanh(
ivy.cross(                    ivy.neural_net                ivy.tile(
ivy.cumsum(                   ivy.nn                        ivy.to_list(
ivy.depthwise_conv2d(         ivy.norm(                     ivy.to_numpy(
ivy.dtype(                    ivy.one_hot(                  ivy.transpose(
ivy.execute_with_gradients(   ivy.ones(                     ivy.unstack(
ivy.exp(                      ivy.ones_like(                ivy.variable(
ivy.expand_dims(              ivy.pinv(                     ivy.vector_to_skew_symmetric_matrix(
ivy.flip(                     ivy.randint(                  ivy.verbosity
ivy.floor(                    ivy.random                    ivy.where(
ivy.floormod(                 ivy.random_uniform(           ivy.zero_pad(
ivy.backend_handler           ivy.reduce_max(               ivy.zeros(
ivy.gather_nd(                ivy.reduce_mean(              ivy.zeros_like(
Ivy also provides ivy.Trainer, ivy.Dataset, ivy.Dataloader and other helpful classes and functions for creating training workflows in only a few lines of code.

Join our community as a code contributor, and help accelerate our journey to unify all ML frameworks! Find out more in our Contributing guide!
@article{lenton2021ivy,
  title={Ivy: Templated deep learning for inter-framework portability},
  author={Lenton, Daniel and Pardo, Fabio and Falck, Fabian and James, Stephen and Clark, Ronald},
  journal={arXiv preprint arXiv:2102.02886},
  year={2021}
}