How to use decorators#
Learn about the different ways to use compilation and transpilation functions.
⚠️ If you are running this notebook in Colab, you will have to install Ivy and some dependencies manually. You can do so by running the cell below ⬇️
If you want to run the notebook locally but don’t have Ivy installed just yet, you can check out the Get Started section of the docs.
[ ]:
!pip install ivy
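To check that the installation worked, you can print the package version (assuming, as is conventional, that the package exposes a __version__ attribute):
[ ]:
import ivy
print(ivy.__version__)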
Unify#
Firstly, let’s create the dummy numpy arrays as before:
[1]:
# import numpy
import numpy as np
# create random numpy arrays for testing
x = np.random.uniform(size=10)
mean = np.mean(x)
std = np.std(x)
Let’s assume that our target framework is tensorflow:
[3]:
import ivy
import tensorflow as tf

# set tensorflow as ivy's backend and convert the array to a tf tensor
ivy.set_backend("tensorflow")
x = tf.constant(x)
In the example below, the ivy.unify function is called as a decorator.
[4]:
import torch
@ivy.unify(source="torch")
def normalize(x):
    mean = torch.mean(x)
    std = torch.std(x)
    return torch.div(torch.sub(x, mean), std)
[5]:
normalize(x) # unification happens here
[5]:
ivy.array([-1.09422972, -0.46009917, 1.0881108 , 1.86487021, 0.83629996,
-1.10654466, -0.89883457, 0.02893805, 0.15644584, -0.41495672])
The function can still be called either eagerly or lazily when used as a decorator. The example above is lazy: unification is deferred until the first call. The example below is eager: because the arguments are provided up front via args, unification happens immediately at decoration time:
[6]:
@ivy.unify(source="torch", args=(x,))
def normalize(x):
    mean = torch.mean(x)
    std = torch.std(x)
    return torch.div(torch.sub(x, mean), std)
[7]:
normalize(x) # already unified
[7]:
ivy.array([-1.09422972, -0.46009917, 1.0881108 , 1.86487021, 0.83629996,
-1.10654466, -0.89883457, 0.02893805, 0.15644584, -0.41495672])
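The same result can also be achieved without a decorator by passing the function to ivy.unify directly. A minimal sketch, assuming ivy.unify also accepts the target function as its first positional argument:
[ ]:
# non-decorator form: pass the function to ivy.unify directly
# (assumed signature: ivy.unify(fn, source=..., args=...))
def normalize_torch(x):
    mean = torch.mean(x)
    std = torch.std(x)
    return torch.div(torch.sub(x, mean), std)

normalize_unified = ivy.unify(normalize_torch, source="torch")  # lazy
normalize_unified(x)  # unification happens on this first call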
Compile#
In the example below, the ivy.compile function is also called as a decorator. (Note that normalize is now an Ivy function! Once compiled, it returns native tensors, in this case tf.Tensor, since the backend is tensorflow.)
[12]:
@ivy.compile
def normalize(x):
    mean = ivy.mean(x)
    std = ivy.std(x, correction=1)
    return ivy.divide(ivy.subtract(x, mean), std)
[13]:
normalize(x) # compilation happens here
[13]:
<tf.Tensor: shape=(10,), dtype=float64, numpy=
array([-1.09422972, -0.46009917, 1.0881108 , 1.86487021, 0.83629996,
-1.10654466, -0.89883457, 0.02893805, 0.15644584, -0.41495672])>
Likewise, the function can still be called either eagerly or lazily when used as a decorator. The example above is lazy, whereas the example below is eager:
[14]:
@ivy.compile(args=(x,))
def normalize(x):
    mean = ivy.mean(x)
    std = ivy.std(x, correction=1)
    return ivy.divide(ivy.subtract(x, mean), std)
[15]:
normalize(x) # already compiled
[15]:
<tf.Tensor: shape=(10,), dtype=float64, numpy=
array([-1.09422972, -0.46009917, 1.0881108 , 1.86487021, 0.83629996,
-1.10654466, -0.89883457, 0.02893805, 0.15644584, -0.41495672])>
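To get a feel for what compilation buys you, you can time the compiled function against an uncompiled Ivy version. The sketch below is illustrative only; absolute timings depend on your machine and backend:
[ ]:
import time

# an uncompiled ivy version of the same function, for comparison
def normalize_uncompiled(x):
    mean = ivy.mean(x)
    std = ivy.std(x, correction=1)
    return ivy.divide(ivy.subtract(x, mean), std)

start = time.perf_counter()
for _ in range(1000):
    normalize_uncompiled(x)
print("uncompiled:", time.perf_counter() - start)

start = time.perf_counter()
for _ in range(1000):
    normalize(x)  # the compiled function from above
print("compiled:  ", time.perf_counter() - start)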
Transpile#
In the example below, the ivy.transpile function is called as a decorator.
[16]:
@ivy.transpile(source="torch", to="tensorflow")
def normalize(x):
    mean = torch.mean(x)
    std = torch.std(x)
    return torch.div(torch.sub(x, mean), std)
[17]:
normalize(x) # transpilation happens here
[17]:
<tf.Tensor: shape=(10,), dtype=float64, numpy=
array([-1.09422972, -0.46009917, 1.0881108 , 1.86487021, 0.83629996,
-1.10654466, -0.89883457, 0.02893805, 0.15644584, -0.41495672])>
The function can still be called either eagerly or lazily when used as a decorator. The example above is lazy, whereas the example below is eager:
[18]:
@ivy.transpile(source="torch", to="tensorflow", args=(x,))
def normalize(x):
    mean = torch.mean(x)
    std = torch.std(x)
    return torch.div(torch.sub(x, mean), std)
[19]:
normalize(x) # already transpiled
[19]:
<tf.Tensor: shape=(10,), dtype=float64, numpy=
array([-1.09422972, -0.46009917, 1.0881108 , 1.86487021, 0.83629996,
-1.10654466, -0.89883457, 0.02893805, 0.15644584, -0.41495672])>
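Since the transpiled function now runs as native tensorflow code, its outputs can be fed straight into regular tf operations. A small sanity check (the mean has been subtracted, so the normalized vector should sum to roughly zero):
[ ]:
out = normalize(x)         # output is a native tf.Tensor
print(type(out))           # a tensorflow EagerTensor
print(tf.reduce_sum(out))  # ~0, since the mean was subtracted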
Round Up#
That’s it, you now know how ivy.unify, ivy.compile and ivy.transpile can all be used as function decorators! Next, we’ll start exploring the transpilation of more involved objects, beginning with libraries 📚