```
!git clone https://github.com/unifyai/ivy.git
!cd ivy && git checkout d6bc18c64a47a135fe18404d9f83f98d9f3b63cf && python3 -m pip install --user -e .
```

# How to use decorators

Learn about the different ways to use compilation and transpilation functions.

⚠️ If you are running this notebook in Colab, you will have to install `Ivy` and some dependencies manually. You can do so by running the cell at the top of this notebook ⬆️

If you want to run the notebook locally but don’t have Ivy installed just yet, you can check out the Setting Up section of the docs.

For the installed packages to be available you will have to restart your kernel. In Colab, you can do this by clicking on **“Runtime > Restart Runtime”**. Once the runtime has been restarted you should skip the previous cell 😄

To use the compiler and the transpiler now you will need an API Key. If you already have one, you should replace the string in the next cell.

```
API_KEY = "PASTE_YOUR_KEY_HERE"
```

```
!mkdir -p .ivy
!echo -n $API_KEY > .ivy/key.pem
```

## Unify

Firstly, let’s create the dummy `numpy` arrays as before:

```
# import numpy
import numpy as np

# create random numpy arrays for testing
x = np.random.uniform(size=10)
mean = np.mean(x)
std = np.std(x)
```

Let’s assume that our target framework is `tensorflow`:

```
import ivy
import tensorflow as tf

ivy.set_backend("tensorflow")
x = tf.constant(x)
```

In the example below, the `ivy.unify` function is called as a decorator.

```
import torch

@ivy.unify(source="torch")
def normalize(x):
    mean = torch.mean(x)
    std = torch.std(x)
    return torch.div(torch.sub(x, mean), std)
```

```
# unification happens here
normalize(x)
```

```
ivy.array([-1.09422972, -0.46009917, 1.0881108 , 1.86487021, 0.83629996,
-1.10654466, -0.89883457, 0.02893805, 0.15644584, -0.41495672])
```

The function can still be called either *eagerly* or *lazily* when calling as a decorator. The example above is *lazy*, whereas the example below is *eager*:

```
@ivy.unify(source="torch", args=(x,))
def normalize(x):
    mean = torch.mean(x)
    std = torch.std(x)
    return torch.div(torch.sub(x, mean), std)
```

```
# already unified
normalize(x)
```

```
ivy.array([-1.09422972, -0.46009917, 1.0881108 , 1.86487021, 0.83629996,
-1.10654466, -0.89883457, 0.02893805, 0.15644584, -0.41495672])
```
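The lazy/eager split is just a matter of *when* the transformation runs: lazily on the first call, or eagerly at decoration time once sample `args` are supplied. As a rough sketch of this general decorator pattern (the `fake_unify` name and its internals are purely illustrative and are **not** Ivy's implementation):

```python
import functools

def fake_unify(fn=None, args=None):
    # Hypothetical stand-in for a lazy/eager decorator -- NOT Ivy's
    # implementation, just the general pattern it follows.
    def transform(f, sample_args):
        # Pretend to trace/convert the function using sample inputs.
        f(*sample_args)
        return f

    def decorator(f):
        if args is not None:
            # Eager: transform right away, at decoration time.
            return transform(f, args)

        @functools.wraps(f)
        def lazy(*call_args):
            # Lazy: transform on the first real call, then remember it.
            if not lazy.transformed:
                transform(f, call_args)
                lazy.transformed = True
            return f(*call_args)

        lazy.transformed = False
        return lazy

    # Support both bare @fake_unify and @fake_unify(args=...).
    return decorator if fn is None else decorator(fn)

@fake_unify  # lazy: nothing happens yet
def double(x):
    return 2 * x

@fake_unify(args=(1,))  # eager: transformed at decoration time
def triple(x):
    return 3 * x

assert double.transformed is False  # not transformed until first call
assert double(3) == 6               # first call triggers the transform
assert double.transformed is True
```

Either way the wrapped function behaves identically afterwards; the eager form simply pays the transformation cost up front.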

## Compile

In the example below, the `ivy.compile` function is also called as a decorator. (Note that this is now an Ivy function!)

```
@ivy.compile
def normalize(x):
    mean = ivy.mean(x)
    std = ivy.std(x, correction=1)
    return ivy.divide(ivy.subtract(x, mean), std)
```

```
# compilation happens here
normalize(x)
```

```
<tf.Tensor: shape=(10,), dtype=float64, numpy=
array([-1.09422972, -0.46009917, 1.0881108 , 1.86487021, 0.83629996,
-1.10654466, -0.89883457, 0.02893805, 0.15644584, -0.41495672])>
```

Likewise, the function can still be called either *eagerly* or *lazily* when calling as a decorator. The example above is *lazy*, whereas the example below is *eager*:

```
@ivy.compile(args=(x,))
def normalize(x):
= ivy.mean(x)
mean = ivy.std(x, correction=1)
std return ivy.divide(ivy.subtract(x, mean), std)
```

```
# already compiled
normalize(x)
```

```
<tf.Tensor: shape=(10,), dtype=float64, numpy=
array([-1.09422972, -0.46009917, 1.0881108 , 1.86487021, 0.83629996,
-1.10654466, -0.89883457, 0.02893805, 0.15644584, -0.41495672])>
```
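One detail worth noting in the compiled version: `correction=1` requests the unbiased (N − 1 denominator) standard deviation, which matches `torch.std`'s default, whereas NumPy's `np.std` defaults to the biased (N denominator) estimator. A quick check of the two conventions:

```python
import numpy as np

x = np.random.uniform(size=10)

biased = np.std(x)            # divides by N (NumPy's default, ddof=0)
unbiased = np.std(x, ddof=1)  # divides by N - 1 (torch.std's default)

# The unbiased estimate is larger by a factor of sqrt(N / (N - 1)).
assert np.isclose(unbiased, biased * np.sqrt(10 / 9))
```

This is why the Ivy version of `normalize` passes `correction=1`: it keeps the numerical results aligned with the original `torch.std`-based function.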

## Transpile

In the example below, the `ivy.transpile` function is called as a decorator.

```
@ivy.transpile(source="torch", to="tensorflow")
def normalize(x):
    mean = torch.mean(x)
    std = torch.std(x)
    return torch.div(torch.sub(x, mean), std)
```

```
# transpilation happens here
normalize(x)
```

```
<tf.Tensor: shape=(10,), dtype=float64, numpy=
array([-1.09422972, -0.46009917, 1.0881108 , 1.86487021, 0.83629996,
-1.10654466, -0.89883457, 0.02893805, 0.15644584, -0.41495672])>
```

The function can still be called either *eagerly* or *lazily* when calling as a decorator. The example above is *lazy*, whereas the example below is *eager*:

```
@ivy.transpile(source="torch", to="tensorflow", args=(x,))
def normalize(x):
    mean = torch.mean(x)
    std = torch.std(x)
    return torch.div(torch.sub(x, mean), std)
```

```
# already transpiled
normalize(x)
```

```
<tf.Tensor: shape=(10,), dtype=float64, numpy=
array([-1.09422972, -0.46009917, 1.0881108 , 1.86487021, 0.83629996,
-1.10654466, -0.89883457, 0.02893805, 0.15644584, -0.41495672])>
```

## Round Up

That’s it, you now know how `ivy.unify`, `ivy.compile` and `ivy.transpile` can all be used as function decorators! Next, we’ll start exploring the transpilation of more involved objects, beginning with libraries 📚