Neural Network Programming - Deep Learning with PyTorch


Tensors for Deep Learning - Broadcasting and Element-wise Operations with PyTorch

October 8, 2018

Element-wise tensor operations for deep learning

Welcome back to this series on neural network programming. In this post, we’ll be expanding our knowledge beyond reshaping operations by learning about element-wise operations.


Without further ado, let’s get started.

  • Reshaping operations
  • Element-wise operations
  • Reduction operations
  • Access operations

What does element-wise mean?

Element-wise operations are extremely common operations with tensors in neural network programming. Let’s lead this discussion off with a definition of an element-wise operation.

An element-wise operation is an operation between two tensors that operates on corresponding elements within the respective tensors.

An element-wise operation operates on corresponding elements between tensors.

Two elements are said to be corresponding if the two elements occupy the same position within the tensor. The position is determined by the indexes used to locate each element.

Suppose we have the following two tensors:

> t1 = torch.tensor([
    [1,2],
    [3,4]
], dtype=torch.float32)

> t2 = torch.tensor([
    [9,8],
    [7,6]
], dtype=torch.float32)

Both of these tensors are rank-2 tensors with a shape of 2 x 2.

This means that we have two axes that both have a length of two elements each. The elements of the first axis are arrays and the elements of the second axis are numbers.

# Example of the first axis
> print(t1[0])
tensor([1., 2.])


# Example of the second axis
> print(t1[0][0])
tensor(1.)

This is the kind of thing we're used to seeing in this series by now. Alright, let's build on this.

We know that two elements are said to be corresponding if the two elements occupy the same position within the tensor, and the position is determined by the indexes used to locate each element. Let’s see an example of corresponding elements.

> t1[0][0]
tensor(1.)


> t2[0][0]
tensor(9.)

This allows us to see that the corresponding element for the 1 in t1 is the 9 in t2.

The correspondence is defined by the indexes. This is important because it reveals an important feature of element-wise operations. We can deduce that tensors must have the same number of elements in order to perform an element-wise operation.

We’ll go ahead and make this statement more restrictive. Two tensors must have the same shape in order to perform element-wise operations on them.

Addition is an element-wise operation

Let's look at our first element-wise operation, addition. Don't worry, it will get more interesting.

> t1 + t2
tensor([[10., 10.],
        [10., 10.]])

This allows us to see that addition between tensors is an element-wise operation. Each pair of elements in corresponding locations is added together to produce a new tensor of the same shape.

So, addition is an element-wise operation, and in fact, all the arithmetic operations, add, subtract, multiply, and divide are element-wise operations.

Arithmetic operations are element-wise operations

Arithmetic operations using scalar values are operations we commonly see with tensors. There are two ways we can write them:

(1) Using these symbolic operations:

> print(t1 + 2)
tensor([[3., 4.],
        [5., 6.]])

> print(t1 - 2)
tensor([[-1.,  0.],
        [ 1.,  2.]])

> print(t1 * 2)
tensor([[2., 4.],
        [6., 8.]])

> print(t1 / 2)
tensor([[0.5000, 1.0000],
        [1.5000, 2.0000]])

or equivalently, (2) these built-in tensor object methods:

> print(t1.add(2))
tensor([[3., 4.],
        [5., 6.]])

> print(t1.sub(2))
tensor([[-1.,  0.],
        [ 1.,  2.]])

> print(t1.mul(2))
tensor([[2., 4.],
        [6., 8.]])

> print(t1.div(2))
tensor([[0.5000, 1.0000],
        [1.5000, 2.0000]])

Both of these options work the same. We can see that in both cases, the scalar value, 2, is applied to each element with the corresponding arithmetic operation.

Something seems to be wrong here. These examples are breaking the rule we established that said element-wise operations operate on tensors of the same shape.

Scalar values are rank-0 tensors, which means they have no axes (their shape is empty), while our tensor t1 is a rank-2 tensor of shape 2 x 2.
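We can verify this with a quick check (a sketch; here torch.tensor(2) stands in for the scalar):

```python
import torch

scalar = torch.tensor(2)                 # rank-0 tensor
t1 = torch.tensor([[1., 2.], [3., 4.]])  # rank-2 tensor

print(scalar.dim())   # 0 axes: shape is torch.Size([])
print(t1.dim())       # 2 axes: shape is torch.Size([2, 2])
```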

So how does this fit in? Let’s break it down.

The first solution that may come to mind is that the operation is simply using the single scalar value and operating on each element within the tensor.

This logic kind of works. However, it's a bit misleading, and it breaks down in more general situations where we're not using a scalar.

To think about these operations differently, we need to introduce the concept of tensor broadcasting, or simply broadcasting.

Broadcasting tensors

Broadcasting describes how tensors with different shapes are treated during element-wise operations.

Broadcasting is the concept whose implementation allows us to add scalars to higher dimensional tensors.

Let's think about the t1 + 2 operation. Here, the scalar-valued tensor is being broadcast to the shape of t1, and then the element-wise operation is carried out.

We can see what the broadcasted scalar value looks like using the broadcast_to() NumPy function:

> np.broadcast_to(2, t1.shape)
array([[2, 2],
       [2, 2]])

This means the scalar value is transformed into a rank-2 tensor just like t1, and just like that, the shapes match and the element-wise rule of having the same shape is back in play. This is all under the hood of course.

This piece of code paints the picture, so to speak. This

> t1 + 2
tensor([[3., 4.],
        [5., 6.]])

is really this:

> t1 + torch.tensor(
    np.broadcast_to(2, t1.shape)
    ,dtype=torch.float32
)
tensor([[3., 4.],
        [5., 6.]])

At this point, you may be thinking that this seems convoluted.

Trickier example of broadcasting

Let's look at a trickier example to hit this point home. Suppose we have the following two tensors.

t1 = torch.tensor([
    [1,1],
    [1,1]
], dtype=torch.float32)

t2 = torch.tensor([2,4], dtype=torch.float32)

What will be the result of this element-wise addition operation? Is it even possible given the same shape rule for element-wise operations?

# t1 + t2 ???????

> t1.shape
torch.Size([2, 2])

> t2.shape
torch.Size([2])

Even though these two tensors have differing shapes, the element-wise operation is possible, and broadcasting is what makes it possible. The lower-rank tensor t2 will be transformed via broadcasting to match the shape of the higher-rank tensor t1, and the element-wise operation will be performed as usual.

The concept of broadcasting is the key to understanding how this operation will be carried out. As before, we can check the broadcast transformation using the broadcast_to() NumPy function.

> np.broadcast_to(t2.numpy(), t1.shape)
array([[2., 4.],
       [2., 4.]], dtype=float32)

> t1 + t2
tensor([[3., 5.],
        [3., 5.]])

After broadcasting, the addition operation between these two tensors is a regular element-wise operation between tensors of the same shape.
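The general rule (the same one NumPy uses) is that shapes are compared from the trailing axis backward: two axis lengths are compatible when they are equal or one of them is 1, and missing leading axes are treated as length 1. A small sketch of both cases:

```python
import torch

t1 = torch.ones(2, 2)

# [2, 2] vs [2]: row gains a leading axis and repeats down the rows
row = torch.tensor([2., 4.])
print(t1 + row)   # tensor([[3., 5.], [3., 5.]])

# [2, 2] vs [2, 1]: the length-1 axis stretches across the columns
col = torch.tensor([[2.], [4.]])
print(t1 + col)   # tensor([[3., 3.], [5., 5.]])
```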


Broadcasting is a more advanced topic than the basic element-wise operations, so don’t worry if it takes longer to get comfortable with the idea.

Understanding element-wise operations and the same-shape requirement provides a basis for the concept of broadcasting and why it is used.

When do we actually use broadcasting? We often need to use broadcasting when we are preprocessing our data, especially during normalization routines.
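As a concrete example, here's a standard-score normalization sketch (the dataset values are hypothetical) where per-feature statistics of shape [2] broadcast across a batch of shape [4, 2]:

```python
import torch

# Hypothetical dataset: 4 samples with 2 features each
data = torch.tensor([[1., 10.],
                     [2., 20.],
                     [3., 30.],
                     [4., 40.]])

mean = data.mean(dim=0)  # shape [2]: one mean per feature
std = data.std(dim=0)    # shape [2]: one std per feature

# [4, 2] - [2]: the stats broadcast across the sample axis
normalized = (data - mean) / std
print(normalized.mean(dim=0))  # ~0 for each feature
```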

There is a post in the TensorFlow.js series that covers broadcasting in greater detail. There is a practical example, and the algorithm for determining how a particular tensor is broadcast is also covered, so check that out for a deeper discussion on broadcasting.

Don’t worry about not knowing TensorFlow.js. It’s not a requirement, and I highly recommend the content there on broadcasting.

Comparison operations are element-wise

Comparison operations are also element-wise. For a given comparison operation between two tensors, a new tensor of the same shape is returned with each element containing either a 0 or a 1.

  • 0 if the comparison between corresponding elements is False.
  • 1 if the comparison between corresponding elements is True.

Suppose we have the following tensor:

> t = torch.tensor([
    [0,5,0],
    [6,0,7],
    [0,8,0]
], dtype=torch.float32)

Let’s check out some of these comparison operations.

> t.eq(0)
tensor([[1, 0, 1],
        [0, 1, 0],
        [1, 0, 1]], dtype=torch.uint8)


> t.ge(0)
tensor([[1, 1, 1],
        [1, 1, 1],
        [1, 1, 1]], dtype=torch.uint8)


> t.gt(0)
tensor([[0, 1, 0],
        [1, 0, 1],
        [0, 1, 0]], dtype=torch.uint8)


> t.lt(0)
tensor([[0, 0, 0],
        [0, 0, 0],
        [0, 0, 0]], dtype=torch.uint8)

> t.le(7)
tensor([[1, 1, 1],
        [1, 1, 1],
        [1, 0, 1]], dtype=torch.uint8)

Thinking about these operations from a broadcasting perspective, we can see that the last one, t.le(7), is really this:

> t <= torch.tensor(
    np.broadcast_to(7, t.shape)
    ,dtype=torch.float32
)

tensor([[1, 1, 1],
        [1, 1, 1],
        [1, 0, 1]], dtype=torch.uint8)

and equivalently this:

> t <= torch.tensor([
    [7,7,7],
    [7,7,7],
    [7,7,7]
], dtype=torch.float32)

tensor([[1, 1, 1],
        [1, 1, 1],
        [1, 0, 1]], dtype=torch.uint8)

Element-wise operations using functions

With element-wise operations that are functions, it’s fine to assume that the function is applied to each element of the tensor.

Here are some examples:

> t.abs() 
tensor([[0., 5., 0.],
        [6., 0., 7.],
        [0., 8., 0.]])


> t.sqrt()
tensor([[0.0000, 2.2361, 0.0000],
        [2.4495, 0.0000, 2.6458],
        [0.0000, 2.8284, 0.0000]])

> t.neg()
tensor([[-0., -5., -0.],
        [-6., -0., -7.],
        [-0., -8., -0.]])

> t.neg().abs()
tensor([[0., 5., 0.],
        [6., 0., 7.],
        [0., 8., 0.]])

Some terminology

There are some other ways to refer to element-wise operations, so I just wanted to mention that all of these mean the same thing:

  • Element-wise
  • Component-wise
  • Point-wise

Just keep this in mind if you encounter any of these terms in the wild.

Wrapping up

Now, we should have a good understanding of element-wise operations and how they are applied to tensor operations for neural networks and deep learning. In the next post, we will be covering the last two categories of tensor operations:

  • Reshaping operations
  • Element-wise operations
  • Reduction operations
  • Access operations

See you in the next one!
