Deep Learning Dictionary - Lightweight Crash Course

Deep Learning Course - Level: Beginner

ReLU Activation Function - Deep Learning Dictionary

In a neural network, an activation function applies a nonlinear transformation to the output of a layer.

One of the most widely used activation functions today, called \(\text{ReLU}\) (short for Rectified Linear Unit), transforms its input to the maximum of either \(0\) or the input itself.

For a given value \(x\) passed to \(\text{ReLU}\), we define

\[\text{relu}(x) = \max(0, x)\]

The table below summarizes how \(\text{ReLU}\) transforms its input.

Input                                  ReLU Output
Values less than or equal to \(0\)     \(0\)
Values greater than \(0\)              The input value
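The piecewise definition above can be sketched as a one-line function. This is a minimal illustration in NumPy, not the course's own code:

```python
import numpy as np

def relu(x):
    # Element-wise maximum of 0 and x: negatives become 0,
    # positives pass through unchanged.
    return np.maximum(0, x)

# Negative entries map to 0; positive entries are unchanged.
outputs = relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0]))
```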

When \(\text{ReLU}\) is used as an activation function following a layer in a neural network, it accepts the weighted sum of outputs from the previous layer and transforms this sum to the sum itself if it is positive, or to \(0\) otherwise.


Intuitively, we can think of a given node's activated output (with \(\text{ReLU}\)) as being "more activated" the more positive it is.

\(\text{ReLU}\) is by far the most popular choice of activation function in neural networks today. Several variations of \(\text{ReLU}\) can lead to marginal improvements when training networks, like leaky \(\text{ReLU}\) and parametric \(\text{ReLU}\) (\(\text{PReLU}\)).
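These variants differ from \(\text{ReLU}\) only in how they treat negative inputs: instead of mapping them to \(0\), they scale them by a small slope. A minimal NumPy sketch, with illustrative slope values chosen here for the example:

```python
import numpy as np

def leaky_relu(x, negative_slope=0.01):
    # Leaky ReLU lets a small, fixed fraction of negative inputs through.
    return np.where(x > 0, x, negative_slope * x)

def prelu(x, a):
    # PReLU has the same form, but the slope a is a learned parameter
    # rather than a fixed constant (a placeholder value is used here).
    return np.where(x > 0, x, a * x)
```

Because the negative side is no longer exactly zero, gradients can still flow for negative inputs, which is the motivation behind these variants.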

