Activation functions in PyTorch (1)


*Memos:

My post explains heaviside() and Identity().

My post explains ReLU() and LeakyReLU().

My post explains Leaky ReLU, PReLU and FReLU.

My post explains ELU, SELU and CELU.

My post explains GELU, Mish, SiLU and Softplus.




An activation function is a function or layer that enables a neural network to learn complex (non-linear) relationships by transforming the output of the previous layer. *Without activation functions, a neural network can only learn linear relationships.
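As a minimal sketch (the layer sizes and sample batch below are arbitrary choices for illustration), an activation function sits between layers like this:

```python
import torch
from torch import nn

# The activation (here ReLU) transforms the previous Linear layer's output,
# letting the stack learn non-linear relationships. Without it, the two
# Linear layers would collapse into a single linear transformation.
model = nn.Sequential(
    nn.Linear(4, 8),
    nn.ReLU(),
    nn.Linear(8, 1),
)

x = torch.randn(2, 4)  # a batch of 2 samples with 4 features each
print(model(x).shape)  # torch.Size([2, 1])
```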

(1) Step function:

  • can convert an input value (x) to 0 or 1. *If x < 0, the output is 0; if x >= 0, the output is 1.
  • is also called Binary step function, Unit step function, Binary threshold function, Threshold function, Heaviside step function or Heaviside function.
  • is heaviside() in PyTorch, as shown in the code example after this list.
  • 's pros:
    • It's simple, only expressing the two values 0 and 1.
    • It avoids the Exploding Gradient Problem.
  • 's cons:
    • It's rarely used in Deep Learning because it has more cons than other activation functions.
    • It can only express the two values 0 and 1, so the created model has bad accuracy, predicting inaccurately. *Activation functions that can express a wider range of values can create a model of good accuracy, predicting accurately.
    • It causes the Dying ReLU Problem. *Its gradient is 0 everywhere it is defined, so affected neurons stop learning.
    • It's non-differentiable at x = 0. *The gradient of the step function doesn't exist at x = 0 during Backpropagation, which differentiates to calculate gradients.
  • 's graph in Desmos:

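For example (the sample values below are arbitrary), heaviside() takes a second tensor that sets the output at exactly x = 0, so passing 1.0 matches the convention above:

```python
import torch

x = torch.tensor([-2.0, -0.5, 0.0, 0.5, 2.0])

# torch.heaviside(input, values): `values` supplies the output where input == 0.
y = torch.heaviside(x, torch.tensor(1.0))
print(y)  # tensor([0., 0., 1., 1., 1.])
```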

(2) Identity:

  • just returns the input value (x) unchanged, without any conversion.
  • 's formula is y = x.
  • is also called Linear function.
  • is Identity() in PyTorch, as shown in the code example after this list.
  • 's pros:
    • It's simple, just returning the same value as the input value.
  • 's cons:
    • It's linear, so it adds no non-linearity; layers connected only by Identity collapse into a single linear transformation.
  • 's graph in Desmos:

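For example (the sample tensor is arbitrary), Identity() just passes its input through:

```python
import torch
from torch import nn

identity = nn.Identity()  # any constructor arguments are ignored

x = torch.tensor([-2.0, 0.0, 3.5])
print(identity(x))  # tensor([-2.0000,  0.0000,  3.5000]) -- returned unchanged
```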

(3) ReLU(Rectified Linear Unit):

  • converts an input value (x) to 0 or x. *If x < 0, the output is 0; if 0 <= x, the output is x.
  • 's formula is y = max(0, x).
  • is ReLU() in PyTorch, as shown in the code example after this list.
  • is used in:
    • Binary Classification Model.
    • Multi-Class Classification Model.
    • CNN(Convolutional Neural Network).
    • RNN(Recurrent Neural Network). *RNN in PyTorch.
    • Transformer. *Transformer() in PyTorch.
    • NLP(Natural Language Processing) based on RNN.
    • GAN(Generative Adversarial Network).
  • 's pros:
    • It mitigates the Vanishing Gradient Problem.
  • 's cons:
    • It causes the Dying ReLU Problem. *If x < 0, the gradient is 0, so affected neurons can stop updating.
    • It's non-differentiable at x = 0.
  • 's graph in Desmos:

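For example (the sample values are arbitrary), ReLU() can be used as a module or via the functional torch.relu(); in practice PyTorch returns 0 as the gradient at x = 0, which is one common way to handle the non-differentiability noted above:

```python
import torch
from torch import nn

relu = nn.ReLU()
x = torch.tensor([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))        # tensor([0.0000, 0.0000, 0.0000, 0.5000, 2.0000])
print(torch.relu(x))  # functional form, same result

# Backpropagation through ReLU: the gradient is 1 for x > 0 and 0 otherwise,
# with 0 used at x = 0.
x = torch.tensor([-1.0, 0.0, 2.0], requires_grad=True)
torch.relu(x).sum().backward()
print(x.grad)  # tensor([0., 0., 1.])
```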

