GELU() and Mish() in PyTorch


*Memos:

  • My post explains ELU, SELU and CELU.
  • My post explains heaviside() and Identity().
  • My post explains ReLU() and LeakyReLU().
  • My post explains PReLU() and ELU().
  • My post explains SELU() and CELU().
  • My post explains SiLU()…



GELU() takes a 0D or more-D tensor of zero or more elements and returns a 0D or more-D tensor of the values computed by the GELU function, as shown below:

*Memos:

  • The 1st argument for initialization is approximate (Optional-Default:'none'-Type:str). *Memos:
    • Either 'none' or 'tanh' can be selected.
    • The results of 'none' and 'tanh' are almost the same (see the sketch after this list).
  • The 1st argument of the call is input (Required-Type:tensor of float).
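
For reference, GELU(x) = x * Φ(x), where Φ is the cumulative distribution function of the standard normal distribution; with approximate='tanh' it is estimated as 0.5 * x * (1 + tanh(√(2/π) * (x + 0.044715 * x³))). Below is a minimal sketch (not part of the original post; the printed difference may vary slightly) that checks both formulas against nn.GELU():

import torch
from torch import nn

x = torch.tensor([8., -3., 0., 1., 5., -2., -1., 4.])

exact = x * 0.5 * (1.0 + torch.erf(x / 2.0 ** 0.5))  # x * Φ(x)
approx = 0.5 * x * (1.0 + torch.tanh(
    (2.0 / torch.pi) ** 0.5 * (x + 0.044715 * x ** 3)))  # 'tanh' estimate

torch.allclose(exact, nn.GELU()(x))
# True

torch.allclose(approx, nn.GELU(approximate='tanh')(x))
# True

(exact - approx).abs().max()
# tensor(0.0004) <- the two variants differ only slightly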

[Image: graph of the GELU function]

import torch
from torch import nn

my_tensor = torch.tensor([8., -3., 0., 1., 5., -2., -1., 4.])

gelu = nn.GELU()
gelu(input=my_tensor)
# tensor([8.0000e+00, -4.0499e-03, 0.0000e+00, 8.4134e-01,
#         5.0000e+00, -4.5500e-02, -1.5866e-01, 3.9999e+00])

gelu
# GELU(approximate='none')

gelu.approximate
# 'none'

gelu = nn.GELU(approximate='tanh')
gelu(input=my_tensor)
# tensor([8.0000e+00, -3.6374e-03, 0.0000e+00, 8.4119e-01,
#         5.0000e+00, -4.5402e-02, -1.5881e-01, 3.9999e+00])

my_tensor = torch.tensor([[8., -3., 0., 1.],
                          [5., -2., -1., 4.]])
gelu = nn.GELU()
gelu(input=my_tensor)
# tensor([[8.0000e+00, -4.0499e-03, 0.0000e+00, 8.4134e-01],
#         [5.0000e+00, -4.5500e-02, -1.5866e-01, 3.9999e+00]])

my_tensor = torch.tensor([[[8., -3.], [0., 1.]],
                          [[5., -2.], [-1., 4.]]])
gelu = nn.GELU()
gelu(input=my_tensor)
# tensor([[[8.0000e+00, -4.0499e-03], [0.0000e+00, 8.4134e-01]],
#         [[5.0000e+00, -4.5500e-02], [-1.5866e-01, 3.9999e+00]]])

Mish() takes a 0D or more-D tensor of zero or more elements and returns a 0D or more-D tensor of the values computed by the Mish function, as shown below:

*Memos:

  • The 1st argument for initialization is inplace (Optional-Default:False-Type:bool). *Memos:
    • If True, the computation is done in place, overwriting the input tensor.
    • Keep it False because in-place computation can cause errors (e.g. during backpropagation).
  • The 1st argument of the call is input (Required-Type:tensor of float).
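
For reference, Mish(x) = x * tanh(softplus(x)), where softplus(x) = ln(1 + exp(x)). Below is a minimal sketch (not part of the original post) that checks this formula against nn.Mish():

import torch
from torch import nn
import torch.nn.functional as F

x = torch.tensor([8., -3., 0., 1., 5., -2., -1., 4.])

manual = x * torch.tanh(F.softplus(x))  # x * tanh(ln(1 + exp(x)))

torch.allclose(manual, nn.Mish()(x))
# True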

[Image: graph of the Mish function]

import torch
from torch import nn

my_tensor = torch.tensor([8., -3., 0., 1., 5., -2., -1., 4.])

mish = nn.Mish()
mish(input=my_tensor)
# tensor([8.0000, -0.1456, 0.0000, 0.8651, 4.9996, -0.2525, -0.3034, 3.9974])

mish
# Mish()

mish.inplace
# False

mish = nn.Mish(inplace=True)
mish(input=my_tensor)
# tensor([8.0000, -0.1456, 0.0000, 0.8651, 4.9996, -0.2525, -0.3034, 3.9974])
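my_tensor
# tensor([8.0000, -0.1456, 0.0000, 0.8651, 4.9996, -0.2525, -0.3034, 3.9974])
# <- my_tensor itself was overwritten because inplace=True.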

my_tensor = torch.tensor([[8., -3., 0., 1.],
                          [5., -2., -1., 4.]])
mish = nn.Mish()
mish(input=my_tensor)
# tensor([[8.0000, -0.1456, 0.0000, 0.8651],
#         [4.9996, -0.2525, -0.3034, 3.9974]])

my_tensor = torch.tensor([[[8., -3.], [0., 1.]],
                          [[5., -2.], [-1., 4.]]])
mish = nn.Mish()
mish(input=my_tensor)
# tensor([[[8.0000, -0.1456], [0.0000, 0.8651]],
#         [[4.9996, -0.2525], [-0.3034, 3.9974]]])


This content originally appeared on DEV Community and was authored by Super Kai (Kazuya Ito)

