Mish() and SiLU() in PyTorch


*Memos:

  • My post explains GELU, Mish, SiLU and Softplus.
  • My post explains heaviside() and ReLU().
  • My post explains LeakyReLU() and PReLU().
  • My post explains ELU() and SELU().
  • My post explains CELU() and GELU().


Mish() can get a 0D or more D tensor of zero or more values computed by the Mish function from a 0D or more D tensor of zero or more elements, as shown below:

*Memos:

  • The 1st argument for initialization is inplace (Optional-Default:False-Type:bool). *Memos:
    • It does the operation in place, overwriting the input tensor, as shown below.
    • Keep it False because True is problematic.
  • The 1st argument is input (Required-Type:tensor of float).

Definition: Mish(x) = x * tanh(softplus(x)) = x * tanh(ln(1 + exp(x)))
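
As a quick check of the definition above, Mish() can be reproduced manually with tanh() and softplus() (a minimal sketch, not PyTorch's internal implementation):

import torch

my_tensor = torch.tensor([8., -3., 0., 1.])

# x * tanh(softplus(x))
my_tensor * torch.tanh(torch.nn.functional.softplus(my_tensor))
# tensor([8.0000, -0.1456, 0.0000, 0.8651])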

import torch
from torch import nn

my_tensor = torch.tensor([8., -3., 0., 1., 5., -2., -1., 4.])

mish = nn.Mish()
mish(input=my_tensor)
# tensor([8.0000, -0.1456, 0.0000, 0.8651, 4.9996, -0.2525, -0.3034, 3.9974])

mish
# Mish()

mish.inplace
# False

mish = nn.Mish(inplace=True)
mish(input=my_tensor)
# tensor([8.0000, -0.1456, 0.0000, 0.8651, 4.9996, -0.2525, -0.3034, 3.9974])
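
With inplace=True, the result is written into the input tensor itself, so my_tensor is now overwritten (this is why keeping inplace=False is safer):

my_tensor
# tensor([8.0000, -0.1456, 0.0000, 0.8651, 4.9996, -0.2525, -0.3034, 3.9974])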

my_tensor = torch.tensor([[8., -3., 0., 1.],
                          [5., -2., -1., 4.]])
mish = nn.Mish()
mish(input=my_tensor)
# tensor([[8.0000, -0.1456, 0.0000, 0.8651],
#         [4.9996, -0.2525, -0.3034, 3.9974]])

my_tensor = torch.tensor([[[8., -3.], [0., 1.]],
                          [[5., -2.], [-1., 4.]]])
mish = nn.Mish()
mish(input=my_tensor)
# tensor([[[8.0000, -0.1456], [0.0000, 0.8651]],
#         [[4.9996, -0.2525], [-0.3034, 3.9974]]])
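
In practice, Mish() is typically used as an activation layer inside a model. A minimal sketch (the layer sizes here are made up for illustration):

import torch
from torch import nn

model = nn.Sequential(
    nn.Linear(in_features=4, out_features=8),  # hypothetical sizes
    nn.Mish(),
    nn.Linear(in_features=8, out_features=2)
)

model(torch.tensor([8., -3., 0., 1.]))
# The output values depend on the randomly initialized weights.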

SiLU() can get a 0D or more D tensor of zero or more values computed by the SiLU function from a 0D or more D tensor of zero or more elements, as shown below:

*Memos:

  • The 1st argument for initialization is inplace (Optional-Default:False-Type:bool). *Memos:
    • It does the operation in place, overwriting the input tensor (as shown for Mish() above).
    • Keep it False because True is problematic.
  • The 1st argument is input (Required-Type:tensor of float).

Definition: SiLU(x) = x * sigmoid(x) = x / (1 + exp(-x))
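
As a quick check of the definition above, SiLU() can be reproduced manually with sigmoid() (a minimal sketch, not PyTorch's internal implementation):

import torch

my_tensor = torch.tensor([8., -3., 0., 1.])

# x * sigmoid(x)
my_tensor * torch.sigmoid(my_tensor)
# tensor([7.9973, -0.1423, 0.0000, 0.7311])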

import torch
from torch import nn

my_tensor = torch.tensor([8., -3., 0., 1., 5., -2., -1., 4.])

silu = nn.SiLU()
silu(input=my_tensor)
# tensor([7.9973, -0.1423, 0.0000, 0.7311, 4.9665, -0.2384, -0.2689, 3.9281])

silu
# SiLU()

silu.inplace
# False

silu = nn.SiLU(inplace=True)
silu(input=my_tensor)
# tensor([7.9973, -0.1423, 0.0000, 0.7311, 4.9665, -0.2384, -0.2689, 3.9281])

my_tensor = torch.tensor([[8., -3., 0., 1.],
                          [5., -2., -1., 4.]])
silu = nn.SiLU()
silu(input=my_tensor)
# tensor([[7.9973, -0.1423, 0.0000, 0.7311],
#         [4.9665, -0.2384, -0.2689, 3.9281]])

my_tensor = torch.tensor([[[8., -3.], [0., 1.]],
                          [[5., -2.], [-1., 4.]]])
silu = nn.SiLU()
silu(input=my_tensor)
# tensor([[[7.9973, -0.1423], [0.0000, 0.7311]],
#         [[4.9665, -0.2384], [-0.2689, 3.9281]]])
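
Both activations are also available as plain functions in torch.nn.functional, so they can be applied without instantiating a module:

import torch
from torch import nn

my_tensor = torch.tensor([8., -3., 0., 1.])

nn.functional.mish(my_tensor)
# tensor([8.0000, -0.1456, 0.0000, 0.8651])

nn.functional.silu(my_tensor)
# tensor([7.9973, -0.1423, 0.0000, 0.7311])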

