This content originally appeared on DEV Community and was authored by Super Kai (Kazuya Ito)
*Memos:
- My post explains GELU, Mish, SiLU and Softplus.
- My post explains heaviside() and ReLU().
- My post explains LeakyReLU() and PReLU().
- My post explains ELU() and SELU().
- My post explains CELU() and GELU().
Mish() returns the 0D or more D tensor of the zero or more values computed by the Mish function from the 0D or more D tensor of zero or more float elements, as shown below:
*Memos:
- The 1st argument for initialization is inplace (Optional-Default:False-Type:bool): *Memos:
  - It does in-place operation.
  - Keep it False because it's problematic with True.
- The 1st argument is input (Required-Type:tensor of float).
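For reference, Mish is defined as Mish(x) = x * tanh(softplus(x)) with softplus(x) = ln(1 + exp(x)), so it can be reproduced with basic tensor ops. The sketch below is a sanity check (using the same sample tensor as the examples that follow); it should match nn.Mish():
import torch
from torch import nn

x = torch.tensor([8., -3., 0., 1., 5., -2., -1., 4.])

# Manual Mish: x * tanh(softplus(x))
x * torch.tanh(nn.functional.softplus(x))
# tensor([8.0000, -0.1456, 0.0000, 0.8651, 4.9996, -0.2525, -0.3034, 3.9974])

nn.Mish()(x) # should give the same values
# tensor([8.0000, -0.1456, 0.0000, 0.8651, 4.9996, -0.2525, -0.3034, 3.9974])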
import torch
from torch import nn
my_tensor = torch.tensor([8., -3., 0., 1., 5., -2., -1., 4.])
mish = nn.Mish()
mish(input=my_tensor)
# tensor([8.0000, -0.1456, 0.0000, 0.8651, 4.9996, -0.2525, -0.3034, 3.9974])
mish
# Mish()
mish.inplace
# False
mish = nn.Mish(inplace=True)
mish(input=my_tensor)
# tensor([8.0000, -0.1456, 0.0000, 0.8651, 4.9996, -0.2525, -0.3034, 3.9974])
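One reason to keep inplace=False: with inplace=True the input tensor itself is overwritten, so after the call above my_tensor no longer holds its original values (a quick check, assuming the call above has just been run):
my_tensor # overwritten in place with the Mish outputs
# tensor([8.0000, -0.1456, 0.0000, 0.8651, 4.9996, -0.2525, -0.3034, 3.9974])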
my_tensor = torch.tensor([[8., -3., 0., 1.],
[5., -2., -1., 4.]])
mish = nn.Mish()
mish(input=my_tensor)
# tensor([[8.0000, -0.1456, 0.0000, 0.8651],
# [4.9996, -0.2525, -0.3034, 3.9974]])
my_tensor = torch.tensor([[[8., -3.], [0., 1.]],
[[5., -2.], [-1., 4.]]])
mish = nn.Mish()
mish(input=my_tensor)
# tensor([[[8.0000, -0.1456], [0.0000, 0.8651]],
# [[4.9996, -0.2525], [-0.3034, 3.9974]]])
SiLU() returns the 0D or more D tensor of the zero or more values computed by the SiLU function from the 0D or more D tensor of zero or more float elements, as shown below:
*Memos:
- The 1st argument for initialization is inplace (Optional-Default:False-Type:bool): *Memos:
  - It does in-place operation.
  - Keep it False because it's problematic with True.
- The 1st argument is input (Required-Type:tensor of float).
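For reference, SiLU (also known as Swish) is defined as SiLU(x) = x * sigmoid(x). The sketch below is a sanity check (using the same sample tensor as the examples that follow); it should match nn.SiLU():
import torch
from torch import nn

x = torch.tensor([8., -3., 0., 1., 5., -2., -1., 4.])

# Manual SiLU: x * sigmoid(x)
x * torch.sigmoid(x)
# tensor([7.9973, -0.1423, 0.0000, 0.7311, 4.9665, -0.2384, -0.2689, 3.9281])

nn.SiLU()(x) # should give the same values
# tensor([7.9973, -0.1423, 0.0000, 0.7311, 4.9665, -0.2384, -0.2689, 3.9281])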
import torch
from torch import nn
my_tensor = torch.tensor([8., -3., 0., 1., 5., -2., -1., 4.])
silu = nn.SiLU()
silu(input=my_tensor)
# tensor([7.9973, -0.1423, 0.0000, 0.7311, 4.9665, -0.2384, -0.2689, 3.9281])
silu
# SiLU()
silu.inplace
# False
silu = nn.SiLU(inplace=True)
silu(input=my_tensor)
# tensor([7.9973, -0.1423, 0.0000, 0.7311, 4.9665, -0.2384, -0.2689, 3.9281])
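As with Mish(), inplace=True overwrites the input tensor itself (a quick check, assuming the call above has just been run):
my_tensor # overwritten in place with the SiLU outputs
# tensor([7.9973, -0.1423, 0.0000, 0.7311, 4.9665, -0.2384, -0.2689, 3.9281])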
my_tensor = torch.tensor([[8., -3., 0., 1.],
[5., -2., -1., 4.]])
silu = nn.SiLU()
silu(input=my_tensor)
# tensor([[7.9973, -0.1423, 0.0000, 0.7311],
# [4.9665, -0.2384, -0.2689, 3.9281]])
my_tensor = torch.tensor([[[8., -3.], [0., 1.]],
[[5., -2.], [-1., 4.]]])
silu = nn.SiLU()
silu(input=my_tensor)
# tensor([[[7.9973, -0.1423], [0.0000, 0.7311]],
# [[4.9665, -0.2384], [-0.2689, 3.9281]]])