stack() in PyTorch

*Memos:

My post explains cat().

My post explains hstack(), vstack(), dstack() and column_stack().

stack() can get the 1D or more D stacked tensor of zero or more elements, which has one additional dimension, from one or more 0D or more D tensors of zero or more elements, as shown below:

*Memos:

  • stack() can be used with torch but not with a tensor.
  • The 1st argument with torch is tensors (Required-Type: tuple or list of tensors of int, float, complex or bool). *All tensors must be the same size (an error example follows the code below).
  • The 2nd argument with torch is dim (Optional-Default: 0-Type: int).
  • There is an out argument with torch (Optional-Type: tensor): *Memos:
    • out= must be used (a short sketch follows this list).
    • My post explains the out argument.
  • A tensor with one more dimension than each input tensor is returned.
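A minimal sketch of the out argument, assuming a preallocated result tensor (the variable out below is just for illustration); its size and dtype must match the stacked result:

import torch

tensor1 = torch.tensor([2, 7, 4])
tensor2 = torch.tensor([8, 3, 2])

out = torch.empty(2, 3, dtype=torch.int64) # preallocated result tensor
torch.stack(tensors=(tensor1, tensor2), dim=0, out=out)

out
# tensor([[2, 7, 4],
#         [8, 3, 2]])
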
import torch

tensor1 = torch.tensor(2)
tensor2 = torch.tensor(7)
tensor3 = torch.tensor(4)

torch.stack(tensors=(tensor1, tensor2, tensor3))
torch.stack(tensors=(tensor1, tensor2, tensor3), dim=0)
torch.stack(tensors=(tensor1, tensor2, tensor3), dim=-1)
# tensor([2, 7, 4])

tensor1 = torch.tensor([2, 7, 4])
tensor2 = torch.tensor([8, 3, 2])
tensor3 = torch.tensor([5, 0, 8])

torch.stack(tensors=(tensor1, tensor2, tensor3))
torch.stack(tensors=(tensor1, tensor2, tensor3), dim=0)
torch.stack(tensors=(tensor1, tensor2, tensor3), dim=-2)
# tensor([[2, 7, 4], [8, 3, 2], [5, 0, 8]])

torch.stack(tensors=(tensor1, tensor2, tensor3), dim=1)
torch.stack(tensors=(tensor1, tensor2, tensor3), dim=-1)
# tensor([[2, 8, 5], [7, 3, 0], [4, 2, 8]])
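
For contrast with cat() mentioned in the memos above, a minimal sketch: stack() adds a new dimension, while cat() joins the tensors along an existing dimension:

import torch

tensor1 = torch.tensor([2, 7, 4])
tensor2 = torch.tensor([8, 3, 2])
tensor3 = torch.tensor([5, 0, 8])

torch.stack(tensors=(tensor1, tensor2, tensor3)) # new dimension -> shape (3, 3)
# tensor([[2, 7, 4], [8, 3, 2], [5, 0, 8]])

torch.cat(tensors=(tensor1, tensor2, tensor3)) # existing dimension -> shape (9,)
# tensor([2, 7, 4, 8, 3, 2, 5, 0, 8])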

tensor1 = torch.tensor([[2, 7, 4], [8, 3, 2]])
tensor2 = torch.tensor([[5, 0, 8], [3, 6, 1]])
tensor3 = torch.tensor([[9, 4, 7], [1, 0, 5]])

torch.stack(tensors=(tensor1, tensor2, tensor3))
torch.stack(tensors=(tensor1, tensor2, tensor3), dim=0)
torch.stack(tensors=(tensor1, tensor2, tensor3), dim=-3)
# tensor([[[2, 7, 4], [8, 3, 2]],
#         [[5, 0, 8], [3, 6, 1]],
#         [[9, 4, 7], [1, 0, 5]]])

torch.stack(tensors=(tensor1, tensor2, tensor3), dim=1)
torch.stack(tensors=(tensor1, tensor2, tensor3), dim=-2)
# tensor([[[2, 7, 4], [5, 0, 8], [9, 4, 7]],
#         [[8, 3, 2], [3, 6, 1], [1, 0, 5]]])

torch.stack(tensors=(tensor1, tensor2, tensor3), dim=-1)
torch.stack(tensors=(tensor1, tensor2, tensor3), dim=2)
# tensor([[[2, 5, 9], [7, 0, 4], [4, 8, 7]],
#         [[8, 3, 1], [3, 6, 0], [2, 1, 5]]])

tensor1 = torch.tensor([[2., 7., 4.], [8., 3., 2.]])
tensor2 = torch.tensor([[5., 0., 8.], [3., 6., 1.]])
tensor3 = torch.tensor([[9., 4., 7.], [1., 0., 5.]])

torch.stack(tensors=(tensor1, tensor2, tensor3))
# tensor([[[2., 7., 4.], [8., 3., 2.]],
#         [[5., 0., 8.], [3., 6., 1.]],
#         [[9., 4., 7.], [1., 0., 5.]]])

tensor1 = torch.tensor([[2.+0.j, 7.+0.j, 4.+0.j],
                        [8.+0.j, 3.+0.j, 2.+0.j]])
tensor2 = torch.tensor([[5.+0.j, 0.+0.j, 8.+0.j],
                        [3.+0.j, 6.+0.j, 1.+0.j]])
tensor3 = torch.tensor([[9.+0.j, 4.+0.j, 7.+0.j],
                        [1.+0.j, 0.+0.j, 5.+0.j]])
torch.stack(tensors=(tensor1, tensor2, tensor3))
# tensor([[[2.+0.j, 7.+0.j, 4.+0.j],
#          [8.+0.j, 3.+0.j, 2.+0.j]],
#         [[5.+0.j, 0.+0.j, 8.+0.j],
#          [3.+0.j, 6.+0.j, 1.+0.j]],
#         [[9.+0.j, 4.+0.j, 7.+0.j],
#          [1.+0.j, 0.+0.j, 5.+0.j]]])

tensor1 = torch.tensor([[True, False, True], [False, True, False]])
tensor2 = torch.tensor([[False, True, False], [True, False, True]])
tensor3 = torch.tensor([[True, False, True], [False, True, False]])

torch.stack(tensors=(tensor1, tensor2, tensor3))
# tensor([[[True, False, True], [False, True, False]],
#         [[False, True, False], [True, False, True]],
#         [[True, False, True], [False, True, False]]])
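
As noted in the memos above, all the tensors must be the same size; a minimal sketch of the failing case (the exact error text may differ between PyTorch versions):

import torch

tensor1 = torch.tensor([2, 7, 4])
tensor2 = torch.tensor([8, 3])

torch.stack(tensors=(tensor1, tensor2))
# raises a RuntimeError because the tensor sizes differ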


This content originally appeared on DEV Community and was authored by Super Kai (Kazuya Ito)

