I have a question about torch.stack
I have two tensors, a.shape = (2, 3, 4) and b.shape = (2, 3). How can I stack them without an in-place operation?
Using PyTorch 1.2 or 1.4, arjoonn's answer did not work for me. Instead of torch.stack, I used torch.cat:
>>> import torch
>>> a = torch.randn([2, 3, 4])
>>> b = torch.randn([2, 3])
>>> b = b.unsqueeze(dim=2)  # add a trailing dimension so b matches a except in dim 2
>>> b.shape
torch.Size([2, 3, 1])
>>> torch.cat([a, b], dim=2).shape
torch.Size([2, 3, 5])
If you want to use torch.stack, the tensors must have the same shape:
>>> a = torch.randn([2, 3, 4])
>>> b = torch.randn([2, 3, 4])
>>> torch.stack([a, b]).shape
torch.Size([2, 2, 3, 4])
Here is another example:
>>> t = torch.tensor([1, 1, 2])
>>> stacked = torch.stack([t, t, t], dim=0)
>>> t.shape, stacked.shape, stacked
(torch.Size([3]),
 torch.Size([3, 3]),
 tensor([[1, 1, 2],
         [1, 1, 2],
         [1, 1, 2]]))
With torch.stack, the dim parameter lets you specify along which new dimension the equally shaped tensors are stacked.
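As a quick illustration, continuing the t example above, stacking along dim=1 places each copy of t as a column instead of a row:
>>> stacked_dim1 = torch.stack([t, t, t], dim=1)
>>> stacked_dim1.shape, stacked_dim1
(torch.Size([3, 3]),
 tensor([[1, 1, 1],
         [1, 1, 1],
         [2, 2, 2]]))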