This error occurs when the PyTorch torch.stack() function is used to stack tensors along a new dimension, but the tensors do not all have the same shape — torch.stack() requires every tensor in the list to be identical in size. In this case, the tensor at entry 0 has shape [5, 34] while the tensor at entry 4 has shape [4, 34].
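A minimal sketch reproducing the error, assuming five tensors where the last one is one row short (the sizes are taken from the error message above):

import torch

# Entries 0-3 have shape [5, 34]; entry 4 has shape [4, 34]
tensors = [torch.randn(5, 34) for _ in range(4)] + [torch.randn(4, 34)]
try:
    torch.stack(tensors, dim=0)
except RuntimeError as e:
    print(e)  # stack expects each tensor to be equal size ...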

To fix this error, make all the tensors the same shape before stacking. One way is to zero-pad the smaller tensors along the mismatched dimension until they match the largest tensor, using the torch.nn.functional.pad() function.

For example:

import torch
import torch.nn.functional as F

# Reproduce the mismatch: entry 4 has 4 rows instead of 5
tensor_list = [torch.randn(5, 34) for _ in range(4)] + [torch.randn(4, 34)]

max_size = max(t.size(0) for t in tensor_list)  # maximum size along dim 0

# Zero-pad the smaller tensors along dim 0 to match the largest tensor.
# F.pad pads from the last dimension backwards: (left, right, top, bottom).
padded_tensors = []
for t in tensor_list:
    pad_size = max_size - t.size(0)
    padded_t = F.pad(t, (0, 0, 0, pad_size))
    padded_tensors.append(padded_t)

stacked_tensors = torch.stack(padded_tensors, dim=0)
print(stacked_tensors.size())  # torch.Size([5, 5, 34])
For reference, the full error message is:

RuntimeError: stack expects each tensor to be equal size, but got [5, 34] at entry 0 and [4, 34] at entry 4
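When the tensors differ only in their first dimension (as here), an alternative sketch is to use torch.nn.utils.rnn.pad_sequence, which zero-pads a list of variable-length tensors and batches them in one call:

import torch
from torch.nn.utils.rnn import pad_sequence

tensors = [torch.randn(5, 34) for _ in range(4)] + [torch.randn(4, 34)]
batch = pad_sequence(tensors, batch_first=True)  # pads entry 4 from 4 rows to 5
print(batch.size())  # torch.Size([5, 5, 34])

This avoids the manual padding loop, at the cost of only handling mismatches in the leading dimension.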

Original source: http://www.cveoy.top/t/topic/beg0 — copyright belongs to the author. Do not repost or scrape.
