repeat() and tile() in NumPy and PyTorch: usage and differences

Tensor.repeat()

Repeats the whole tensor along the specified dimensions.
WARNING
repeat() behaves differently from numpy.repeat, but is more similar to numpy.tile.

import torch
a = torch.tensor([[1, 2], [3, 4], [5, 6]])      # [3, 2]
a = a.repeat((2, 2))
print(a, a.shape)

tensor([[1, 2, 1, 2],
        [3, 4, 3, 4],
        [5, 6, 5, 6],
        [1, 2, 1, 2],
        [3, 4, 3, 4],
        [5, 6, 5, 6]]) torch.Size([6, 4])

The tensor is tiled as a whole block: first 2 copies along dim 1 (size 2 → 4), then 2 copies along dim 0 (size 3 → 6).
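
As the warning above says, this matches the behavior of numpy.tile rather than numpy.repeat. A minimal comparison sketch (assuming NumPy is available):

import numpy as np

a_np = np.array([[1, 2], [3, 4], [5, 6]])       # (3, 2)
print(np.tile(a_np, (2, 2)))                    # same 6x4 block layout as Tensor.repeat()

[[1 2 1 2]
 [3 4 3 4]
 [5 6 5 6]
 [1 2 1 2]
 [3 4 3 4]
 [5 6 5 6]]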

torch.tile()

Behaves the same as Tensor.repeat().
NOTE
This function is similar to NumPy's tile function.

a = torch.tensor([[1, 2], [3, 4], [5, 6]])      # [3, 2]
a = torch.tile(a, dims=(2, 2))
print(a, a.shape)

tensor([[1, 2, 1, 2],
        [3, 4, 3, 4],
        [5, 6, 5, 6],
        [1, 2, 1, 2],
        [3, 4, 3, 4],
        [5, 6, 5, 6]]) torch.Size([6, 4])
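
One practical difference, sketched here from the documented behavior of both functions: if dims has fewer entries than the tensor has dimensions, torch.tile prepends ones, whereas Tensor.repeat() requires at least as many repeat values as the tensor has dimensions.

import torch

a = torch.tensor([[1, 2], [3, 4], [5, 6]])      # [3, 2]
b = torch.tile(a, (2,))     # fewer entries than a.dim(): treated as (1, 2)
print(b, b.shape)           # a.repeat(2) would raise a RuntimeError instead

tensor([[1, 2, 1, 2],
        [3, 4, 3, 4],
        [5, 6, 5, 6]]) torch.Size([3, 4])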

torch.repeat_interleave()

Repeats individual elements of the tensor along a given dimension.
WARNING
This is different from torch.Tensor.repeat() but similar to numpy.repeat.

a = torch.tensor([[1, 2], [3, 4], [5, 6]])      # [3, 2]
b = torch.repeat_interleave(a, repeats=2, dim=0)
print(b, b.shape)

tensor([[1, 2],
        [1, 2],
        [3, 4],
        [3, 4],
        [5, 6],
        [5, 6]]) torch.Size([6, 2])

c = torch.repeat_interleave(a, repeats=2, dim=1)
print(c, c.shape)

tensor([[1, 1, 2, 2],
        [3, 3, 4, 4],
        [5, 5, 6, 6]]) torch.Size([3, 4])
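
As the warning notes, this matches numpy.repeat. A minimal comparison sketch (assuming NumPy is available); repeats may also be a 1-D tensor giving a per-row count:

import numpy as np
import torch

a_np = np.array([[1, 2], [3, 4], [5, 6]])
print(np.repeat(a_np, 2, axis=0))    # same layout as repeat_interleave(a, 2, dim=0)

a = torch.tensor([[1, 2], [3, 4], [5, 6]])
d = torch.repeat_interleave(a, repeats=torch.tensor([1, 2, 3]), dim=0)
print(d, d.shape)                    # row 0 once, row 1 twice, row 2 three times

tensor([[1, 2],
        [3, 4],
        [3, 4],
        [5, 6],
        [5, 6],
        [5, 6]]) torch.Size([6, 2])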
