torch.stack() Explained with Examples
1. Method Explanation
First, an intuitive reading of the word stack: as a verb, it simply means to pile things up, to place one thing on top of another.
With that intuition in place, let's examine the torch.stack method properly.
torch.stack() joins (concatenates) a sequence of tensors (two or more) along a new dimension: it inserts a new dimension and concatenates the tensors along it. Note that every tensor in the sequence must have the same shape and the same number of dimensions. torch.cat() can also join tensors, but it does so along an existing dimension; here we focus on torch.stack().
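To make the difference from torch.cat concrete, here is a minimal sketch (the tensor values are arbitrary, chosen only for illustration):

import torch

a = torch.tensor([1, 2, 3])
b = torch.tensor([4, 5, 6])

# torch.cat joins along an existing dimension, so the result stays 1-D
print(torch.cat((a, b), dim=0))    # tensor([1, 2, 3, 4, 5, 6]), shape [6]

# torch.stack first inserts a new dimension, so the result becomes 2-D
print(torch.stack((a, b), dim=0))  # tensor([[1, 2, 3], [4, 5, 6]]), shape [2, 3]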
Parameters
tensors: the sequence of tensors to be stacked; it may contain two or more tensors.
dim: the position at which the new dimension is inserted, i.e. how the tensors are stacked. Valid values lie in the closed interval [0, number of dimensions of an input tensor]; as with most dim arguments in PyTorch, negative indices counting from the end are also accepted.
Returns
The stacked tensor, which has one more dimension than each of the inputs.
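As a quick check of the dim range and the shape of the return value, a minimal sketch (shapes chosen arbitrarily):

import torch

x = torch.zeros(2, 3)           # a 2-D input, so the valid dims are 0, 1 and 2
for d in range(x.dim() + 1):    # dim ranges over the closed interval [0, x.dim()]
    out = torch.stack((x, x), dim=d)
    print(d, out.shape)         # a new axis of size 2 appears at position d
# 0 torch.Size([2, 2, 3])
# 1 torch.Size([2, 2, 3])
# 2 torch.Size([2, 3, 2])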
Explaining the method in theory alone is not very intuitive, so the rest of this article walks through torch.stack with a series of examples.
2. Worked Examples
2.1 Example 1: stacking two 1-D tensors
- Code
import torch as t

x = t.tensor([1, 2, 3, 4])
y = t.tensor([5, 6, 7, 8])
print(x.shape)
print(y.shape)
z1 = t.stack((x, y), dim=0)
print(z1)
print(z1.shape)
z2 = t.stack((x, y), dim=1)
print(z2)
print(z2.shape)
- Output
torch.Size([4])
torch.Size([4])
tensor([[1, 2, 3, 4],
        [5, 6, 7, 8]])
torch.Size([2, 4])
tensor([[1, 5],
        [2, 6],
        [3, 7],
        [4, 8]])
torch.Size([4, 2])
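A side observation that is easy to verify against the outputs above: for 1-D inputs, stacking along dim=1 is exactly the transpose of stacking along dim=0. Continuing the program above:

# z2 is the transpose of z1 (this shortcut holds only for 1-D inputs)
print(t.equal(z2, z1.t()))  # True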
2.2 Example 2: stacking two 2-D tensors
- Code
import torch as t

x = t.tensor([[1, 2, 3], [4, 5, 6]])
y = t.tensor([[7, 8, 9], [10, 11, 12]])
print(x.shape)
print(y.shape)
z1 = t.stack((x, y), dim=0)
print(z1)
print(z1.shape)
z2 = t.stack((x, y), dim=1)
print(z2)
print(z2.shape)
z3 = t.stack((x, y), dim=2)
print(z3)
print(z3.shape)
- Output
torch.Size([2, 3])
torch.Size([2, 3])
tensor([[[ 1,  2,  3],
         [ 4,  5,  6]],

        [[ 7,  8,  9],
         [10, 11, 12]]])
torch.Size([2, 2, 3])
tensor([[[ 1,  2,  3],
         [ 7,  8,  9]],

        [[ 4,  5,  6],
         [10, 11, 12]]])
torch.Size([2, 2, 3])
tensor([[[ 1,  7],
         [ 2,  8],
         [ 3,  9]],

        [[ 4, 10],
         [ 5, 11],
         [ 6, 12]]])
torch.Size([2, 3, 2])
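The pattern behind all three results can be stated as an identity: stacking at dim=d is equivalent to unsqueezing every input at position d and concatenating along that axis. This is not how the library implements it internally, just an equivalent formulation worth verifying:

import torch

x = torch.tensor([[1, 2, 3], [4, 5, 6]])
y = torch.tensor([[7, 8, 9], [10, 11, 12]])

for d in range(3):
    stacked = torch.stack((x, y), dim=d)
    # unsqueeze inserts a size-1 axis at position d; cat then joins along it
    via_cat = torch.cat((x.unsqueeze(d), y.unsqueeze(d)), dim=d)
    print(d, torch.equal(stacked, via_cat))  # prints True for every d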
2.3 Example 3: stacking more than two 2-D tensors
- Code
import torch

x = torch.tensor([[1, 2, 3], [4, 5, 6]])
y = torch.tensor([[7, 8, 9], [10, 11, 12]])
z = torch.tensor([[13, 14, 15], [16, 17, 18]])
print(x.shape)
print(y.shape)
print(z.shape)
r1 = torch.stack((x, y, z), dim=0)
print(r1)
print(r1.shape)
r2 = torch.stack((x, y, z), dim=1)
print(r2)
print(r2.shape)
r3 = torch.stack((x, y, z), dim=2)
print(r3)
print(r3.shape)
- Output
torch.Size([2, 3])
torch.Size([2, 3])
torch.Size([2, 3])
tensor([[[ 1,  2,  3],
         [ 4,  5,  6]],

        [[ 7,  8,  9],
         [10, 11, 12]],

        [[13, 14, 15],
         [16, 17, 18]]])
torch.Size([3, 2, 3])
tensor([[[ 1,  2,  3],
         [ 7,  8,  9],
         [13, 14, 15]],

        [[ 4,  5,  6],
         [10, 11, 12],
         [16, 17, 18]]])
torch.Size([2, 3, 3])
tensor([[[ 1,  7, 13],
         [ 2,  8, 14],
         [ 3,  9, 15]],

        [[ 4, 10, 16],
         [ 5, 11, 17],
         [ 6, 12, 18]]])
torch.Size([2, 3, 3])
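In general, stacking N tensors of shape S along dim=0 produces a tensor of shape (N, *S), and torch.unbind along the same dimension undoes the operation. A small sketch:

import torch

# four tensors of shape [2, 3], each filled with its own index
tensors = [torch.full((2, 3), i) for i in range(4)]
stacked = torch.stack(tensors, dim=0)
print(stacked.shape)  # torch.Size([4, 2, 3])

# unbinding along the same dim recovers the original sequence
recovered = torch.unbind(stacked, dim=0)
print(all(torch.equal(a, b) for a, b in zip(tensors, recovered)))  # True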
2.4 Example 4: stacking two 3-D tensors
- Code
import torch

x = torch.tensor([[[1, 2, 3], [4, 5, 6]], [[2, 3, 4], [5, 6, 7]]])
y = torch.tensor([[[7, 8, 9], [10, 11, 12]], [[8, 9, 10], [11, 12, 13]]])
print(x.shape)
print(y.shape)
z1 = torch.stack((x,y),dim=0)
print(z1)
print(z1.shape)
z2 = torch.stack((x,y),dim=1)
print(z2)
print(z2.shape)
z3 = torch.stack((x,y),dim=2)
print(z3)
print(z3.shape)
z4 = torch.stack((x,y),dim=3)
print(z4)
print(z4.shape)
- Output
torch.Size([2, 2, 3])
torch.Size([2, 2, 3])
tensor([[[[ 1,  2,  3],
          [ 4,  5,  6]],

         [[ 2,  3,  4],
          [ 5,  6,  7]]],


        [[[ 7,  8,  9],
          [10, 11, 12]],

         [[ 8,  9, 10],
          [11, 12, 13]]]])
torch.Size([2, 2, 2, 3])
tensor([[[[ 1,  2,  3],
          [ 4,  5,  6]],

         [[ 7,  8,  9],
          [10, 11, 12]]],


        [[[ 2,  3,  4],
          [ 5,  6,  7]],

         [[ 8,  9, 10],
          [11, 12, 13]]]])
torch.Size([2, 2, 2, 3])
tensor([[[[ 1,  2,  3],
          [ 7,  8,  9]],

         [[ 4,  5,  6],
          [10, 11, 12]]],


        [[[ 2,  3,  4],
          [ 8,  9, 10]],

         [[ 5,  6,  7],
          [11, 12, 13]]]])
torch.Size([2, 2, 2, 3])
tensor([[[[ 1,  7],
          [ 2,  8],
          [ 3,  9]],

         [[ 4, 10],
          [ 5, 11],
          [ 6, 12]]],


        [[[ 2,  8],
          [ 3,  9],
          [ 4, 10]],

         [[ 5, 11],
          [ 6, 12],
          [ 7, 13]]]])
torch.Size([2, 2, 3, 2])
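Across all four cases the rule is the same: indexing position k along the newly inserted dimension recovers the k-th input, whichever dim was used. Continuing the example-4 program above as a quick check:

# select(d, k) drops dimension d and returns the k-th slice along it
for d, z in enumerate((z1, z2, z3, z4)):
    print(d, torch.equal(z.select(d, 0), x), torch.equal(z.select(d, 1), y))
# every line prints: <d> True True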