Python_PyTorch_Tensor

This article walks through creating and manipulating tensors in PyTorch, covering element-wise operations, dot products of vectors, matrix multiplication, indexing and replacing elements, and reshaping dimensions. It is intended as an introduction to basic tensor operations for beginners.


PyTorch - Tensors

Creating Tensors
Element-wise Operations on Vectors
Dot Product of Vectors
Matrix Multiplication
Indexing Tensor Elements
Replacing Elements
Reshaping Dimensions

Note: throughout this tutorial I use the following PyTorch version.

>>> print(torch.__version__)
1.0.1

Creating Tensors
The simplest tensor is a scalar, i.e. a single number. There are several ways to create a scalar tensor.

t1 = torch.tensor(3)
t2 = torch.tensor(3.)
t3 = torch.tensor(3.0)
t4 = torch.tensor(3, dtype=torch.float64)

t1, t2, t3, and t4 all store the number 3, but their data types (and therefore the amount of memory used to store the number) differ. You can check this by printing the dtype of each tensor.

print(t1.dtype, t2.dtype, t3.dtype, t4.dtype)
==> torch.int64 torch.float32 torch.float32 torch.float64
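To make the memory difference concrete, `Tensor.element_size()` reports the number of bytes each element occupies:

```python
import torch

t1 = torch.tensor(3)                        # integer literal -> torch.int64
t3 = torch.tensor(3.0)                      # float literal -> torch.float32
t4 = torch.tensor(3, dtype=torch.float64)   # explicit 8-byte float

# bytes per element: int64 and float64 use 8, float32 uses 4
print(t1.element_size(), t3.element_size(), t4.element_size())  # 8 4 8
```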

Next, you can create vector or matrix tensors as follows.

t1 = torch.tensor([1,2,3])
t2 = torch.tensor([4,5,6])
t3 = torch.tensor([[1,2,4], [4,5,6]])

Element-wise Operations on Vectors

t1 = torch.tensor([1,2,3])
t2 = torch.tensor([4,5,6])

t3 = t1 + t2
print(t3)
==> tensor([5, 7, 9])

t4 = t1 * t2
print(t4)
==> tensor([ 4, 10, 18])
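The `+` and `*` above act element-wise between two vectors of the same shape. They also work between a vector and a plain Python number, which PyTorch broadcasts across every element. A minimal sketch:

```python
import torch

t1 = torch.tensor([1, 2, 3])

r1 = t1 + 10   # the scalar 10 is broadcast to every element
r2 = t1 * 2    # likewise for multiplication

print(r1)  # tensor([11, 12, 13])
print(r2)  # tensor([2, 4, 6])
```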

Dot Product of Vectors

t1 = torch.tensor([1,2,3])
t2 = torch.tensor([4,5,6])
t3 = torch.tensor([[1,2,4], [4,5,6]])

t4 = torch.dot(t1, t2)  # note: there is no need to transpose either vector
print(t4)
==> tensor(32)
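`torch.dot` is equivalent to multiplying the vectors element-wise and summing the result, which is an easy way to sanity-check the value:

```python
import torch

t1 = torch.tensor([1, 2, 3])
t2 = torch.tensor([4, 5, 6])

d1 = torch.dot(t1, t2)   # 1*4 + 2*5 + 3*6 = 32
d2 = (t1 * t2).sum()     # element-wise product, then sum: same value

print(d1, d2)  # tensor(32) tensor(32)
```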

Matrix Multiplication

t1 = torch.tensor([1,2,3])
t2 = torch.tensor([4,5,6])
t3 = torch.tensor([[1,2,4],
                   [4,5,6]])
t4 = torch.tensor([[1,2,4],
                   [4,5,6],
                   [7,8,9]])

t5 = torch.mm(t3, t3.t())
print(t5)
==> tensor([[21, 38],
            [38, 77]])

t6 = torch.mm(t3.t(), t3)
print(t6)
==> tensor([[17, 22, 28],
            [22, 29, 38],
            [28, 38, 52]])

t7 = torch.mm(t4, t4)
print(t7)
==> tensor([[ 37,  44,  52],
            [ 66,  81, 100],
            [102, 126, 157]])

t8 = torch.mm(t4.t(), t4)
print(t8)
==> tensor([[ 66,  78,  91],
            [ 78,  93, 110],
            [ 91, 110, 133]])
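For 2-D tensors, `torch.mm` gives the same result as `torch.matmul` and as Python's `@` operator, which is often the more readable spelling:

```python
import torch

t3 = torch.tensor([[1, 2, 4],
                   [4, 5, 6]])

a = torch.mm(t3, t3.t())   # explicit 2-D matrix multiply
b = t3 @ t3.t()            # same result via the @ operator

print(torch.equal(a, b))   # True
print(a)
```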


Indexing Tensor Elements

 

t1 = torch.tensor([1,2,3])
t2 = torch.tensor([4,5,6])
t3 = torch.tensor([[1,2,3],
                   [4,5,6]])
t4 = torch.tensor([[1,2,3],
                   [4,5,6],
                   [7,8,9]])
t5 = torch.tensor([[1,2,3,4,5],
                   [6,7,8,9,10],
                   [11,12,13,14,15]])

print(t1[1])
==> tensor(2)

print(t3[0,1])
==> tensor(2)

print(t3[1,0])
==> tensor(4)

print(t3[0])
==> tensor([1, 2, 3])

print(t3[0,:])
==> tensor([1, 2, 3])

print(t3[:,1])
==> tensor([2, 5])

print(t5[:,[1,3]])
==> tensor([[ 2,  4],
            [ 7,  9],
            [12, 14]])

print(t5[[1,2],:])
==> tensor([[ 6,  7,  8,  9, 10],
            [11, 12, 13, 14, 15]])

print(t5[:,torch.arange(0,3)])
==> tensor([[ 1,  2,  3],
            [ 6,  7,  8],
            [11, 12, 13]])
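Beyond the integer, slice, and index-list forms shown above, tensors also accept a boolean mask, which selects the elements where the mask is True (a small sketch, not from the original examples):

```python
import torch

t5 = torch.tensor([[1, 2, 3, 4, 5],
                   [6, 7, 8, 9, 10],
                   [11, 12, 13, 14, 15]])

mask = t5 > 8      # boolean tensor with the same shape as t5
print(t5[mask])    # 1-D tensor of the elements where mask is True
# tensor([ 9, 10, 11, 12, 13, 14, 15])
```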


Replacing Elements

 

t1 = torch.tensor([1,2,3])
t2 = torch.tensor([4,5,6])
t3 = torch.tensor([[1,2,4],
                   [4,5,6]])
t4 = torch.tensor([[1,2,3],
                   [4,5,6],
                   [7,8,9]])
t5 = torch.tensor([[1,2,3,4,5],
                   [6,7,8,9,10],
                   [11,12,13,14,15]])

t1[1] = 10
print(t1)
==> tensor([ 1, 10,  3])

t3[0,1] = 10
print(t3)
==> tensor([[ 1, 10,  4],
            [ 4,  5,  6]])

t3[1,0] = 10
print(t3)
==> tensor([[ 1, 10,  4],
            [10,  5,  6]])

t3[0] = torch.tensor([10,11,12])
print(t3)
==> tensor([[10, 11, 12],
            [10,  5,  6]])

t3[1] = torch.arange(5,8)
print(t3)
==> tensor([[10, 11, 12],
            [ 5,  6,  7]])

t4[0,:] = torch.tensor([10,11,12])
print(t4)
==> tensor([[10, 11, 12],
            [ 4,  5,  6],
            [ 7,  8,  9]])
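Assignment accepts the same indexing forms as reading, including slices and boolean masks. For example, overwriting a whole column with a scalar, then overwriting every element above a threshold:

```python
import torch

t5 = torch.tensor([[1, 2, 3, 4, 5],
                   [6, 7, 8, 9, 10],
                   [11, 12, 13, 14, 15]])

t5[:, 0] = 0        # replace the whole first column with a scalar
t5[t5 > 13] = -1    # replace every element greater than 13

print(t5)
# tensor([[ 0,  2,  3,  4,  5],
#         [ 0,  7,  8,  9, 10],
#         [ 0, 12, 13, -1, -1]])
```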


Reshaping Dimensions

 

t1 = torch.tensor([1,2,3,4,5,6,7,8,9,10,11,12])
t2 = torch.tensor([[1,2,3,4,5],
                   [6,7,8,9,10],
                   [11,12,13,14,15]])

t3 = t1.view(3,4)
print(t3)
==> tensor([[ 1,  2,  3,  4],
            [ 5,  6,  7,  8],
            [ 9, 10, 11, 12]])

t4 = t1.view(4,3)
print(t4)
==> tensor([[ 1,  2,  3],
            [ 4,  5,  6],
            [ 7,  8,  9],
            [10, 11, 12]])

t5 = t2.view(1,15)
print(t5)
==> tensor([[ 1,  2,  3,  4,  5,  6,  7,  8,  9, 10, 11, 12, 13, 14, 15]])

t6 = t2.view(1,-1)
print(t6)
==> tensor([[ 1,  2,  3,  4,  5,  6,  7,  8,  9, 10, 11, 12, 13, 14, 15]])

t7 = t2.view(-1,1)
print(t7)
==> tensor([[ 1],
            [ 2],
            [ 3],
            [ 4],
            [ 5],
            [ 6],
            [ 7],
            [ 8],
            [ 9],
            [10],
            [11],
            [12],
            [13],
            [14],
            [15]])
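One caveat worth knowing: `view` does not copy data. The reshaped tensor shares storage with the original, so a write through one is visible through the other (use `.clone()` first when an independent copy is needed):

```python
import torch

t1 = torch.tensor([1, 2, 3, 4, 5, 6])
t3 = t1.view(2, 3)   # new shape, same underlying storage

t3[0, 0] = 99        # writing through the view...
print(t1)            # ...is visible in the original tensor
# tensor([99,  2,  3,  4,  5,  6])
```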

 
