Deep Learning: Linear Algebra (Preliminaries)

Contents

1. Scalars: represented by a tensor with a single element

2. Vectors: lists of scalar values

3. Matrices

4. Three-dimensional tensors

5. Elementwise matrix operations

6. Scalars and tensors

7. Summing the elements

8. Summing along a specified axis

9. Computing the mean

10. Computing the mean along a dimension

11. Keeping the number of axes when computing a sum or mean

12. Dividing A by sum_A via broadcasting

13. Cumulative sum of the elements of A along an axis

14. Dot product: the sum of elementwise products at matching positions

15. Matrix-vector product: Ax is a column vector of length m

16. Matrix-matrix multiplication AB

17. The norm (length) of a vector or matrix: square root of the sum of squares

18. Sum of the absolute values of the elements (the L1 norm)

19. Square root of the sum of the squares of a matrix's elements (the Frobenius norm)


1. Scalars: represented by a tensor with a single element

import torch

x = torch.tensor([3.0])
y = torch.tensor([2.0])

x + y  # tensor([5.])
x * y  # tensor([6.])
x / y  # tensor([1.5000])
x ** y  # tensor([9.])

2. Vectors: lists of scalar values

x = torch.arange(4)
x  # tensor([0, 1, 2, 3])

# Access an element by indexing into the tensor
x[1]  # tensor(1)
# Get the length with len
len(x)  # 4
# Shape: a tensor with a single axis has a shape with a single element
x.shape  # torch.Size([4])

3. Matrices

# Create an m x n matrix by reshaping a range of m * n values
A = torch.arange(20).reshape(5, 4)
A
'''
tensor([[ 0,  1,  2,  3],
        [ 4,  5,  6,  7],
        [ 8,  9, 10, 11],
        [12, 13, 14, 15],
        [16, 17, 18, 19]])
'''
# Matrix transpose
A.T
'''
tensor([[ 0,  4,  8, 12, 16],
        [ 1,  5,  9, 13, 17],
        [ 2,  6, 10, 14, 18],
        [ 3,  7, 11, 15, 19]])
'''
# A symmetric matrix satisfies A = A.T
B = torch.tensor([[1, 2, 3],
                  [2, 0, 4],
                  [3, 4, 5]])
B == B.T
'''
tensor([[True, True, True],
        [True, True, True],
        [True, True, True]])
'''
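
# As an extra check (a small sketch beyond the notes above), torch.equal compares
# shape and all elements at once and returns a single Python bool:
torch.equal(B, B.T)  # True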

4. Three-dimensional tensors

x = torch.arange(24).reshape(2, 3, 4)
x
'''
tensor([[[ 0,  1,  2,  3],
         [ 4,  5,  6,  7],
         [ 8,  9, 10, 11]],

        [[12, 13, 14, 15],
         [16, 17, 18, 19],
         [20, 21, 22, 23]]])
         
x[0][1][0] is 4
'''
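
# The same element can also be fetched with a single bracket (a small extra sketch):
x[0, 1, 0]  # tensor(4)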

5. Elementwise matrix operations

# An elementwise binary operation on two tensors of the same shape produces a tensor of the same shape
A = torch.arange(20, dtype=torch.float32).reshape(5, 4)
B = A.clone()  # allocate new memory and copy A into B
A
'''
tensor([[ 0.,  1.,  2.,  3.],
        [ 4.,  5.,  6.,  7.],
        [ 8.,  9., 10., 11.],
        [12., 13., 14., 15.],
        [16., 17., 18., 19.]])       
'''
A + B
'''
tensor([[ 0.,  2.,  4.,  6.],
        [ 8., 10., 12., 14.],
        [16., 18., 20., 22.],
        [24., 26., 28., 30.],
        [32., 34., 36., 38.]])
'''
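
# A small extra sketch: elementwise multiplication of two matrices (the Hadamard product)
# also keeps the shape; each entry is the product of the matching entries of A and B.
A * B
'''
tensor([[  0.,   1.,   4.,   9.],
        [ 16.,  25.,  36.,  49.],
        [ 64.,  81., 100., 121.],
        [144., 169., 196., 225.],
        [256., 289., 324., 361.]])
'''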

6. Scalars and tensors

a = 2
X = torch.arange(24).reshape(2, 3, 4)
a + X
'''
a + X adds a to every element of X
tensor([[[ 2,  3,  4,  5],
         [ 6,  7,  8,  9],
         [10, 11, 12, 13]],

        [[14, 15, 16, 17],
         [18, 19, 20, 21],
         [22, 23, 24, 25]]])
'''
a * X
'''
a * X multiplies every element of X by a
tensor([[[ 0,  2,  4,  6],
         [ 8, 10, 12, 14],
         [16, 18, 20, 22]],

        [[24, 26, 28, 30],
         [32, 34, 36, 38],
         [40, 42, 44, 46]]])
'''
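
# A quick extra check: broadcasting with a scalar leaves the shape unchanged.
(a * X).shape  # torch.Size([2, 3, 4])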

7. Summing the elements

x = torch.arange(4, dtype=torch.float32)
x  # tensor([0., 1., 2., 3.])
x.sum()  # tensor(6.)
# The sum of the elements of a tensor of arbitrary shape
A = torch.arange(20 * 2).reshape(2, 5, 4)
A.shape  # torch.Size([2, 5, 4])
A.sum()  # tensor(780)

8. Summing along a specified axis

A_sum_axis0 = A.sum(axis=0)
A_sum_axis0
''' 
A
tensor([[[ 0,  1,  2,  3],
         [ 4,  5,  6,  7],
         [ 8,  9, 10, 11],
         [12, 13, 14, 15],
         [16, 17, 18, 19]],

        [[20, 21, 22, 23],
         [24, 25, 26, 27],
         [28, 29, 30, 31],
         [32, 33, 34, 35],
         [36, 37, 38, 39]]])

A.sum(axis=0) adds the two 5x4 blocks elementwise: entry [i, j] of the result is A[0, i, j] + A[1, i, j].
tensor([[20, 22, 24, 26],
        [28, 30, 32, 34],
        [36, 38, 40, 42],
        [44, 46, 48, 50],
        [52, 54, 56, 58]])

'''
A_sum_axis1 = A.sum(axis=1)
A_sum_axis1
'''
tensor([[ 40,  45,  50,  55],
        [140, 145, 150, 155]])
'''
A_sum_axis01 = A.sum(axis=[0, 1])
A_sum_axis01
'''
tensor([180, 190, 200, 210])
'''
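
# A small extra sketch: summing over every axis collapses the tensor to a single value,
# which matches A.sum().
A.sum(axis=[0, 1, 2])  # tensor(780)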

9. Computing the mean

A = torch.arange(20, dtype=torch.float32).reshape(5, 4)
'''
tensor([[ 0.,  1.,  2.,  3.],
        [ 4.,  5.,  6.,  7.],
        [ 8.,  9., 10., 11.],
        [12., 13., 14., 15.],
        [16., 17., 18., 19.]])
'''
A.sum() / A.numel()  # tensor(9.5000), the total divided by the number of elements
A.mean()  # tensor(9.5000)

10. Computing the mean along a dimension

'''
A
tensor([[ 0.,  1.,  2.,  3.],
        [ 4.,  5.,  6.,  7.],
        [ 8.,  9., 10., 11.],
        [12., 13., 14., 15.],
        [16., 17., 18., 19.]])
'''
'''
A.sum(axis=0)
tensor([40., 45., 50., 55.])
A.shape[0] is 5 (the number of rows being averaged over)
'''
A.sum(axis=0) / A.shape[0]  # tensor([ 8.,  9., 10., 11.])
A.mean(axis=0)  # tensor([ 8.,  9., 10., 11.])

11. Keeping the number of axes when computing a sum or mean

sum_A = A.sum(axis=1)
sum_A  # tensor([ 6., 22., 38., 54., 70.])  the summed axis is dropped, shape becomes [5]
sum_A = A.sum(axis=1, keepdims=True)
sum_A  # keepdims=True keeps the axis, shape stays two-dimensional: [5, 1]
'''
tensor([[ 6.],
        [22.],
        [38.],
        [54.],
        [70.]])
'''
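
# A quick sketch of why keepdims matters: shapes (5, 4) and (5,) do not line up for
# broadcasting, while (5, 4) and (5, 1) do, so the division in the next section works.
A.shape, sum_A.shape  # (torch.Size([5, 4]), torch.Size([5, 1]))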

12. Dividing A by sum_A via broadcasting

A / sum_A
'''
A
tensor([[ 0.,  1.,  2.,  3.],
        [ 4.,  5.,  6.,  7.],
        [ 8.,  9., 10., 11.],
        [12., 13., 14., 15.],
        [16., 17., 18., 19.]])

sum_A, broadcast from shape [5, 1] to [5, 4]
tensor([[ 6.,  6.,  6.,  6.],
        [22., 22., 22., 22.],
        [38., 38., 38., 38.],
        [54., 54., 54., 54.],
        [70., 70., 70., 70.]])

A / sum_A
tensor([[0.0000, 0.1667, 0.3333, 0.5000],
        [0.1818, 0.2273, 0.2727, 0.3182],
        [0.2105, 0.2368, 0.2632, 0.2895],
        [0.2222, 0.2407, 0.2593, 0.2778],
        [0.2286, 0.2429, 0.2571, 0.2714]])
'''

13. Cumulative sum of the elements of A along an axis

'''
A
tensor([[ 0.,  1.,  2.,  3.],
        [ 4.,  5.,  6.,  7.],
        [ 8.,  9., 10., 11.],
        [12., 13., 14., 15.],
        [16., 17., 18., 19.]])
'''
A.cumsum(axis=1)  # running totals along each row
'''
tensor([[ 0.,  1.,  3.,  6.],
        [ 4.,  9., 15., 22.],
        [ 8., 17., 27., 38.],
        [12., 25., 39., 54.],
        [16., 33., 51., 70.]])
'''
A.cumsum(axis=0)  # running totals down each column
'''
tensor([[ 0.,  1.,  2.,  3.],
        [ 4.,  6.,  8., 10.],
        [12., 15., 18., 21.],
        [24., 28., 32., 36.],
        [40., 45., 50., 55.]])
'''

14. Dot product: the sum of elementwise products at matching positions

x = torch.arange(4, dtype=torch.float32)  # tensor([0., 1., 2., 3.])
y = torch.ones(4, dtype=torch.float32)  # tensor([1., 1., 1., 1.])
torch.dot(x, y)  # tensor(6.)
# equivalent to summing the elementwise products
torch.sum(x * y)  # tensor(6.)

15. Matrix-vector product: Ax is a column vector of length m

'''
A
tensor([[ 0.,  1.,  2.,  3.],
        [ 4.,  5.,  6.,  7.],
        [ 8.,  9., 10., 11.],
        [12., 13., 14., 15.],
        [16., 17., 18., 19.]])
x
tensor([0., 1., 2., 3.])
'''
torch.mv(A, x)  # tensor([ 14.,  38.,  62.,  86., 110.])
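
# A small extra sketch: the @ operator computes the same matrix-vector product.
A @ x  # tensor([ 14.,  38.,  62.,  86., 110.])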

16. Matrix-matrix multiplication AB

A = torch.arange(20, dtype=torch.float32).reshape(5, 4)
B = torch.ones(4, 3)
torch.mm(A, B)
'''
tensor([[ 6.,  6.,  6.],
        [22., 22., 22.],
        [38., 38., 38.],
        [54., 54., 54.],
        [70., 70., 70.]])
'''
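
# A small extra sketch: for 2-D tensors the @ operator is equivalent to torch.mm.
torch.allclose(A @ B, torch.mm(A, B))  # True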

17. The norm (length) of a vector or matrix: square root of the sum of squares

u = torch.tensor([3.0, -4.0])
torch.norm(u)  # tensor(5.)
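
# A small sketch of the same L2 norm written out as the square root of the sum of squares.
torch.sqrt((u ** 2).sum())  # tensor(5.)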

18. Sum of the absolute values of the elements (the L1 norm)

torch.abs(u).sum()  # tensor(7.)

19. Square root of the sum of the squares of a matrix's elements (the Frobenius norm)

torch.norm(torch.ones((4, 9)))  # tensor(6.)
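
# A small sketch of the Frobenius norm written out explicitly: 4 * 9 ones give sqrt(36) = 6.
torch.sqrt((torch.ones((4, 9)) ** 2).sum())  # tensor(6.)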
