Reading notes: Python implementations of mean squared error and cross-entropy error ← 斋藤康毅

Mean squared error formula: E=\frac{1}{2}\sum_k \left( y_{k}-t_{k} \right)^{2}

【Python code for mean squared error】

import numpy as np

def mean_squared_error(y, t):
    # Half the sum of squared differences between prediction y and target t
    return 0.5 * np.sum((y - t) ** 2)

# One-hot target: the correct class is index 2
t = [0, 0, 1, 0, 0, 0, 0, 0, 0, 0]

# Case 1: the network assigns its highest probability (0.6) to the correct class
y = [0.1, 0.05, 0.6, 0.0, 0.05, 0.1, 0.0, 0.1, 0.0, 0.0]
mean_squared_error(np.array(y), np.array(t))     # 0.09750000000000003

# Case 2: the highest probability (0.6) sits at index 7, the wrong class,
# so the error is much larger
y = [0.1, 0.05, 0.1, 0.0, 0.05, 0.1, 0.0, 0.6, 0.0, 0.0]
mean_squared_error(np.array(y), np.array(t))     # 0.5975
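Because t is one-hot, y - t differs from y only at the correct class, so the first result can be checked by hand; a short sketch of that check:

```python
import numpy as np

# Hand-check of the first MSE value above: t is one-hot with the
# correct class at index 2, so (y - t) changes only at that index.
t = np.array([0, 0, 1, 0, 0, 0, 0, 0, 0, 0])
y = np.array([0.1, 0.05, 0.6, 0.0, 0.05, 0.1, 0.0, 0.1, 0.0, 0.0])

diff_sq = (y - t) ** 2   # [0.01, 0.0025, 0.16, 0, 0.0025, 0.01, 0, 0.01, 0, 0]
total = diff_sq.sum()    # 0.195
print(0.5 * total)       # 0.0975, matching the result above
```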


Cross-entropy error formula: E=-\sum_k t_{k} \log y_k

【Python code for cross-entropy error】

import numpy as np

def cross_entropy_error(y, t):
    delta = 1e-7  # avoids np.log(0), which would evaluate to -inf
    return -np.sum(t * np.log(y + delta))

# Same one-hot target: the correct class is index 2
t = [0, 0, 1, 0, 0, 0, 0, 0, 0, 0]

# Case 1: highest probability (0.6) at the correct class
y = [0.1, 0.05, 0.6, 0.0, 0.05, 0.1, 0.0, 0.1, 0.0, 0.0]
cross_entropy_error(np.array(y), np.array(t))     # 0.510825457099338

# Case 2: highest probability (0.6) at the wrong class (index 7)
y = [0.1, 0.05, 0.1, 0.0, 0.05, 0.1, 0.0, 0.6, 0.0, 0.0]
cross_entropy_error(np.array(y), np.array(t))     # 2.302584092994546
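The tiny delta guards against np.log(0), which returns -inf and would make the error infinite. And because t is one-hot, the sum collapses to -log of the probability predicted for the correct class, which is why the two results above are essentially -log(0.6) and -log(0.1); a quick check:

```python
import numpy as np

# With one-hot t, only the correct class's term survives the sum,
# so the cross-entropy reduces to -log(y[correct_class] + delta).
delta = 1e-7
print(-np.log(0.6 + delta))  # ≈ 0.5108, matches the first result above
print(-np.log(0.1 + delta))  # ≈ 2.3026, matches the second result above
```

The smaller the predicted probability of the correct class, the larger -log grows, so confident wrong answers are penalized heavily.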
