Deep Learning: A Roundup of Small Everyday Bugs

"A dike of a thousand li collapses from an ant hole." Small details can break an entire program, so here is a summary of the small bugs I personally ran into while doing Andrew Ng's deep learning assignments.

1. While doing the Course 1, week 3 assignment of Andrew Ng's deep learning course, I found that the gradients were not changing. The cause turned out to be writing 1 instead of 1.0.

def compute_cost(A2, Y, parameters):
    m = Y.shape[1]                      # number of examples
    # cross-entropy cost
    logprobs = np.multiply(np.log(A2), Y) + np.multiply((1 - Y), np.log(1 - A2))
    cost = -1.0 / m * np.sum(logprobs)  # 1.0, not 1: avoid integer division
    cost = np.squeeze(cost)             # make sure cost is a scalar, not an array
    assert(isinstance(cost, float))
    return cost
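For context: these course notebooks originally ran under Python 2, where / between two plain integers is floor division, so the literal 1 silently changes the arithmetic, while 1.0 forces true division. A minimal sketch of the difference (the value m = 400 below is purely illustrative):

m = 400

print(1 / m)    # Python 2: 0     Python 3: 0.0025
print(-1 / m)   # Python 2: -1    Python 3: -0.0025
print(1.0 / m)  # 0.0025 in both -- the float literal forces true division

Under Python 3 this particular pitfall disappears, which is why the same line looks harmless when read today.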

 

2. Lesson 1, week 4, assignment 4_2

 

Dividing by 255 versus 255.

# Reshape the training and test examples 
train_x_flatten = train_x_orig.reshape(train_x_orig.shape[0], -1).T   # The "-1" makes reshape flatten the remaining dimensions
test_x_flatten = test_x_orig.reshape(test_x_orig.shape[0], -1).T

# Standardize data to have feature values between 0 and 1.
train_x = train_x_flatten/255.
test_x = test_x_flatten/255.

print ("train_x's shape: " + str(train_x.shape))
print ("test_x's shape: " + str(test_x.shape))255.
test_x = test_x_flatten/255.

print ("train_x's shape: " + str(train_x.shape))
print ("test_x's shape: " + str(test_x.shape))

3. In Lesson 1, week 4, assignment 4_2, dnn_app_utils_v2.py has a bug; add the 1.0* factor shown in the Accuracy print line below (highlighted in red in the original post).

def predict(X, y, parameters):
    """
    This function is used to predict the results of an L-layer neural network.
    
    Arguments:
    X -- data set of examples you would like to label
    y -- true "label" vector, used only to report accuracy
    parameters -- parameters of the trained model
    
    Returns:
    p -- predictions for the given dataset X
    """

    m = X.shape[1]
    n = len(parameters) // 2  # number of layers in the neural network
    p = np.zeros((1, m))

    # Forward propagation
    probas, caches = L_model_forward(X, parameters)

    # convert probas to 0/1 predictions
    for i in range(0, probas.shape[1]):
        if probas[0, i] > 0.5:
            p[0, i] = 1
        else:
            p[0, i] = 0

    # print results
    # print ("predictions: " + str(p))
    # print ("true labels: " + str(y))
    print("Accuracy: " + str(np.sum(1.0*(p == y) / m)))

    return p
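The 1.0* presumably fixes the same class of problem: (p == y) is a boolean array, and dividing it by the integer m under Python 2 floor-divides every element to 0, so the printed accuracy would always be 0.0. Multiplying by 1.0 first promotes the comparison to floats. A minimal sketch with made-up predictions and labels:

import numpy as np

p = np.array([[1, 0, 1, 1]])   # hypothetical predictions (1 x m row vector)
y = np.array([[1, 0, 0, 1]])   # hypothetical true labels
m = p.shape[1]

print(np.sum(1.0 * (p == y) / m))   # 0.75
# Without the 1.0*, (p == y) / m floor-divides to all zeros under Python 2,
# so the reported accuracy would be 0.0 no matter how good the predictions are.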

 
