Andrew Ng Machine Learning lab: C1_W1_Lab04_Gradient_Descent_Soln (running the gradient descent function)

Gradient Descent

  • Code block 1
  • Code block 2
  • Code block 3 (computing the loss (cost function))
  • Gradient descent function
  • Code block 4

Code block 1

import math, copy
import numpy as np
import matplotlib.pyplot as plt
plt.style.use('./deeplearning.mplstyle')
from lab_utils_uni import plt_house_x, plt_contour_wgrad, plt_divergence, plt_gradients

Import the packages used in this lab.

Code block 2

# Load our data set
x_train = np.array([1.0, 2.0])   #features
y_train = np.array([300.0, 500.0])   #target value

Code block 3 (computing the loss (cost function))

# Function to calculate the cost
def compute_cost(x, y, w, b):

    m = x.shape[0]                       # number of training examples
    cost = 0

    for i in range(m):
        f_wb = w * x[i] + b              # model prediction: f_wb = w*x + b
        cost = cost + (f_wb - y[i])**2   # accumulate the squared error
    total_cost = 1 / (2 * m) * cost      # J(w,b) = (1/2m) * sum of squared errors

    return total_cost
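As a quick sanity check, a vectorized equivalent of the loop above can be run on the training data. With w=200 and b=100 the model passes through both training points exactly, so the cost should be zero (the parameter values here are chosen for illustration; they are not taken from the lab text):

```python
import numpy as np

x_train = np.array([1.0, 2.0])      # features
y_train = np.array([300.0, 500.0])  # target values

w, b = 200, 100                     # parameters that fit both points exactly
f_wb = w * x_train + b              # vectorized predictions
cost = np.sum((f_wb - y_train) ** 2) / (2 * x_train.shape[0])
print(cost)  # 0.0
```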

Gradient Descent Function

[Figure 1: the gradient descent update rule — repeat until convergence: w = w − α·∂J(w,b)/∂w, b = b − α·∂J(w,b)/∂b]

Code block 4

Compute these two gradients:
[Figure 2: the gradient formulas — ∂J(w,b)/∂w = (1/m)·Σ(f_wb(x⁽ⁱ⁾) − y⁽ⁱ⁾)·x⁽ⁱ⁾, ∂J(w,b)/∂b = (1/m)·Σ(f_wb(x⁽ⁱ⁾) − y⁽ⁱ⁾)]

def compute_gradient(x, y, w, b): 
    """
    Computes the gradient for linear regression 
    Args:
      x (ndarray (m,)): Data, m examples 
      y (ndarray (m,)): target values
      w,b (scalar)    : model parameters  
    Returns
      dj_dw (scalar): The gradient of the cost w.r.t. the parameters w
      dj_db (scalar): The gradient of the cost w.r.t. the parameter b     
     """
    
    # Number of training examples
    m = x.shape[0]    
    dj_dw = 0
    dj_db = 0
    
    for i in range(m):  
        f_wb = w * x[i] + b 
        dj_dw_i = (f_wb - y[i]) * x[i] 
        dj_db_i = f_wb - y[i] 
        dj_db += dj_db_i
        dj_dw += dj_dw_i 
    dj_dw = dj_dw / m 
    dj_db = dj_db / m 
        
    return dj_dw, dj_db
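The lab's full `gradient_descent` driver is not reproduced in this post, but the update loop it performs can be sketched as follows. This is a minimal standalone version: the gradient computation is inlined in vectorized form (equivalent to `compute_gradient` above), and the learning rate and iteration count match the values commonly used in this lab (α = 1.0e-2, 10000 iterations):

```python
import numpy as np

x_train = np.array([1.0, 2.0])      # features
y_train = np.array([300.0, 500.0])  # target values

w, b = 0.0, 0.0     # initial parameters
alpha = 1.0e-2      # learning rate
m = x_train.shape[0]

for _ in range(10000):
    f_wb = w * x_train + b                            # predictions
    dj_dw = np.sum((f_wb - y_train) * x_train) / m    # same as compute_gradient
    dj_db = np.sum(f_wb - y_train) / m
    w -= alpha * dj_dw                                # simultaneous update of w and b
    b -= alpha * dj_db

print(f"w = {w:.1f}, b = {b:.1f}")  # converges toward w = 200, b = 100
```

For this two-point data set the exact fit is w = 200, b = 100, and the loop approaches those values; in practice the lab also records the cost history per iteration for plotting.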
