2022 Andrew Ng Machine Learning Specialization, Week 2 Practice Lab: Linear Regression

Exercise 1

# UNQ_C1
# GRADED FUNCTION: compute_cost

def compute_cost(x, y, w, b): 
    """
    Computes the cost function for linear regression.
    
    Args:
        x (ndarray): Shape (m,) Input to the model (Population of cities) 
        y (ndarray): Shape (m,) Label (Actual profits for the cities)
        w, b (scalar): Parameters of the model
    
    Returns
        total_cost (float): The cost of using w,b as the parameters for linear regression
               to fit the data points in x and y
    """
    # number of training examples
    m = x.shape[0] 
    
    # You need to return this variable correctly
    total_cost = 0
    
    ### START CODE HERE ###  
    
    # Vectorized squared-error cost: J(w,b) = (1 / 2m) * sum((w*x + b - y)^2)
    total_cost = np.sum((x * w + b - y) ** 2) / (2 * m)
    
    ### END CODE HERE ### 

    return total_cost
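As a quick sanity check (not part of the graded lab), the cost should be exactly zero when `(w, b)` generate the data perfectly, and positive otherwise. The sketch below redefines `compute_cost` inline with the same formula so it runs standalone; the sample `x`, `y` values are made up for illustration:

```python
import numpy as np

def compute_cost(x, y, w, b):
    # Mean squared error cost, same formula as the graded function above
    m = x.shape[0]
    return np.sum((x * w + b - y) ** 2) / (2 * m)

x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0                        # data generated exactly by w=2, b=1

print(compute_cost(x, y, w=2.0, b=1.0))  # 0.0 -- perfect fit
print(compute_cost(x, y, w=0.0, b=0.0))  # 20.5 -- all-zero model misses badly
```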

Exercise 2

# UNQ_C2
# GRADED FUNCTION: compute_gradient
def compute_gradient(x, y, w, b): 
    """
    Computes the gradient for linear regression 
    Args:
      x (ndarray): Shape (m,) Input to the model (Population of cities) 
      y (ndarray): Shape (m,) Label (Actual profits for the cities)
      w, b (scalar): Parameters of the model  
    Returns
      dj_dw (scalar): The gradient of the cost w.r.t. the parameters w
      dj_db (scalar): The gradient of the cost w.r.t. the parameter b     
     """
    
    # Number of training examples
    m = x.shape[0]
    
    # You need to return the following variables correctly
    dj_dw = 0
    dj_db = 0
    
    ### START CODE HERE ### 
    # Per-example prediction error, then vectorized partial derivatives
    err = x * w + b - y
    dj_dw = np.sum(x * err) / m
    dj_db = np.sum(err) / m
    ### END CODE HERE ### 
        
    return dj_dw, dj_db
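The gradients can be checked the same way: at the parameters that generated the data, both partial derivatives vanish, and repeatedly stepping against the gradient recovers those parameters. This is a minimal sketch with an inline copy of `compute_gradient` and a made-up learning rate of 0.1 (the lab's own gradient-descent loop is part of a later provided function):

```python
import numpy as np

def compute_gradient(x, y, w, b):
    # Same vectorized gradients as the graded function above
    m = x.shape[0]
    err = x * w + b - y
    dj_dw = np.sum(x * err) / m
    dj_db = np.sum(err) / m
    return dj_dw, dj_db

x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0                              # data generated by w=2, b=1

# At the data-generating parameters, both gradients are exactly zero
print(compute_gradient(x, y, w=2.0, b=1.0))    # (0.0, 0.0)

# A plain gradient-descent loop drives (w, b) toward (2, 1)
w, b, alpha = 0.0, 0.0, 0.1
for _ in range(1000):
    dj_dw, dj_db = compute_gradient(x, y, w, b)
    w -= alpha * dj_dw
    b -= alpha * dj_db

print(w, b)                                    # close to 2.0 and 1.0
```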

Video link: https://www.bilibili.com/video/BV19B4y1W76i
