Machine Learning - Regularized Logistic Regression

This series of articles is my study notes for "Machine Learning" by Prof. Andrew Ng, Stanford University. This article covers Week 3, Solving the Problem of Overfitting, Part III: how to implement logistic regression with regularization to address overfitting.


Regularized logistic regression


For logistic regression, we previously talked about two types of optimization algorithms: gradient descent, used to optimize the cost function J(θ), and the more advanced optimization methods. In this section, we'll show how to adapt both of those techniques so that they work for regularized logistic regression.
So long as you apply regularization and keep the parameters small, you're more likely to get a reasonable decision boundary, like the one shown in pink in the figure below, rather than an overly complex one that overfits the training data.
[Figure 1: decision boundaries for logistic regression; the regularized boundary is shown in pink]

1. Cost Function of Logistic Regression

  • This is the original cost function for logistic regression without regularization, written out below.
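For reference, here is that unregularized cost function written out (standard notation from the course, where h_θ(x) is the sigmoid hypothesis):

```latex
J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \left[ y^{(i)} \log h_\theta\!\left(x^{(i)}\right) + \left(1 - y^{(i)}\right) \log\!\left(1 - h_\theta\!\left(x^{(i)}\right)\right) \right]
```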

And if we want to modify it to use regularization, all we need to do is add the term (λ/2m) times the sum from j = 1 up to n of θ_j². This has the effect of penalizing the parameters θ_1, θ_2, and so on up to θ_n, keeping them from being too large.

  • This is the cost function for logistic regression with regularization, written out below.
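Putting the two pieces together, the regularized cost function is the following (note that the regularization sum starts at j = 1, so the intercept parameter θ_0 is not penalized):

```latex
J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \left[ y^{(i)} \log h_\theta\!\left(x^{(i)}\right) + \left(1 - y^{(i)}\right) \log\!\left(1 - h_\theta\!\left(x^{(i)}\right)\right) \right] + \frac{\lambda}{2m} \sum_{j=1}^{n} \theta_j^2
```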



2. Gradient descent for logistic regression


Gradient descent for logistic regression (without regularization)

[Figure 2: gradient descent update rule for logistic regression without regularization]

We can write the update equations separately for θ_0 and for the remaining parameters θ_j, as follows:

[Figure 3: gradient descent updates written separately for θ_0 and for θ_j, j = 1, …, n]
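Written out, the separate updates, repeated until convergence with all parameters updated simultaneously, are:

```latex
\theta_0 := \theta_0 - \alpha \, \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta\!\left(x^{(i)}\right) - y^{(i)} \right) x_0^{(i)}
\theta_j := \theta_j - \alpha \, \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta\!\left(x^{(i)}\right) - y^{(i)} \right) x_j^{(i)}, \qquad j = 1, \dots, n
```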

Gradient descent for logistic regression (with regularization)

[Figure 4: gradient descent update rule for regularized logistic regression]

We're working out gradient descent for regularized logistic regression. And of course, just to wrap up this discussion, the term here in the square brackets is the new partial derivative with respect to θ_j of the new cost function J(θ), where J(θ) is the cost function we defined above, which includes the regularization term.

[Figure 5: regularized gradient descent update, with the square-bracketed term equal to the partial derivative of J(θ) with respect to θ_j]
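Written out, the regularized updates leave the θ_0 update unchanged and add the (λ/m)θ_j term for j ≥ 1; the expression in square brackets is exactly the partial derivative ∂J(θ)/∂θ_j of the regularized cost function:

```latex
\theta_0 := \theta_0 - \alpha \, \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta\!\left(x^{(i)}\right) - y^{(i)} \right) x_0^{(i)}
\theta_j := \theta_j - \alpha \left[ \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta\!\left(x^{(i)}\right) - y^{(i)} \right) x_j^{(i)} + \frac{\lambda}{m} \theta_j \right], \qquad j = 1, \dots, n
```

Here is a minimal sketch of one such update step in Python (the names gradient_descent_step, alpha, and lam are my own illustrative choices, not from the course):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent_step(theta, X, y, alpha, lam):
    """One simultaneous gradient descent update for regularized logistic
    regression. theta[0] (i.e. theta_0) is not regularized."""
    m = y.size
    h = sigmoid(X @ theta)                 # hypothesis for all m examples
    grad = (1.0 / m) * (X.T @ (h - y))     # unregularized partial derivatives
    grad[1:] += (lam / m) * theta[1:]      # add (lambda/m) * theta_j for j >= 1
    return theta - alpha * grad            # simultaneous update of all theta_j
```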

3. Advanced Optimization


Let's talk about how to get regularized logistic regression to work using the more advanced optimization methods. Just to remind you, for those methods what we need to do is define a cost function that takes as input the parameter vector θ and returns both the cost J(θ) and its gradient. Once again, in the equations we've been writing here we use zero-indexed vectors, so the parameters run from θ_0 up to θ_n.
[Figure 6: defining a cost function that returns the cost and gradient for use with an advanced optimization routine such as fminunc]
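The course demonstrates this with Octave's fminunc, but the same idea carries over to other tools: supply a routine that returns the regularized cost and its gradient, and let the optimizer choose the step sizes. Below is a minimal sketch in Python using scipy.optimize.minimize; the names cost_function, X, y, and lam are my own illustrative choices, not part of the course material:

```python
import numpy as np
from scipy.optimize import minimize

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost_function(theta, X, y, lam):
    """Regularized logistic regression cost J(theta) and its gradient.
    theta[0] corresponds to theta_0 and is not regularized."""
    m = y.size
    h = sigmoid(X @ theta)
    # Cross-entropy term plus the regularization term (which skips theta_0).
    j_val = (-1.0 / m) * (y @ np.log(h) + (1 - y) @ np.log(1 - h)) \
            + (lam / (2.0 * m)) * np.sum(theta[1:] ** 2)
    # Gradient: the usual term, plus (lambda/m) * theta_j for j >= 1.
    grad = (1.0 / m) * (X.T @ (h - y))
    grad[1:] += (lam / m) * theta[1:]
    return j_val, grad

# Tiny made-up dataset; X already contains the column of ones for x_0.
X = np.array([[1.0, 0.5], [1.0, 1.5], [1.0, 2.5], [1.0, 3.5]])
y = np.array([0.0, 0.0, 1.0, 1.0])
initial_theta = np.zeros(X.shape[1])

# jac=True tells minimize that cost_function returns (cost, gradient),
# so it plays the same role as fminunc does in the course.
result = minimize(cost_function, initial_theta, args=(X, y, 1.0),
                  jac=True, method="BFGS")
print("optimal theta:", result.x)
```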
