week1.5 Specialization overview

  1. How to connect reality with machine learning (i.e., build a model).

  2. Regression techniques: linear regression, ridge regression, lasso regression.
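
A minimal sketch of these three regressors, assuming scikit-learn (the course itself may use different tooling):

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))                    # 100 samples, 5 features
w_true = np.array([2.0, 0.0, -1.0, 0.0, 0.5])    # two features are irrelevant
y = X @ w_true + rng.normal(scale=0.1, size=100)

for model in (LinearRegression(), Ridge(alpha=1.0), Lasso(alpha=0.1)):
    model.fit(X, y)
    # Lasso's L1 penalty drives some coefficients exactly to zero,
    # while ridge's L2 penalty only shrinks them toward zero.
    print(type(model).__name__, np.round(model.coef_, 2))
```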

  3. Optimization techniques: gradient descent, coordinate descent.
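
As an illustration, gradient descent on the least-squares loss fits in a few lines of NumPy (a sketch, not the course's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + rng.normal(scale=0.1, size=100)

w = np.zeros(3)
step = 0.1
for _ in range(500):
    grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient of the mean squared error
    w -= step * grad                       # coordinate descent would instead
                                           # update one component of w at a time
print(np.round(w, 2))                      # close to w_true
```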

  4. Key concepts: loss functions, the bias-variance tradeoff, cross-validation.
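
For example, cross-validation is how you pick a regularization strength that balances bias against variance; a sketch assuming scikit-learn:

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.5, size=200)

# 5-fold cross-validation: large alpha adds bias, small alpha adds
# variance; pick the setting with the lowest estimated error.
for alpha in (0.01, 1.0, 100.0):
    mse = -cross_val_score(Ridge(alpha=alpha), X, y, cv=5,
                           scoring="neg_mean_squared_error").mean()
    print(alpha, round(mse, 3))
```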

  5. We then add kernels and decision trees, which let you handle complex non-linear features. We also cover optimization methods for applying these techniques at scale and for building ensembles of them, a technique called boosting.
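
A small sketch of both ideas on a toy non-linear dataset, again assuming scikit-learn:

```python
from sklearn.datasets import make_moons
from sklearn.svm import SVC
from sklearn.ensemble import GradientBoostingClassifier

# Two interleaved half-moons: no linear boundary separates them.
X, y = make_moons(n_samples=200, noise=0.2, random_state=0)

# An RBF kernel gives a linear method a non-linear decision boundary;
# gradient boosting builds an ensemble of shallow decision trees.
for model in (SVC(kernel="rbf"), GradientBoostingClassifier()):
    model.fit(X, y)
    print(type(model).__name__, round(model.score(X, y), 2))
```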

  6. We cover basic techniques like nearest neighbors as well as more advanced clustering techniques: mixtures of Gaussians and even latent Dirichlet allocation, an advanced text-analysis clustering technique. We also cover the algorithms that underpin these methods and how to scale them up with techniques like KD-trees, sampling, and expectation maximization.
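
A sketch of two of these pieces, assuming scikit-learn: a mixture of Gaussians fit by expectation maximization, and KD-tree-backed nearest-neighbor search:

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
# Two well-separated clusters in 2-D.
X = np.vstack([rng.normal(0, 1, size=(100, 2)),
               rng.normal(5, 1, size=(100, 2))])

# GaussianMixture runs expectation maximization under the hood.
gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
print(np.round(gmm.means_, 1))       # recovered cluster centers

# A KD-tree makes the nearest-neighbor distance queries scale.
nn = NearestNeighbors(n_neighbors=3, algorithm="kd_tree").fit(X)
dist, idx = nn.kneighbors(X[:1])
print(idx)                           # the query point's 3 nearest neighbors
```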

  7. The core concepts here are how to scale these methods up, how to measure their quality, and how to write them as distributed algorithms using techniques like MapReduce, which is implemented in systems like Hadoop that you might have learned about.
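
The MapReduce idea itself fits in a few lines; here is a single-machine word-count sketch (real systems like Hadoop distribute the map and reduce steps across machines):

```python
from collections import Counter
from functools import reduce

docs = ["the quick brown fox", "the lazy dog", "the quick dog"]

# Map: each document independently produces partial word counts.
mapped = [Counter(doc.split()) for doc in docs]

# Reduce: merge the partial counts; in Hadoop a shuffle phase would
# first group counts by key across machines.
total = reduce(lambda a, b: a + b, mapped, Counter())
print(total.most_common(3))
```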

  8. The final technical course focuses on matrix factorization and dimensionality reduction, which are widely applicable but especially useful for recommender systems, such as recommending products.

These include collaborative filtering, matrix factorization, PCA, and the underlying techniques for optimizing them, such as coordinate descent, eigendecomposition, and SVD.
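
For instance, a truncated SVD gives the low-rank factorization at the heart of many recommenders; a NumPy sketch on a hypothetical ratings matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical 6-user x 4-item ratings matrix (values 0-5).
R = rng.integers(0, 6, size=(6, 4)).astype(float)

# SVD factors R as U @ diag(s) @ Vt; keeping the top 2 singular
# values yields the rank-2 approximation used to predict ratings.
U, s, Vt = np.linalg.svd(R, full_matrices=False)
R_hat = (U[:, :2] * s[:2]) @ Vt[:2]
print(np.round(R_hat, 1))
```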
