
This is a collection of basic machine learning exercises to solidify your concepts and get a good understanding of the topics.


amancodeblast/ML-practice-exercise-octave


Machine-Learning-Coursera-Stanford

This is the repository for my implementations of the programming exercises from Stanford University's Machine Learning course on Coursera.

Taught by Andrew Ng

Linear Regression

Linear Regression Plot · Cost Function Plot
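The fit shown in the plot comes from minimizing the squared-error cost with batch gradient descent. A rough sketch of the idea, in Python rather than the course's Octave, with made-up data and hyperparameters:

```python
# Minimal sketch of univariate linear regression via batch gradient
# descent. The data, learning rate, and iteration count below are
# illustrative, not taken from the course exercises.

def cost(theta0, theta1, xs, ys):
    """Mean squared error cost J = 1/(2m) * sum((h(x) - y)^2)."""
    m = len(xs)
    return sum((theta0 + theta1 * x - y) ** 2 for x, y in zip(xs, ys)) / (2 * m)

def gradient_descent(xs, ys, alpha=0.1, iters=1000):
    theta0 = theta1 = 0.0
    m = len(xs)
    for _ in range(iters):
        # Simultaneous update of both parameters.
        err = [theta0 + theta1 * x - y for x, y in zip(xs, ys)]
        g0 = sum(err) / m
        g1 = sum(e * x for e, x in zip(err, xs)) / m
        theta0 -= alpha * g0
        theta1 -= alpha * g1
    return theta0, theta1

# Points lying on y = 2x + 1; gradient descent should recover the line.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]
t0, t1 = gradient_descent(xs, ys)
```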

Logistic Regression & Regularization

  • Logistic Regression & Regularization

  • Implemented logistic regression with the binary classification cost function and gradient descent.

  • Implemented L2 regularization on the cost function and gradient descent.

  • Example of logistic regression classification:

Logistic Regression Plot

  • Example of L2 regularization: a low regularization term (lambda=0) results in overfitting and a large one (lambda=100) in underfitting, while lambda=1 gives the trade-off between minimizing the mean squared error and keeping the weight values small, the best fit for the model.

Logistic Regression Plot with Lambda=0 Logistic Regression Plot with Lambda=100 Logistic Regression Plot with Lambda=1
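The regularized cost and gradient behind these plots can be sketched as follows. This is a Python illustration, not the course's Octave code; it follows the usual convention of not penalizing the bias term theta[0]:

```python
import math

# Sketch of regularized logistic regression cost and gradient.
# X is a list of rows with a leading 1.0 (bias feature); y holds 0/1 labels.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def cost_and_grad(theta, X, y, lam):
    """Return regularized cost J and its gradient; the bias is not penalized."""
    m = len(y)
    h = [sigmoid(sum(t * x for t, x in zip(theta, row))) for row in X]
    J = -sum(yi * math.log(hi) + (1 - yi) * math.log(1 - hi)
             for yi, hi in zip(y, h)) / m
    J += lam / (2 * m) * sum(t ** 2 for t in theta[1:])   # L2 penalty
    grad = []
    for j in range(len(theta)):
        g = sum((hi - yi) * row[j] for hi, yi, row in zip(h, y, X)) / m
        if j > 0:                       # regularization skips the bias weight
            g += lam / m * theta[j]
        grad.append(g)
    return J, grad

# Tiny check: at theta = 0 every prediction is 0.5, so J = log(2).
J, grad = cost_and_grad([0.0, 0.0], [[1.0, 0.0], [1.0, 1.0]], [0, 1], 1.0)
```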

Bias/Variance

  • Learning curve over linear regression.

  • Polynomial regression over the training data.

  • Learning curve over polynomial regression.

  • Validation curve over linear regression for lambda selection.
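A learning curve like the ones above is built by training on progressively larger subsets and measuring error on a fixed validation set. A minimal sketch, using a closed-form univariate least-squares fit and invented data:

```python
# Illustrative learning-curve sketch: fit on the first i training
# examples, then measure error on both that subset and the full
# validation set. Data here is made up for illustration.

def fit_line(xs, ys):
    """Least-squares intercept/slope for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def mse(a, b, xs, ys):
    """Half mean squared error, matching the course's J definition."""
    return sum((a + b * x - y) ** 2 for x, y in zip(xs, ys)) / (2 * len(xs))

train_x = [0, 1, 2, 3, 4, 5]
train_y = [0.1, 1.1, 1.9, 3.2, 3.9, 5.1]   # roughly y = x
val_x, val_y = [6, 7], [6.0, 7.2]

for i in range(2, len(train_x) + 1):
    a, b = fit_line(train_x[:i], train_y[:i])
    print(i, round(mse(a, b, train_x[:i], train_y[:i]), 4),
          round(mse(a, b, val_x, val_y), 4))
```

Plotting the two error columns against i gives the train/validation curves used to diagnose bias versus variance.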

Support Vector Machine

  • SVM

  • Implemented an SVM spam email classifier.

  • Example of SVM linear kernel, overfitted with a high C. Example of SVM Linear Kernel, overfitted with high C

  • Example of SVM Gaussian kernel. Example of SVM Gaussian Kernel

  • Example of fitting an SVM: the first image shows an underfit model (C=3, sigma=1); the second, an overfit model (C=0.3, sigma=0.03); and finally a good fit (C=0.3, sigma=0.1).

  • Increasing C pushes the optimization to minimize the error term of the cost function, while decreasing sigma in the Gaussian kernel makes the similarity fall off faster with distance from each landmark, allowing a tighter fit around individual data points.

SVM Underfit SVM overfit SVM good fit
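The sigma behavior described above comes from the Gaussian kernel itself. A small sketch (Python rather than the course's Octave; the points and sigma values are illustrative):

```python
import math

# Gaussian (RBF) kernel: similarity between two points, which the SVM
# uses as features relative to landmarks. Smaller sigma -> similarity
# drops off faster with distance -> tighter, more flexible boundaries.

def gaussian_kernel(x1, x2, sigma):
    """K(x1, x2) = exp(-||x1 - x2||^2 / (2 * sigma^2))."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x1, x2))
    return math.exp(-sq_dist / (2 * sigma ** 2))

# Identical points have similarity 1; shrinking sigma shrinks the
# similarity of nearby (but not identical) points.
same = gaussian_kernel([1.0, 2.0], [1.0, 2.0], 1.0)
near_wide = gaussian_kernel([1.0, 2.0], [2.0, 2.0], 2.0)
near_tight = gaussian_kernel([1.0, 2.0], [2.0, 2.0], 0.5)
```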

K-means clustering and PCA

  • K-means clustering and PCA

  • Implemented the K-Means and PCA algorithms.

  • Example of K-Means centroid progression and point assignment: the 1st iteration and the final 10th one. K-Means Iter=1 K-Means Iter=10

  • Example of PCA dimension reduction: eigenvectors over the data set, and the projection of the data points onto the reduced dimension. PCA dim reduction PCA projected

  • Example of K-Means and PCA over a data set: K-Means group assignment and PCA dimension reduction from 3-D to 2-D. 3-D 2-D
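One K-Means iteration alternates the two steps visible in the centroid-progression plots: assign each point to its nearest centroid, then move each centroid to the mean of its cluster. A minimal Python sketch (toy points; it assumes no cluster ends up empty):

```python
# One K-Means iteration on toy 2-D points: the assignment step followed
# by the centroid-update step. Real code repeats this until convergence.

def assign(points, centroids):
    """Index of the nearest centroid for each point (squared distance)."""
    def sq_dist(p, c):
        return sum((a - b) ** 2 for a, b in zip(p, c))
    return [min(range(len(centroids)), key=lambda k: sq_dist(p, centroids[k]))
            for p in points]

def update(points, labels, k):
    """Move each centroid to the mean of its assigned points.
    Assumes every cluster keeps at least one point."""
    centroids = []
    for j in range(k):
        cluster = [p for p, l in zip(points, labels) if l == j]
        centroids.append([sum(col) / len(cluster) for col in zip(*cluster)])
    return centroids

points = [[0.0, 0.0], [0.5, 0.0], [5.0, 5.0], [5.5, 5.0]]
labels = assign(points, [[0.0, 0.0], [5.0, 5.0]])
centroids = update(points, labels, 2)
```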

Anomaly Detection and Recommender Systems

  • Anomaly Detection and Recommender Systems

  • Implemented anomaly detection with a multivariate Gaussian distribution, and a recommender system.

  • Example of anomaly detection: this plot shows the Gaussian distribution and how the data points are placed; red circles mark the anomalies, where p(x; mu, covariance) < epsilon. Anomaly detection

  • Example of a recommender system: "This dataset consists of ratings on a scale of 1 to 5. The dataset has nu = 943 users, and nm = 1682 movies." The first plot shows the movie ratings per user; the second image shows how the system finds movies similar to the ones highly rated by the same user. This is done after implementing the regularized cost, the gradients, and the training of the model. Recommender Systems Recommender Systems
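The anomaly-detection criterion p(x) < epsilon can be sketched with an independent (diagonal-covariance) Gaussian model, which is how the first part of that exercise models each feature. This is a Python illustration; the data and epsilon below are made up:

```python
import math

# Anomaly detection sketch: fit a per-feature Gaussian (mean and
# variance), then flag points whose density falls below epsilon.

def fit(data):
    """Per-feature mean and (population) variance of the training set."""
    n = len(data)
    mu = [sum(row[j] for row in data) / n for j in range(len(data[0]))]
    var = [sum((row[j] - mu[j]) ** 2 for row in data) / n
           for j in range(len(data[0]))]
    return mu, var

def density(x, mu, var):
    """p(x) as a product of independent univariate Gaussian densities."""
    p = 1.0
    for xj, m, v in zip(x, mu, var):
        p *= math.exp(-(xj - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)
    return p

# Normal points cluster near (1, 1); a point at (5, 5) should be flagged.
data = [[1.0, 1.0], [1.1, 0.9], [0.9, 1.1], [1.0, 1.1]]
mu, var = fit(data)
is_anomaly = density([5.0, 5.0], mu, var) < 1e-3 < density([1.0, 1.0], mu, var)
```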
