This post will implement Multinomial Logistic Regression using the one-vs-rest multiclass approach. The Jupyter notebook contains a full collection of Python functions for the implementation, and an example problem showing image classification with the MNIST digits dataset.
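As a rough sketch of the one-vs-rest idea (not the notebook's actual code — function names, learning rate, and the toy data here are my own), we train one binary logistic classifier per class against all the others, then predict by taking the most confident classifier:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_binary(X, y, lr=0.3, iters=2000):
    # Gradient descent on the logistic cost for one binary (class vs. rest) problem
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        w -= lr * X.T @ (sigmoid(X @ w) - y) / len(y)
    return w

def fit_one_vs_rest(X, y, n_classes):
    # One binary classifier per class: class k labeled 1, everything else 0
    return np.array([fit_binary(X, (y == k).astype(float))
                     for k in range(n_classes)])

def predict(W, X):
    # Score every sample with every classifier; pick the most confident class
    return np.argmax(sigmoid(X @ W.T), axis=1)

# Toy 3-class problem: three well-separated 2D clusters, plus a bias column
X = np.array([[0, 0], [0.5, 0.2], [5, 0], [5.2, 0.3], [0, 5], [0.3, 5.1]])
Xb = np.hstack([np.ones((len(X), 1)), X])
y = np.array([0, 0, 1, 1, 2, 2])
W = fit_one_vs_rest(Xb, y, 3)
```

For K classes this trains K separate binary models; the MNIST example in the post reduces to the 2-class case, where a single classifier suffices.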
Machine Learning and Data Science: Logistic Regression Examples-1
This post will be mostly Python code with implementation and examples of the Logistic Regression theory we have been discussing in the last few posts. Examples include fitting 2-feature data with an arbitrary-order multinomial model and a simple 2-class image classification problem using the MNIST digits data.
Machine Learning and Data Science: Logistic Regression Implementation
In this post I’ll discuss evaluating the “goodness of fit” for a Logistic Regression model and implement the formulas in Python using numpy. We’ll work through an example to check the validity of the code.
Machine Learning and Data Science: Logistic and Linear Regression Regularization
In this post I will look at “Regularization” in order to address an important problem that is common with implementations, namely over-fitting. We’ll work through the derivations for both logistic regression and linear regression. After getting the equations for regularization worked out, we’ll look at an example in Python showing how this can be used for a badly over-fit linear regression model.
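For the linear-regression case, L2 regularization has a simple closed form: the penalty term adds a scaled identity to the normal equations, shrinking the parameters toward zero. A minimal sketch (my own illustration, not the post's code — and note the bias term is conventionally left unpenalized, which this brevity-first version skips):

```python
import numpy as np

def ridge_fit(X, y, lam):
    # Regularized least squares: w = (X^T X + lam*I)^(-1) X^T y
    # lam = 0 recovers ordinary least squares; larger lam shrinks w.
    # (For simplicity this penalizes every column, including any bias column.)
    n = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ y)

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
y = rng.normal(size=20)
w_plain = ridge_fit(X, y, 0.0)    # ordinary least squares
w_ridge = ridge_fit(X, y, 10.0)   # shrunken parameters
```

The shrinkage is what tames an over-fit model: large, wildly oscillating coefficients become expensive, so the fit is forced to be smoother.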
Machine Learning and Data Science: Logistic Regression Theory
Logistic regression is a widely used Machine Learning method for binary classification. It is also a good stepping stone for understanding Neural Networks. In this post I will present the theory behind it including a derivation of the Logistic Regression Cost Function gradient.
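The central result of that derivation is that the gradient of the cross-entropy cost takes the same compact form as in linear regression, with the sigmoid hypothesis substituted in. A quick numpy sketch of the two formulas (my own notation), verified against a finite-difference check:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(w, X, y):
    # Cross-entropy cost: -mean( y*log(h) + (1-y)*log(1-h) ), h = sigmoid(Xw)
    h = sigmoid(X @ w)
    return -np.mean(y * np.log(h) + (1 - y) * np.log(1 - h))

def grad(w, X, y):
    # The derived gradient: (1/m) * X^T (h - y)
    return X.T @ (sigmoid(X @ w) - y) / len(y)

# Sanity check the analytic gradient against central finite differences
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 3))
y = (rng.random(30) < 0.5).astype(float)
w = rng.normal(scale=0.1, size=3)
eps = 1e-6
numeric = np.array([(cost(w + eps * e, X, y) - cost(w - eps * e, X, y)) / (2 * eps)
                    for e in np.eye(3)])
```

A gradient check like this is a useful habit before handing any derived gradient to an optimizer.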
Machine Learning and Data Science: Linear Regression Part 6
This will be the last post in the Linear Regression series. We will look at the problems of over- and under-fitting data, along with non-linear feature variables.
Machine Learning and Data Science: Linear Regression Part 5
In this post I will present the matrix/vector form of the Linear Regression problem and derive the “exact” solution for the parameters.
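That “exact” solution is the normal equation, theta = (X^T X)^(-1) X^T y, which drops out of setting the gradient of the squared-error cost to zero. A minimal numpy sketch (my own, not the post's code):

```python
import numpy as np

def normal_equation(X, y):
    # Exact least-squares parameters: theta = (X^T X)^(-1) X^T y.
    # solve() is preferred over forming the inverse explicitly.
    return np.linalg.solve(X.T @ X, X.T @ y)

# On noiseless data the true parameters are recovered exactly
rng = np.random.default_rng(1)
X = np.hstack([np.ones((10, 1)), rng.normal(size=(10, 2))])  # bias + 2 features
true_theta = np.array([1.0, -2.0, 0.5])
y = X @ true_theta
theta = normal_equation(X, y)
```

Unlike gradient descent this needs no learning rate or iteration count, at the price of an O(n^3) solve that becomes expensive for many features.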
Machine Learning and Data Science: Linear Regression Part 4
In this post I’ll work up the data, analyze and visualize it, and run Gradient Descent for Linear Regression. It’s a Jupyter notebook, with all the Python code for the plots and functions available on my github account.
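The core of the gradient-descent loop is only a few lines: repeatedly step the parameters opposite the cost gradient. A bare-bones sketch under my own variable names (the notebook's version adds the plotting and bookkeeping):

```python
import numpy as np

def gradient_descent(X, y, lr=0.05, iters=2000):
    # Batch gradient descent on the squared-error cost:
    # theta <- theta - lr * (1/m) * X^T (X theta - y)
    theta = np.zeros(X.shape[1])
    for _ in range(iters):
        theta -= lr * X.T @ (X @ theta - y) / len(y)
    return theta

# Recover the slope of a noiseless line y = 2x (single feature, no bias)
X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0]])
y = 2.0 * X[:, 0]
theta = gradient_descent(X, y)
```

The learning rate `lr` is the knob to watch: too large and the iterates diverge, too small and convergence crawls, which is exactly what the cost-vs-iteration plots in the notebook make visible.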
Machine Learning and Data Science: Linear Regression Part 3
In Part 3 of this series on Linear Regression I will go into more detail about the Model and Cost function, including several graphs that will hopefully give insight into their nature and serve as a reference for developing algorithms in the next post.
Machine Learning and Data Science: Linear Regression Part 2
In Part 2 of this series on Linear Regression I will pull a data-set of house sale prices and “features” from Kaggle and explore the data in a Jupyter notebook with pandas and seaborn. We will extract a good subset of data to use for our example analysis of the linear regression algorithms.