The current JupyterHub version 2.5.1 does not allow user-installed extensions for JupyterLab when it is being served from JupyterHub. This should be remedied in version 3. However, even when this is “fixed”, it is still useful to be able to install extensions globally for all users on a multi-user system. This note will show you how.
How To Run Remote Jupyter Notebooks with SSH on Windows 10
Being able to run Jupyter Notebooks on remote systems adds tremendously to the versatility of your workflow. In this post I will show a simple way to do this by taking advantage of some nifty features of secure shell (ssh). What I’ll do is mostly OS-independent, but I am putting an emphasis on Windows 10 since many people are not familiar with tools like ssh on that OS.
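The post's full walkthrough isn't reproduced in this summary, but the core trick is an ssh local port forward. A minimal sketch, with the user name, host name, and port as placeholders:

    # on the remote machine: start the notebook server without opening a browser
    jupyter notebook --no-browser --port=8888

    # on the local Windows 10 machine (built-in OpenSSH client): forward local port 8888 to the remote server
    ssh -L 8888:localhost:8888 user@remote-host

    # then point the local browser at http://localhost:8888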
Beginning with Machine Learning and AI
I can’t think of a trending field of scientific research that has ever been better suited for “beginners” than Machine Learning and AI. Even though the field has been around for decades, it feels like day one. There is now a perfect convergence of resources to facilitate the learning and doing of Machine Learning.
Machine Learning and Data Science: Multinomial (Multiclass) Logistic Regression
This post will implement Multinomial Logistic Regression. The multiclass approach used will be one-vs-rest. The Jupyter notebook contains a full collection of Python functions for the implementation. An example problem is worked through showing image classification using the MNIST digits dataset.
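The post's own from-scratch functions aren't shown in this summary; as a rough illustration of the one-vs-rest idea it describes, here is a scikit-learn sketch on the small digits dataset bundled with scikit-learn (a stand-in for MNIST):

    # minimal one-vs-rest logistic regression sketch (not the post's from-scratch code)
    from sklearn.datasets import load_digits
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    X, y = load_digits(return_X_y=True)   # 8x8 digit images flattened to 64 features
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # one binary (one-vs-rest) logistic regression fit per digit class
    clf = LogisticRegression(multi_class='ovr', max_iter=1000)
    clf.fit(X_train, y_train)
    print('test accuracy:', clf.score(X_test, y_test))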
Machine Learning and Data Science: Logistic Regression Examples-1
This post will be mostly Python code with implementation and examples of the Logistic Regression theory we have been discussing in the last few posts. Examples include fitting 2-feature data using an arbitrary-order multinomial model and a simple 2-class image classification problem using the MNIST digits data.
Machine Learning and Data Science: Logistic Regression Implementation
In this post I’ll discuss evaluating the “goodness of fit” for a Logistic Regression model and do an implementation of the formulas in Python using numpy. We’ll look at an example to check the validity of the code.
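The post's code isn't included in this summary; a minimal numpy sketch of the standard cost (negative log-likelihood) and gradient it refers to might look like this, with X the m-by-n design matrix, y the 0/1 labels, and a the parameter vector:

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def cost(a, X, y):
        # negative log-likelihood averaged over the m training examples
        h = sigmoid(X @ a)
        return -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / y.size

    def gradient(a, X, y):
        # gradient of the cost with respect to the parameters a
        return X.T @ (sigmoid(X @ a) - y) / y.size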
Machine Learning and Data Science: Logistic and Linear Regression Regularization
In this post I will look at “Regularization” in order to address an important problem that is common with implementations, namely over-fitting. We’ll work through the equations for both logistic regression and linear regression. After getting the equations for regularization worked out, we’ll look at an example in Python showing how this can be used for a badly over-fit linear regression model.
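For reference, the regularization discussed adds an L2 penalty to the cost; in generic notation (not necessarily the post's), with parameters $a_j$ and regularization strength $\lambda$:

$$ J_{reg}(a) = J(a) + \frac{\lambda}{2m}\sum_{j=1}^{n} a_j^{2} $$

The same penalty term is added to the linear regression and logistic regression cost functions, with the bias term $a_0$ conventionally left out of the sum.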
Machine Learning and Data Science: Logistic Regression Theory
Logistic regression is a widely used Machine Learning method for binary classification. It is also a good stepping stone for understanding Neural Networks. In this post I will present the theory behind it including a derivation of the Logistic Regression Cost Function gradient.
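The derivation itself is in the post; the standard forms it arrives at, written here in generic notation with parameter vector $a$, are

$$ h_a(x) = \frac{1}{1 + e^{-a^{T}x}}, \qquad J(a) = -\frac{1}{m}\sum_{i=1}^{m}\Big[ y^{(i)}\log h_a\big(x^{(i)}\big) + \big(1-y^{(i)}\big)\log\big(1-h_a(x^{(i)})\big) \Big] $$

$$ \frac{\partial J}{\partial a_j} = \frac{1}{m}\sum_{i=1}^{m}\big( h_a(x^{(i)}) - y^{(i)} \big)\, x_j^{(i)} $$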
Machine Learning and Data Science: Linear Regression Part 6
This will be the last post in the Linear Regression series. We will look at the problems of over- and under-fitting data, along with using non-linear feature variables.
Machine Learning and Data Science: Linear Regression Part 5
In this post I will present the matrix/vector form of the Linear Regression problem and derive the “exact” solution for the parameters.
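The “exact” solution referred to is the normal equation; in generic matrix notation (which may differ from the post's), with design matrix $X$, target vector $y$, and parameter vector $a$:

$$ J(a) = \frac{1}{2m}\,(Xa - y)^{T}(Xa - y), \qquad \nabla_a J = 0 \;\Rightarrow\; a = \big(X^{T}X\big)^{-1}X^{T}y $$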