A few weeks ago I wrote a blog post titled “Should You Learn to Program with Python”. If you read that and decided the answer is yes, then this post is for you.
NVIDIA GTX 1080Ti Performance for Machine Learning — as Good as TitanX?
How good is the NVIDIA GTX 1080Ti for CUDA-accelerated Machine Learning workloads? About the same as the Titan X! I ran a Deep Neural Network training calculation on a million-image dataset using both the new GTX 1080Ti and a Titan X Pascal GPU and got very similar runtimes.
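The benchmark itself ran through Caffe under DIGITS, but the timing pattern generalizes. Below is a minimal sketch, using PyTorch purely as a stand-in framework (not what the post used), of how you might time training steps on a CUDA GPU; the model, batch size, and iteration count are all placeholder values:

    import time
    import torch
    import torch.nn as nn

    # Tiny placeholder model; the real benchmark trained a full
    # image-classification network on a million-image dataset.
    device = torch.device("cuda")
    model = nn.Sequential(
        nn.Conv2d(3, 64, kernel_size=3, padding=1),
        nn.ReLU(),
        nn.AdaptiveAvgPool2d(1),
        nn.Flatten(),
        nn.Linear(64, 1000),
    ).to(device)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.CrossEntropyLoss()

    x = torch.randn(32, 3, 224, 224, device=device)
    y = torch.randint(0, 1000, (32,), device=device)

    torch.cuda.synchronize()  # finish any pending GPU work before timing
    start = time.time()
    for _ in range(100):
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
    torch.cuda.synchronize()  # wait for all kernels before reading the clock
    print((time.time() - start) / 100 * 1000, "ms per training step")

The two synchronize calls matter: CUDA kernels launch asynchronously, so without them you would be timing kernel launches rather than the actual GPU work.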
Should You Learn to Program with Python
The short answer to that question is yes. If you want to know why, read on.
PCIe X16 vs X8 for GPUs when running cuDNN and Caffe
Does PCIe X16 give better performance than X8 for training models with Caffe when using cuDNN? Yes, but not by much!
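If you want to check what link width your own GPUs are actually negotiating, nvidia-smi can report it. A small sketch, assuming nvidia-smi is on your PATH (Python here just wraps the call):

    import subprocess

    # Report each GPU's current and maximum PCIe link width.
    result = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=name,pcie.link.width.current,pcie.link.width.max",
         "--format=csv"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout)

A card sitting in an X16 slot that is only running at X8 (common when slots share lanes with other devices) will show 8 in the current-width column.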
NVIDIA DIGITS with Caffe – Performance on Pascal multi-GPU
NVIDIA’s Pascal GPUs have twice the computational performance of the last generation. A great use for this compute capability is training deep neural networks. We have tested NVIDIA DIGITS 4 with Caffe on 1 to 4 Titan X and GTX 1070 cards. Training was for classification on a million-image dataset from ImageNet. Read on to see how it went.
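One useful way to read multi-GPU results like these is scaling efficiency: the speedup you measure divided by the ideal linear speedup. A quick sketch with placeholder runtimes (hypothetical values for illustration, not the measured results from the post):

    # Placeholder runtimes in minutes, for illustration only.
    def scaling_efficiency(t1, tn, n):
        """t1: 1-GPU runtime, tn: n-GPU runtime, n: number of GPUs."""
        return (t1 / tn) / n

    t1, t4 = 100.0, 30.0  # hypothetical values, not measured results
    print("speedup: %.2fx" % (t1 / t4))
    print("efficiency: %.0f%%" % (scaling_efficiency(t1, t4, 4) * 100))

Perfect scaling would be 100%; real training runs fall short of that as inter-GPU communication starts to eat into the compute time.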
NVIDIA DIGITS Install
If you are happy to use Ubuntu 14.04 LTS (Ubuntu-MATE in our case), then setting up a system with the NVIDIA DIGITS software stack is simple. I’ll give you some guidance on getting everything working, from the Linux install to the DIGITS web interface.
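Once everything is installed you can confirm that the DIGITS web interface is actually up. A minimal check, assuming the development server's default port of 5000 (adjust host and port to match your install):

    import urllib.request

    # Ping the DIGITS web UI; an HTTP 200 means the server is serving pages.
    url = "http://localhost:5000/"
    with urllib.request.urlopen(url, timeout=5) as resp:
        print("DIGITS responded with HTTP", resp.status)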
What is Machine Learning
Machine Learning is getting a lot of attention these days, and with good reason. There are mountains of data to work with, and the computing resources needed to handle these problems are readily attainable. Even a single GPU-accelerated workstation is capable of serious work.
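If "machine learning" sounds abstract, the core idea, fitting a model from example data and then checking it against data it has never seen, fits in a few lines. A minimal sketch using scikit-learn's built-in handwritten-digits dataset:

    from sklearn.datasets import load_digits
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    # 1797 8x8 grayscale images of handwritten digits, labeled 0-9.
    X, y = load_digits(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Fit on the training split, then score accuracy on held-out data.
    clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print("test accuracy:", clf.score(X_test, y_test))

That train-then-evaluate-on-held-out-data loop is the same pattern the GPU-accelerated deep learning work in the posts above follows, just at a vastly larger scale.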