
Deep Learning using MxNet from R on a CPU and a GPU

I have been learning about Statistical Machine Learning and parallel processing for many years now, and I have been discussing both ideas with students. In preparing class materials and examples, I have been trying to implement Machine Learning algorithms from R.

This academic year I was able to get hold of an Nvidia GeForce GTX 1070, and I have finally been able to get it installed and configured on a Linux machine running Ubuntu.

I have installed the Nvidia drivers so the video card can be used, and I have installed MxNet to run some deep learning examples. I was very happy to find that there is an R package that makes it possible to call MxNet from R.
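
As a quick check that everything is wired up, something like the snippet below (a minimal sketch, assuming a GPU-enabled build of the mxnet package) loads the package, creates a CPU and a GPU context, and runs a tiny computation on each device:

    library(mxnet)

    # Device contexts; mx.gpu() assumes mxnet was built with GPU support
    cpu.dev <- mx.cpu()
    gpu.dev <- mx.gpu(0)   # first GPU, the GTX 1070 here

    # A tiny computation on each device to confirm both contexts work
    a <- mx.nd.ones(c(2, 2), ctx = cpu.dev)
    b <- mx.nd.ones(c(2, 2), ctx = gpu.dev)
    print(as.array(a + a))
    print(as.array(b + b))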

I have run a few examples. The main one I have been trying out is the Handwritten Digit Recognition (MNIST) example from the MxNet website.
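
The heart of that example defines a small multilayer perceptron with the symbolic API and trains it with mx.model.FeedForward.create. A sketch of it, assuming train.x and train.y already hold the flattened 28x28 digit images and their labels as the tutorial prepares them:

    require(mxnet)

    # Network definition: a small multilayer perceptron
    data <- mx.symbol.Variable("data")
    fc1  <- mx.symbol.FullyConnected(data, name = "fc1", num_hidden = 128)
    act1 <- mx.symbol.Activation(fc1, name = "relu1", act_type = "relu")
    fc2  <- mx.symbol.FullyConnected(act1, name = "fc2", num_hidden = 64)
    act2 <- mx.symbol.Activation(fc2, name = "relu2", act_type = "relu")
    fc3  <- mx.symbol.FullyConnected(act2, name = "fc3", num_hidden = 10)
    softmax <- mx.symbol.SoftmaxOutput(fc3, name = "sm")

    # Train on the CPU; train.x is an n x 784 matrix of pixel values,
    # train.y the digit labels 0-9
    mx.set.seed(0)
    model <- mx.model.FeedForward.create(
      softmax,
      X = train.x, y = train.y,
      ctx = mx.cpu(),
      num.round = 10, array.batch.size = 100,
      learning.rate = 0.07, momentum = 0.9,
      eval.metric = mx.metric.accuracy
    )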

Just to show that it is working, I have recorded a couple of asciinema videos.

These videos are not especially exciting. The CPU video just shows nmon, demonstrating that the MxNet deep learning algorithm is running in parallel across the cores, and the GPU video (around 3 minutes in) shows watch nvidia-smi, demonstrating that the MxNet deep learning algorithm is using the GPU.
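
The switch between the two runs is essentially a one-argument change: the ctx parameter of mx.model.FeedForward.create selects the device. Reusing the symbol and data from the sketch above:

    # Identical call, but trained on the first GPU instead of the CPU
    model <- mx.model.FeedForward.create(
      softmax,
      X = train.x, y = train.y,
      ctx = mx.gpu(0),
      num.round = 10, array.batch.size = 100,
      learning.rate = 0.07, momentum = 0.9,
      eval.metric = mx.metric.accuracy
    )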

I consider this an interesting starting point for developing further course materials and discussion around parallel processing, machine learning, and GPU computing.
