Neural Networks Simplified Part 1

This post begins a series of tutorials on neural networks. It starts very simply, with a linear model and several methods for fitting the linear model to a set of data. The tutorial is in the form of a jupyter notebook, which you can download.
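As a rough illustration of the kind of fitting the tutorial covers, here is a minimal sketch comparing two ways to fit a line with numpy; the data, learning rate, and iteration count are my own assumptions, not taken from the notebook:

```python
import numpy as np

# Hypothetical noisy linear data (not from the tutorial).
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=x.shape)

X = np.column_stack([x, np.ones_like(x)])  # design matrix with bias column

# Method 1: closed-form least squares.
w_ls, *_ = np.linalg.lstsq(X, y, rcond=None)

# Method 2: batch gradient descent on the mean squared error.
w_gd = np.zeros(2)
lr = 0.01
for _ in range(5000):
    grad = 2.0 * X.T @ (X @ w_gd - y) / len(y)  # gradient of MSE
    w_gd -= lr * grad

# Both methods should recover a slope near 2 and an intercept near 1.
```

With enough iterations the gradient-descent solution matches the closed-form one; the notebook presumably contrasts such methods before moving to neural networks.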

Comparing Numpy, Pytorch, and autograd on CPU and GPU

by Chuck Anderson, Pattern Exploration. This post is available for download as a jupyter notebook. Table of Contents: Very Brief Introduction to Autograd; Using Numpy to Fit a Polynomial to Data; Now, with Pytorch; Pytorch with Autograd; Pytorch with autograd on GPU; Wrapped up in one [...]
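The table of contents includes "Using Numpy to Fit a Polynomial to Data"; that step might look roughly like the sketch below, with hand-written gradients standing in for what autograd or pytorch would compute automatically. The data and hyperparameters here are assumptions, not taken from the post:

```python
import numpy as np

# Assumed data: noisy samples of a cubic (not taken from the post).
rng = np.random.default_rng(1)
x = np.linspace(-1, 1, 100)
y = 0.5 * x**3 - x + 0.2 + rng.normal(scale=0.05, size=x.shape)

degree = 3
X = np.vander(x, degree + 1)  # columns: x^3, x^2, x, 1
w = np.zeros(degree + 1)

# Gradient descent with a manually derived MSE gradient; this is the
# bookkeeping that autograd (or pytorch's autograd) removes.
lr = 0.1
for _ in range(20000):
    grad = 2.0 * X.T @ (X @ w - y) / len(y)
    w -= lr * grad
```

The pytorch versions the post lists would replace the hand-derived `grad` with a call to automatic differentiation, and the GPU variant would move the tensors to a CUDA device.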

Numpy versus Pytorch

by Chuck Anderson, Pattern Exploration. This post compares the accuracy and computation time of training simple fully-connected neural networks, implemented in numpy and in pytorch, on the MNIST data set. The Adam optimization algorithm in numpy and pytorch is compared, as well as the Scaled Conjugate Gradient optimization algorithm [...]
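For reference, the Adam update rule that such a comparison benchmarks can be written in a few lines of numpy. This is a generic sketch with the commonly used default hyperparameters, not the post's implementation:

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update; m and v are running moment estimates, t counts from 1."""
    m = beta1 * m + (1 - beta1) * grad      # first moment (mean of gradients)
    v = beta2 * v + (1 - beta2) * grad**2   # second moment (uncentered variance)
    m_hat = m / (1 - beta1**t)              # bias-corrected estimates
    v_hat = v / (1 - beta2**t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# Toy check: minimize f(w) = w^2, whose gradient is 2w.
w, m, v = np.array([5.0]), np.zeros(1), np.zeros(1)
for t in range(1, 3001):
    w, m, v = adam_step(w, 2.0 * w, m, v, t, lr=0.05)
```

A pytorch run would instead use `torch.optim.Adam`, so a numpy-versus-pytorch timing comparison largely measures the surrounding tensor machinery rather than the update rule itself.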

Fast Reinforcement Learning After Pretraining

by Chuck Anderson, Pattern Exploration. We presented the following paper at IJCNN 2015, where it won the Best Paper Award: Anderson, C., Lee, M., and Elliott, D., "Faster Reinforcement Learning After Pretraining Deep Networks to Predict State Dynamics", Proceedings of the IJCNN, 2015, Killarney, Ireland. Abstract: Deep learning algorithms have recently [...]