Organizing my Neural Network Codes

Amazing progress has been made in deep learning. I have been using TensorFlow for a while now. I started out with tf0.6, then upgraded to tf0.12, and then to tf1.0. The latest version is tf1.10, which is supposed to provide a stable API. I have a lot of code that has now become incompatible. tf0.6's saver … Continue reading Organizing my Neural Network Codes
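
For reference, here is a minimal sketch of the tf1.x-era checkpoint workflow that this kind of migration targets; the variable name and checkpoint path are made up for illustration. The initializer call itself is an example of the churn the post describes: in tf0.x it was tf.initialize_all_variables(), renamed around tf0.12.

```python
import tensorflow as tf

# Hypothetical variable; tf.train.Saver picks up all graph variables by default.
w = tf.get_variable("w", shape=[2, 2], initializer=tf.zeros_initializer())
saver = tf.train.Saver()

with tf.Session() as sess:
    # In tf0.x this was tf.initialize_all_variables(); renamed in tf0.12.
    sess.run(tf.global_variables_initializer())
    save_path = saver.save(sess, "./model.ckpt")  # write a checkpoint
    saver.restore(sess, save_path)                # load it back
```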

Convolutional Networks

Continuing further with Deep Learning, here I will briefly describe what I learned about convolutional networks (CNNs). If you understand the basics of a simple 2-layer (fully connected) network and can implement it yourself from scratch, you are all set to understand the mighty daddy (i.e., the CNN). Again, it is important to understand that CNN, … Continue reading Convolutional Networks
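
As a minimal sketch (my illustration, not the post's code) of what a convolutional layer computes, here is a naive "valid" 2D convolution in plain NumPy; the image and kernel are toy placeholders:

```python
import numpy as np

def conv2d(image, kernel):
    """Naive 'valid' 2D cross-correlation, the core op of a CNN layer.

    image:  (H, W) array
    kernel: (kH, kW) array
    returns: (H - kH + 1, W - kW + 1) array
    """
    kH, kW = kernel.shape
    H, W = image.shape
    out = np.zeros((H - kH + 1, W - kW + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Each output pixel is a dot product of the kernel with an image
            # patch; the same weights slide everywhere (weight sharing), which
            # is what distinguishes a CNN layer from a fully connected one.
            out[i, j] = np.sum(image[i:i + kH, j:j + kW] * kernel)
    return out

# Toy usage: a 3x3 vertical-edge-style kernel on a random "image".
img = np.random.rand(8, 8)
k = np.array([[1, 0, -1], [1, 0, -1], [1, 0, -1]], dtype=float)
print(conv2d(img, k).shape)  # (6, 6)
```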

Deep Learning Overview

View my Deep Learning Overview: [Google Slides]. Deep Learning Research Projects: [Google Slides]. Beware, these things get out of date very quickly. This presentation is from Oct 2016. The outline of the talk: Toy Neural Network; Loss Function; Stochastic Gradient Descent; Forward-pass (Neural Function Evaluation); Backward-pass (Gradient of Neural Function w.r.t. Params); Recent … Continue reading Deep Learning Overview
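
To make the outline concrete, here is a toy from-scratch network in NumPy covering the forward pass, a squared-error loss, the backward pass, and a (full-batch) gradient-descent update. The layer sizes, learning rate, and toy target are my arbitrary choices, not the example from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: regress y = x1 * x2.
X = rng.standard_normal((64, 2))
y = X[:, :1] * X[:, 1:2]

# Parameters of a 1-hidden-layer network.
W1 = rng.standard_normal((2, 8)) * 0.5
b1 = np.zeros(8)
W2 = rng.standard_normal((8, 1)) * 0.5
b2 = np.zeros(1)

lr = 0.1
for step in range(500):
    # Forward pass: evaluate the network and the squared-error loss.
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    loss = np.mean((pred - y) ** 2)

    # Backward pass: gradients of the loss w.r.t. each parameter (chain rule).
    dpred = 2 * (pred - y) / len(X)
    dW2 = h.T @ dpred
    db2 = dpred.sum(axis=0)
    dh = dpred @ W2.T
    dz = dh * (1 - h ** 2)  # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ dz
    db1 = dz.sum(axis=0)

    # Gradient-descent update (full-batch here for brevity; SGD would
    # use a random mini-batch of rows instead).
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("final loss:", loss)
```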

Neural Network as Universal Approximators: Intuitive Explanation

Came across this wonderful explanation of why neural networks with a hidden layer are universal approximators. Although not very helpful for practical purposes, it gives an intuitive feel for why neural networks give reasonable results. The basic idea is to analyze a sigmoid function as you change w and b. In particular, the effect on $latex \sigma(w \times x … Continue reading Neural Network as Universal Approximators: Intuitive Explanation
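
A small numeric sketch of that idea (my illustration, not the linked explanation's code): as w grows, σ(w·x + b) sharpens into a step located at x = -b/w, and differences of such steps form the "bumps" used in the universal-approximation argument.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.linspace(-2, 2, 9)

# Large w makes sigmoid(w*x + b) approach a step function; choosing b = -w*s
# places the step at x = s (here s = 0.5 for the last pair).
for w, b in [(1, 0), (10, 0), (100, -50)]:
    print(f"w={w:4d}, b={b:4d}:", np.round(sigmoid(w * x + b), 3))
```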