Batch Normalization
Normalizing the inputs to your network is a well-established technique for improving its convergence properties. A few years ago, a technique known as batch normalization was proposed...
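As a taste of the idea behind the post, here is a minimal NumPy sketch of the batch-normalization forward pass (not TensorFlow's own implementation): each feature is normalized over the batch dimension, then scaled and shifted by learnable parameters gamma and beta.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Per-feature statistics computed over the batch dimension.
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    # Normalize, then apply the learnable scale (gamma) and shift (beta).
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

x = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
out = batch_norm(x, gamma=np.ones(2), beta=np.zeros(2))
```

With gamma = 1 and beta = 0, each column of the output has (approximately) zero mean and unit variance.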
Data Augmentation in TensorFlow
This post is a comprehensive review of data augmentation techniques for deep learning, specific to images. Data augmentation is a regularization technique. It consists of generating new training...
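To illustrate the idea, here is a small NumPy sketch of two common image augmentations, random horizontal flip and random crop (with reflection padding). This is an illustrative stand-in, not the TensorFlow pipeline the post covers.

```python
import numpy as np

def augment(image, rng):
    # Random horizontal flip with probability 0.5.
    if rng.random() < 0.5:
        image = image[:, ::-1, :]
    # Pad by 4 pixels on each side, then crop back to the original size
    # at a random offset, so the network sees shifted views of the image.
    pad = 4
    padded = np.pad(image, ((pad, pad), (pad, pad), (0, 0)), mode="reflect")
    top = rng.integers(0, 2 * pad + 1)
    left = rng.integers(0, 2 * pad + 1)
    h, w = image.shape[:2]
    return padded[top:top + h, left:left + w, :]

rng = np.random.default_rng(0)
img = np.zeros((32, 32, 3))
aug = augment(img, rng)
```

The augmented image has the same shape as the input, so it can be fed to the model unchanged.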
Weight Initialization Schemes - Xavier (Glorot) and He
When you are working with deep neural networks, initializing the network with the right weights can be hard to get right, because deep neural networks suffer from problems such as...
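The two schemes named in the title can be sketched in a few lines of NumPy. Xavier (Glorot) initialization draws uniformly from a range scaled by both fan-in and fan-out; He initialization draws from a Gaussian scaled by fan-in only, which suits ReLU activations. This is a minimal sketch, not the framework implementations.

```python
import numpy as np

def xavier_init(fan_in, fan_out, rng):
    # Glorot uniform: limit = sqrt(6 / (fan_in + fan_out)).
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

def he_init(fan_in, fan_out, rng):
    # He normal: std = sqrt(2 / fan_in), intended for ReLU layers.
    std = np.sqrt(2.0 / fan_in)
    return rng.normal(0.0, std, size=(fan_in, fan_out))

rng = np.random.default_rng(0)
W_xavier = xavier_init(100, 50, rng)
W_he = he_init(100, 50, rng)
```

Both keep the variance of activations roughly constant from layer to layer, which is what prevents signals from vanishing or exploding early in training.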
Reading a CSV File Using the TensorFlow Data API and Splitting Tensors into Training and Test Sets for an LSTM
There might be times when all of your data lives in one huge CSV file and you need to feed it into TensorFlow while, at the same time, you...
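The core preprocessing steps the post addresses can be sketched without TensorFlow: slicing a series into fixed-length windows for an LSTM, then splitting chronologically into training and test sets. A minimal NumPy sketch (the actual post uses the tf.data API):

```python
import numpy as np

def make_windows(series, window):
    # Turn a 1-D series into (window, next-value) pairs for an LSTM.
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X, y

def train_test_split(X, y, test_ratio=0.2):
    # Split chronologically (no shuffling) so the test set is strictly
    # later in time than the training set.
    split = int(len(X) * (1 - test_ratio))
    return X[:split], X[split:], y[:split], y[split:]

series = np.arange(10.0)
X, y = make_windows(series, window=3)
X_tr, X_te, y_tr, y_te = train_test_split(X, y)
```

Splitting without shuffling matters for sequence data: shuffling first would leak future values into the training set.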
Some Basic Activation Functions
Activation functions help achieve non-linearity in deep learning models. If we don't use these non-linear activation functions, a neural network would not be able to solve the complex...
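Three of the basic activation functions such a post typically covers can be written directly from their definitions. A small NumPy sketch:

```python
import numpy as np

def relu(x):
    # max(0, x): zero for negative inputs, identity for positive ones.
    return np.maximum(0.0, x)

def sigmoid(x):
    # Squashes any real input into the range (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Squashes any real input into the range (-1, 1), centered at zero.
    return np.tanh(x)
```

Because each of these is non-linear, stacking layers that use them lets a network represent functions no single linear layer could.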