How to use embedding layer and other feature columns together in a network using Keras?
Why should you use an embedding layer? One-hot encoding is a commonly used method for converting a categorical input variable into numeric indicator variables. For every level present, one new variable...
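A minimal sketch of the idea in this post, assuming the Keras functional API and made-up input sizes (10 categorical levels, 3 numeric columns, embedding dimension 4):

```python
import numpy as np
from tensorflow.keras.layers import Input, Embedding, Flatten, Concatenate, Dense
from tensorflow.keras.models import Model

n_levels = 10   # assumed number of levels of the categorical variable
embed_dim = 4   # assumed embedding size

# Two inputs: one integer-coded categorical column, three numeric columns.
cat_in = Input(shape=(1,), name="category")
num_in = Input(shape=(3,), name="numeric_features")

# Embed the categorical input, then flatten (batch, 1, embed_dim) -> (batch, embed_dim).
emb = Flatten()(Embedding(input_dim=n_levels, output_dim=embed_dim)(cat_in))

# Concatenate the learned embedding with the other feature columns.
x = Concatenate()([emb, num_in])
out = Dense(1, activation="sigmoid")(x)

model = Model(inputs=[cat_in, num_in], outputs=out)
model.compile(optimizer="adam", loss="binary_crossentropy")
```

The embedding contributes `n_levels * embed_dim = 40` parameters and the dense head `(4 + 3) * 1 + 1 = 8`, 48 in total.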
Poisson Regression in TensorFlow
In most classification problems, we have a binary response variable. Now, let's assume that the response can only take non-negative integer values, i.e., 0, 1, 2, .... Although it is very similar...
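A minimal sketch of Poisson regression in Keras, on synthetic data I assume here for illustration: a log link (exponential inverse link) with the built-in Poisson loss.

```python
import numpy as np
import tensorflow as tf

# Synthetic data (assumed): Poisson counts with rate exp(X @ beta).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2)).astype("float32")
true_beta = np.array([0.5, -0.25], dtype="float32")
y = rng.poisson(np.exp(X @ true_beta)).astype("float32")

# Linear predictor with exponential activation, so the output is a positive rate.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(1, activation=tf.exp)
])
# Keras' "poisson" loss is the negative Poisson log-likelihood up to constants.
model.compile(optimizer="adam", loss="poisson")
model.fit(X, y, epochs=5, verbose=0)
```

The exponential activation guarantees non-negative predicted rates, matching the non-negative integer response described above.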
How to get total number of parameters in TensorFlow
This is a function which gives the total number of parameters in TensorFlow: #TOTAL NUMBER OF PARAMETERS total_parameters = 0 for variable in graph.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES): # shape is an array of...
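A completed sketch of the snippet above (the loop body past the excerpt is my assumption): for each trainable variable in a TF1-style graph, multiply out its shape and sum.

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()  # TF1-style graph mode, matching the snippet

def count_parameters(graph):
    total_parameters = 0
    for variable in graph.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES):
        # shape is an array of dimensions; multiply them out per variable
        variable_parameters = 1
        for dim in variable.get_shape():
            variable_parameters *= int(dim)
        total_parameters += variable_parameters
    return total_parameters
```

For example, a graph holding a `(3, 4)` weight matrix and a `(4,)` bias reports `3 * 4 + 4 = 16` parameters.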
Function to create batches of data
This is a basic function to create batches from a data set. import math import numpy as np def miniBatch(x, y, batchSize): numObs = x.shape[0] batches = [] batchNum =...
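A completed sketch of the `miniBatch` helper above; the excerpt cuts off at `batchNum`, which I assume is the ceiling of the number of observations over the batch size. Consecutive slices are returned, with the last batch possibly smaller.

```python
import math
import numpy as np

def miniBatch(x, y, batchSize):
    numObs = x.shape[0]
    batches = []
    # Assumed: number of batches is ceil(numObs / batchSize)
    batchNum = math.ceil(numObs / batchSize)
    for i in range(batchNum):
        start = i * batchSize
        end = min(start + batchSize, numObs)
        batches.append((x[start:end], y[start:end]))
    return batches
```

For 10 observations and `batchSize=3`, this yields four batches of sizes 3, 3, 3, and 1.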
Relation between Maximum Likelihood and KL-Divergence
Maximizing likelihood is equivalent to minimizing KL-Divergence. Assuming that $P( \cdot \mid \theta^{*})$ is the true distribution and $P(\cdot \mid \theta)$ is our estimate, we already know that KL-Divergence is written...
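The equivalence can be sketched in one line, using the same notation as above:

$$
D_{\mathrm{KL}}\big(P(\cdot \mid \theta^{*}) \,\|\, P(\cdot \mid \theta)\big)
= \mathbb{E}_{x \sim P(\cdot \mid \theta^{*})}\!\left[\log \frac{P(x \mid \theta^{*})}{P(x \mid \theta)}\right]
= \underbrace{\mathbb{E}\big[\log P(x \mid \theta^{*})\big]}_{\text{independent of } \theta} - \mathbb{E}\big[\log P(x \mid \theta)\big].
$$

Since the first term does not depend on $\theta$, minimizing the KL-Divergence over $\theta$ is the same as maximizing $\mathbb{E}\big[\log P(x \mid \theta)\big]$, whose empirical estimate is the average log-likelihood of the data.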