KL divergence (Kullback-Leibler divergence), also called KL distance, is a non-symmetric measure of the difference between two probability distributions. It is related to mutual information and can be used to measure the association between two random variables. In this short tutorial, I show how to compute KL divergence and mutual information for two categorical variables, interpreted as discrete random variables.

${\bf Definition}$: Kullback-Leibler (KL) Distance...
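As a minimal sketch of the computation described above (the function names and the use of NumPy are my own choices, not from the tutorial): KL divergence for two discrete distributions is $\sum_i p_i \log(p_i / q_i)$, and mutual information is the KL divergence between the joint distribution $p(x, y)$ and the product of its marginals $p(x)\,p(y)$.

```python
import numpy as np

def kl_divergence(p, q):
    """KL divergence D(p || q) between two discrete distributions.

    p and q are arrays of probabilities over the same categories.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p_i = 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def mutual_information(joint):
    """Mutual information I(X; Y) from a joint probability table.

    Computed as D( p(x,y) || p(x) p(y) ), i.e. the KL divergence
    between the joint and the product of its marginals.
    """
    joint = np.asarray(joint, dtype=float)
    joint = joint / joint.sum()          # normalize counts to probabilities
    px = joint.sum(axis=1, keepdims=True)  # marginal of X (rows)
    py = joint.sum(axis=0, keepdims=True)  # marginal of Y (columns)
    return kl_divergence(joint.ravel(), (px * py).ravel())
```

For an independent joint table the mutual information is 0, since the joint equals the product of the marginals; for a perfectly dependent table such as `[[0.5, 0], [0, 0.5]]` it equals $\log 2$.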