Restricted Boltzmann Machines (RBMs) are an unsupervised machine learning method that contributed significantly to the early rise of deep learning in 2006, through the paper "A Fast Learning Algorithm for Deep Belief Networks". Since then, they have become an important member of the deep learning family. Interestingly, RBMs combine ideas from both probabilistic graphical models and neural networks. This post aims to provide an introduction to RBMs. Most of the formulas here are from .
Boltzmann Machines are energy-based models, in which the joint probability distribution is characterized by assigning a scalar energy to each configuration of the variables. In energy-based models, inference consists of clamping the values of a set of observed variables and finding configurations of the remaining variables that minimize the energy function; learning consists of finding an energy function that assigns low energy to observed configurations. Boltzmann Machines are also probabilistic graphical models, using a graph-based representation to factorize the probability distribution. Restricted Boltzmann Machines are a type of Boltzmann Machine with a special constraint: only a certain type of connection is allowed.
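To make the energy-based view concrete, here is a minimal sketch in NumPy. It uses the standard RBM energy function E(v, h) = -bᵀv - cᵀh - vᵀWh (discussed in detail later in the post); the variable names `W`, `b`, `c` and all sizes are illustrative, and the brute-force search over hidden configurations stands in for proper inference, which is only feasible at this toy scale.

```python
import numpy as np

rng = np.random.default_rng(0)
n_visible, n_hidden = 6, 3

# Illustrative parameters: visible-hidden weights and bias vectors.
W = rng.normal(scale=0.1, size=(n_visible, n_hidden))
b = np.zeros(n_visible)  # visible biases
c = np.zeros(n_hidden)   # hidden biases

def energy(v, h):
    """Scalar energy assigned to one configuration (v, h)."""
    return -(b @ v) - (c @ h) - (v @ W @ h)

# "Inference" in the energy-based sense: clamp the visible variables v,
# then search the remaining (hidden) variables for a low-energy setting.
v = rng.integers(0, 2, size=n_visible)
best_h = min(
    (np.array(bits) for bits in np.ndindex(*(2,) * n_hidden)),
    key=lambda h: energy(v, h),
)
```

Learning, by contrast, would adjust `W`, `b`, and `c` so that configurations seen in training data receive low energy; that is the gradient-based procedure developed in the following sections.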
This post starts by introducing energy-based models, including their graphical representation and their learning via gradient-based maximization of the log-likelihood. It then discusses Boltzmann Machines by plugging a specific energy function into the energy-based framework. Finally, Restricted Boltzmann Machines are obtained by restricting the connections allowed in Boltzmann Machines.