On Chow-Liu Forest Based Regularization of Deep Belief Networks

A. Sarishvili, A. Wirsen, M. Jirstrand. In Proceedings of the 28th International Conference on Artificial Neural Networks 2019, Munich, Germany, 17-19 September, 2019.


In this paper we introduce a methodology for integrating almost-independence information about the visible (input) variables of restricted Boltzmann machines (RBMs) into the weight-decay regularization of the contrastive divergence and stochastic gradient descent algorithms. After identifying almost-independent clusters of the input coordinates via Chow-Liu tree and forest estimation, an RBM regularization strategy is constructed. We demonstrate the approach on a sparse two-hidden-layer Deep Belief Network (DBN) applied to the MNIST classification problem. Performance is quantified by the misclassification rate and a measure of manifold disentanglement, and the approach is benchmarked against the full model.
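The clustering step described above can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: it estimates pairwise mutual information between discretized input coordinates, builds a maximum spanning tree over the MI graph (the Chow-Liu tree), and cuts edges whose mutual information falls below a chosen threshold, turning the tree into a forest whose connected components are the almost-independent clusters. The function name `chow_liu_forest`, the binning scheme, and the threshold value are all assumptions for the sketch.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree, connected_components


def chow_liu_forest(X, n_bins=8, mi_threshold=0.1):
    """Group input coordinates into almost-independent clusters.

    Hypothetical sketch: pairwise mutual information (MI) between
    discretized columns of X -> maximum spanning tree (Chow-Liu tree)
    -> drop edges with MI below `mi_threshold` -> the connected
    components of the resulting forest are the clusters.
    """
    n, d = X.shape
    # Discretize each column into equal-width bins for MI estimation.
    binned = np.stack(
        [np.digitize(X[:, j],
                     np.histogram_bin_edges(X[:, j], bins=n_bins)[1:-1])
         for j in range(d)],
        axis=1,
    )

    def mutual_info(a, b):
        # Plug-in MI estimate (in nats) from the empirical joint histogram.
        joint = np.histogram2d(a, b, bins=n_bins)[0] / n
        pa, pb = joint.sum(axis=1), joint.sum(axis=0)
        nz = joint > 0
        return float((joint[nz] * np.log(joint[nz] / np.outer(pa, pb)[nz])).sum())

    mi = np.zeros((d, d))
    for i in range(d):
        for j in range(i + 1, d):
            mi[i, j] = mutual_info(binned[:, i], binned[:, j])

    # Maximum spanning tree via negated weights (Chow-Liu tree).
    mst = -minimum_spanning_tree(-mi).toarray()
    # Cut weak dependencies so the tree becomes a forest.
    forest = mst * (mst >= mi_threshold)
    _, labels = connected_components(forest, directed=False)
    return labels  # one cluster label per input coordinate
```

The resulting `labels` array could then drive a block-structured weight-decay penalty, regularizing RBM weights that connect visible units from different (almost independent) clusters more strongly than within-cluster weights.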
