Explora I+D+i UPV


Mask selective regularization for restricted Boltzmann machines

Year

Journal

Neurocomputing

Abstract

In the present work, we address two important issues regarding the learning capabilities of RBMs: first, the topology of the input space, and second, the sparseness of the resulting RBM. One problem of RBMs is that they do not take advantage of the topology of the input space. To alleviate this shortcoming, we propose to use a surrogate of the mutual information of the input representation space to build a set of binary masks. This approach is general and not restricted to images, so it can be extended to other layers in the standard layer-by-layer unsupervised learning. In addition, we propose a selective application of two different regularization terms, L1 and L2, in order to ensure the sparseness of the representation and the generalization capabilities. Another interesting capability of our approach is the adaptation of the network topology during the learning phase by selecting the set of binary masks that best fits the current weight configuration. The performance of these new ideas is assessed with a set of experiments on several well-known corpora. (C) 2015 Elsevier B.V. All rights reserved.
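The sketch below illustrates the general idea described in the abstract: a binary mask over the visible-hidden weight matrix, derived from a surrogate of the mutual information between input units, is used to apply L1 and L2 regularization selectively during contrastive-divergence training. It is a minimal NumPy illustration, not the paper's exact method; the correlation-based mask construction, the assignment of L2 to masked-in weights and L1 to masked-out weights, and all names and hyperparameters (build_mask, cd1_step, threshold, l1, l2) are illustrative assumptions.

    # Minimal sketch of mask-selective regularization for an RBM (assumptions noted above).
    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def build_mask(data, n_hidden, threshold=0.1):
        """Binary mask (n_visible x n_hidden) from a correlation-based surrogate
        of the mutual information between visible units (illustrative choice)."""
        corr = np.abs(np.corrcoef(data, rowvar=False))      # (V, V) pairwise similarity
        centers = rng.choice(data.shape[1], size=n_hidden)  # one 'center' input per hidden unit
        return (corr[:, centers] > threshold).astype(float) # keep connections to related inputs

    def cd1_step(W, bv, bh, v0, mask, lr=0.05, l1=1e-4, l2=1e-4):
        """One CD-1 update with selective regularization:
        L2 decay where mask == 1, L1 shrinkage where mask == 0."""
        ph0 = sigmoid(v0 @ W + bh)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)
        pv1 = sigmoid(h0 @ W.T + bv)
        ph1 = sigmoid(pv1 @ W + bh)

        grad = (v0.T @ ph0 - pv1.T @ ph1) / v0.shape[0]
        reg = l2 * W * mask + l1 * np.sign(W) * (1.0 - mask)
        W += lr * (grad - reg)
        bv += lr * (v0 - pv1).mean(axis=0)
        bh += lr * (ph0 - ph1).mean(axis=0)
        return W, bv, bh

    # Toy usage: 100 binary samples, 20 visible units, 8 hidden units.
    data = (rng.random((100, 20)) > 0.5).astype(float)
    W = 0.01 * rng.standard_normal((20, 8))
    bv, bh = np.zeros(20), np.zeros(8)
    mask = build_mask(data, n_hidden=8)
    for _ in range(10):
        W, bv, bh = cd1_step(W, bv, bh, data, mask)

In this reading, the mask encodes the input-space topology (which visible units are informative for a given hidden unit), while the split between L1 and L2 terms promotes sparseness outside the mask and controlled weight growth inside it; the paper's adaptive selection of the best-fitting mask set during learning is not reproduced here.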