“Deep Variational Transfer: Transfer Learning through Semi-supervised Deep Generative Models”

Abstract: “In real-world applications, obtaining labeled examples is expensive and time-consuming. In such cases, knowledge transfer from related domains, where labels are abundant, can greatly reduce the need for extensive labeling efforts. This is the scenario where transfer and multi-task learning come into play.

In this paper, we propose Deep Variational Transfer (DVT), a variational autoencoder that transfers knowledge across domains using a shared latent Gaussian mixture model. More specifically, we align all labeled examples of the same class to the same Gaussian mixture component, independently of their domain. Meanwhile, we exploit semi-supervised learning to predict the classes of unlabeled examples across domains. We show that DVT can tackle all major challenges posed by transfer learning: different feature spaces, different data distributions, different output spaces, different and highly imbalanced class distributions, the absence of correspondences, and the presence of irrelevant source domains.

We test DVT on image and astronomical datasets. We perform both a quantitative evaluation of the model's discriminative ability and a qualitative exploration of its generative capacity. DVT significantly outperforms current methods.”
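The shared-mixture idea in the abstract can be made concrete. Below is a minimal PyTorch sketch, not the authors' implementation: a VAE whose prior is a Gaussian mixture with one unit-variance component per class, shared across domains, where the KL term pulls labeled examples from any domain onto their class component. All names and sizes here (DVTSketch, DIM_X, prior_means, labeled_loss) are illustrative assumptions.

```python
# Minimal sketch (assumed, not the authors' code) of a VAE with a
# class-indexed Gaussian mixture prior shared across domains.
import torch
import torch.nn as nn
import torch.nn.functional as F

DIM_X, DIM_Z, N_CLASSES = 784, 16, 10  # assumed sizes

class DVTSketch(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(DIM_X, 256), nn.ReLU())
        self.mu = nn.Linear(256, DIM_Z)
        self.logvar = nn.Linear(256, DIM_Z)
        self.dec = nn.Sequential(nn.Linear(DIM_Z, 256), nn.ReLU(),
                                 nn.Linear(256, DIM_X))
        # One learnable prior mean per class; unit-variance components.
        self.prior_means = nn.Parameter(torch.randn(N_CLASSES, DIM_Z))

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterization trick: z = mu + sigma * eps
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()
        return self.dec(z), mu, logvar, z

def labeled_loss(model, x, y):
    """ELBO for labeled examples: reconstruction + KL to the class component."""
    x_hat, mu, logvar, _ = model(x)
    recon = F.mse_loss(x_hat, x, reduction="sum")
    # KL( N(mu, diag(sigma^2)) || N(prior_means[y], I) ): this term aligns
    # same-class examples, whatever their domain, onto the same component.
    m = model.prior_means[y]
    kl = 0.5 * (logvar.exp() + (mu - m) ** 2 - 1.0 - logvar).sum()
    return recon + kl

# Usage on a toy batch:
model = DVTSketch()
x = torch.rand(8, DIM_X)
y = torch.randint(0, N_CLASSES, (8,))
loss = labeled_loss(model, x, y)
loss.backward()
```

For unlabeled examples, the semi-supervised part described in the abstract would presumably replace the single KL term with a responsibility-weighted sum over all components, so class predictions fall out as the highest-responsibility component; that machinery is omitted here for brevity.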

 

Date: Jun 14, 2018 at 16:00 h
Venue: Sala de Seminarios John Von Neumann, CMM, located at Beauchef 851, north tower, 7th floor.
Speaker: Prof. Pavlos Protopapas
Affiliation: Institute for Applied Computational Science (IACS), Harvard University
Coordinator: Francisco Forster
