25 January - 31 January


Thursday, 28 January
Time: 12:15 - 13:30
Location: Room B107, Building B, Université de Villetaneuse
Title: Non-negative matrix factorization for transfer learning
Speaker: Ievgen Redko

Description: The ability of a human being to extrapolate previously gained knowledge to other domains inspired a new family of machine learning methods called transfer learning. Transfer learning is often based on the assumption that objects in the target and source domains share some common feature and/or data space. If this assumption is false, most transfer learning algorithms are likely to fail. In this work, we propose to investigate the problem of transfer learning from both theoretical and applied points of view.

First, we introduce a theoretical framework based on Hilbert-Schmidt embeddings that allows us to improve on the current state-of-the-art theoretical results for transfer learning by introducing a natural and intuitive distance measure with strong computational guarantees for its estimation. The proposed results combine the tightness of data-dependent bounds derived from Rademacher learning theory with the efficient estimation of their key factors.
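
For concreteness, the canonical distance between Hilbert-Schmidt (kernel mean) embeddings of two distributions is the maximum mean discrepancy (MMD), whose empirical estimate is cheap to compute from kernel matrices. Below is a minimal sketch of the biased estimator; the Gaussian kernel and the bandwidth sigma are illustrative assumptions, and the exact measure proposed in the talk may differ.

import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # Pairwise Gaussian kernel k(a, b) = exp(-||a - b||^2 / (2 sigma^2)).
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2 * A @ B.T
    return np.exp(-sq / (2 * sigma**2))

def mmd2(X, Y, sigma=1.0):
    # Biased empirical estimate of the squared distance between the
    # kernel mean embeddings of the samples X and Y.
    return (gaussian_kernel(X, X, sigma).mean()
            + gaussian_kernel(Y, Y, sigma).mean()
            - 2 * gaussian_kernel(X, Y, sigma).mean())

# Example: two shifted Gaussian samples standing in for source/target data.
rng = np.random.default_rng(0)
X_src = rng.normal(0.0, 1.0, size=(200, 5))
X_tgt = rng.normal(0.5, 1.0, size=(200, 5))
print(mmd2(X_src, X_tgt))  # grows as the two distributions drift apart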

We also present two methods for unsupervised transfer learning based on non-negative matrix factorization techniques. The first is a linear approach that aims to discover an embedding for the two tasks that decreases the distance between the corresponding probability distributions while preserving non-negativity. The second uses an iterative optimization procedure that aligns the kernel matrices computed from the data of the two tasks.
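
To make the linear approach concrete, here is a minimal sketch assuming the simplest possible setting: the source and target data share a feature space, are stacked into one matrix, and are factored with standard Lee-Seung multiplicative updates so that both tasks share one non-negative basis. The distribution-distance term that the actual method minimizes is omitted; only plain NMF is shown.

import numpy as np

def nmf(V, rank, n_iter=200, eps=1e-9):
    # Plain NMF by multiplicative updates, minimizing ||V - W @ H||_F^2
    # while keeping W and H entrywise non-negative.
    rng = np.random.default_rng(0)
    W = rng.random((V.shape[0], rank))
    H = rng.random((rank, V.shape[1]))
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

rng = np.random.default_rng(1)
V_src = rng.random((50, 30))   # non-negative source-task data
V_tgt = rng.random((40, 30))   # non-negative target-task data
V = np.vstack([V_src, V_tgt])  # stacking assumes a shared feature space
W, H = nmf(V, rank=5)
W_src, W_tgt = W[:50], W[50:]  # per-task non-negative embeddings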
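For the second method, the criterion being optimized is presumably related to kernel alignment; a standard centered version of that quantity is sketched below for two same-size kernel matrices. The talk's iterative procedure is not reproduced here; this only shows how alignment between two task kernels can be measured.

import numpy as np

def center(K):
    # Center a kernel matrix: H @ K @ H with H = I - (1/n) * ones((n, n)).
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def kernel_alignment(K1, K2):
    # Centered kernel alignment <K1c, K2c>_F / (||K1c||_F * ||K2c||_F),
    # in [0, 1] for positive semi-definite kernels; 1 = perfectly aligned.
    K1c, K2c = center(K1), center(K2)
    return np.sum(K1c * K2c) / (np.linalg.norm(K1c) * np.linalg.norm(K2c))

rng = np.random.default_rng(2)
X1 = rng.random((60, 8))
X2 = X1 + 0.1 * rng.random((60, 8))  # slightly perturbed copy of X1
K1, K2 = X1 @ X1.T, X2 @ X2.T        # linear kernels for each task
print(kernel_alignment(K1, K2))      # close to 1 for well-aligned kernels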