Label Efficient Learning of Transferable Representations across Domains and Tasks

Zelun Luo1
Yuliang Zou2
Judy Hoffman3
Li Fei-Fei1
Stanford University1, Virginia Tech2, University of California, Berkeley3
Conference on Neural Information Processing Systems (NIPS 2017)



We propose a framework that learns a representation transferable across different domains and tasks in a data-efficient manner. Our approach battles domain shift with a domain adversarial loss, and generalizes the embedding to novel tasks using a metric learning-based approach. Our model is simultaneously optimized on labeled source data and unlabeled or sparsely labeled data in the target domain. Our method shows compelling results on novel classes within a new domain even when only a few labeled examples per class are available, outperforming the prevalent fine-tuning approach. In addition, we demonstrate the effectiveness of our framework on the transfer learning task from image object recognition to video action recognition.
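The abstract describes a joint objective with three parts: a supervised loss on labeled source data, a domain adversarial loss that aligns source and target features, and a metric-based loss that transfers the embedding to sparsely labeled target classes. The sketch below is an illustrative NumPy mock-up of how such a combined objective could be assembled; all tensors, weights, and the loss weighting are hypothetical stand-ins, not the paper's actual architecture or hyperparameters.

```python
import numpy as np

def softmax(z):
    # numerically stable row-wise softmax
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(logits, labels):
    p = softmax(logits)
    return -np.mean(np.log(p[np.arange(len(labels)), labels] + 1e-12))

rng = np.random.default_rng(0)
# hypothetical embeddings and logits from a shared feature extractor
src_feat = rng.normal(size=(8, 16))       # labeled source embeddings
tgt_feat = rng.normal(size=(8, 16))       # mostly unlabeled target embeddings
src_logits = rng.normal(size=(8, 10))
src_labels = rng.integers(0, 10, size=8)

# (1) supervised classification loss on labeled source data
cls_loss = cross_entropy(src_logits, src_labels)

# (2) domain adversarial loss: a domain discriminator tries to tell
# source (label 0) from target (label 1); in training, the feature
# extractor would receive reversed gradients to fool it
W_dom = rng.normal(size=(16, 2))          # toy discriminator weights
dom_logits = np.vstack([src_feat, tgt_feat]) @ W_dom
dom_labels = np.array([0] * 8 + [1] * 8)
dom_loss = cross_entropy(dom_logits, dom_labels)

# (3) metric-based loss on the few labeled target examples: pull each
# toward the mean source embedding (prototype) of its class
few_feat = tgt_feat[:4]                   # pretend 4 target examples are labeled
few_labels = src_labels[:4]
protos = np.stack([
    src_feat[src_labels == c].mean(axis=0)
    if (src_labels == c).any() else np.zeros(16)
    for c in range(10)
])
metric_loss = np.mean(np.sum((few_feat - protos[few_labels]) ** 2, axis=1))

# combined objective (loss weights are illustrative)
total_loss = cls_loss + 0.1 * dom_loss + 0.1 * metric_loss
print(total_loss)
```

In the actual method the three terms would be minimized jointly by backpropagation through a shared network, with the adversarial term optimized in a min-max fashion; this snippet only shows how the terms compose into one scalar objective.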


Paper

Label Efficient Learning of Transferable Representations across Domains and Tasks

Zelun Luo, Yuliang Zou, Judy Hoffman, Li Fei-Fei

In NIPS 2017

[pdf]
[bibtex]
[slide]
[code]


Experiment 1: SVHN -> MNIST



Experiment 2: ImageNet -> UCF101



Ablation: Unsupervised domain adaptation