Generalization Bounds for Deep Transfer Learning Using Majority Predictor Accuracy

Published in International Symposium on Information Theory and Its Applications (ISITA), Tsukuba, Japan, 2022

Recommended citation: Cuong N. Nguyen, Lam Si Tung Ho, Vu Dinh, Tal Hassner, and Cuong V. Nguyen. Generalization Bounds for Deep Transfer Learning Using Majority Predictor Accuracy. International Symposium on Information Theory and Its Applications (ISITA), Tsukuba, Japan, 2022.

Abstract

We analyze new generalization bounds for deep learning models trained by transfer learning from a source to a target task. Our bounds utilize a quantity called the majority predictor accuracy, which can be computed efficiently from data. We show that our theory is useful in practice since it implies that the majority predictor accuracy can be used as a transferability measure, a fact that is also validated by our experiments.
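To make the notion of the majority predictor accuracy (MPA) concrete, here is a minimal Python sketch, not taken from the paper, under the assumption that the majority predictor maps each source label to the most frequent target label among examples sharing that source label, and that MPA is the empirical accuracy of this predictor on the data. The function name `majority_predictor_accuracy` and the use of source-model predictions as source labels are illustrative assumptions.

```python
import numpy as np

def majority_predictor_accuracy(source_labels, target_labels):
    """Empirical majority predictor accuracy (MPA) -- illustrative sketch.

    For each source label z, the majority predictor outputs the most
    frequent target label among examples whose source label is z.
    MPA is the accuracy of this predictor on the same data.
    """
    source_labels = np.asarray(source_labels)
    target_labels = np.asarray(target_labels)
    correct = 0
    for z in np.unique(source_labels):
        # Target labels of all examples assigned source label z.
        y_given_z = target_labels[source_labels == z]
        # The majority predictor is correct exactly when the target label
        # equals the most common target label within this group.
        _, counts = np.unique(y_given_z, return_counts=True)
        correct += counts.max()
    return correct / len(target_labels)

# Example (hypothetical data): source labels could be a source model's
# predictions on target-task inputs, paired with true target labels.
z = [0, 0, 0, 1, 1, 2]
y = [1, 1, 0, 2, 2, 2]
print(majority_predictor_accuracy(z, y))  # 5/6 ~ 0.833
```

Since this quantity only requires counting label co-occurrences, it can be computed in a single pass over the data, which is consistent with the abstract's claim that it is efficient to compute and usable as a transferability measure.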

arXiv preprint

BibTeX