Batch Normalization Embeddings for Deep Domain Generalization

Authors: Mattia Segù, Alessio Tonioni, and Federico Tombari
Published at CVPR 2021; arXiv preprint available since 2020

Abstract

Domain generalization aims at training machine learning models to perform robustly across different and unseen domains. Several recent methods use multiple datasets to train models to extract domain-invariant features, hoping to generalize to unseen domains. Instead, we first explicitly train domain-dependent representations by using ad-hoc batch normalization layers to collect independent statistics for each domain. Then, we propose to use these statistics to map domains in a shared latent space, where membership to a domain can be measured by means of a distance function. At test time, we project samples from an unknown domain into the same space and infer properties of their domain as a linear combination of the known ones. We apply the same mapping strategy at training and test time, learning both a latent representation and a powerful but lightweight ensemble model. We show a significant increase in classification accuracy over current state-of-the-art techniques on popular domain generalization benchmarks: PACS, Office-31 and Office-Caltech.
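The core idea in the abstract can be sketched in a few lines: keep separate normalization statistics per source domain, treat the concatenated (mean, std) statistics as that domain's embedding in a shared latent space, and, for a test batch from an unseen domain, weight the known domains by their distance to the batch's own statistics. The class and method names below are hypothetical illustrations, not the authors' code, and a simple running-average update with Euclidean distance stands in for the paper's actual layer-wise formulation.

```python
import numpy as np

class DomainBNEmbedding:
    """Toy sketch: per-domain batch-norm statistics as domain embeddings."""

    def __init__(self, num_features, domains, momentum=0.1):
        self.momentum = momentum
        # one (running mean, running var) accumulator per source domain
        self.stats = {d: {"mean": np.zeros(num_features),
                          "var": np.ones(num_features)} for d in domains}

    def update(self, domain, batch):
        # batch: (N, num_features); standard running-statistics update
        m, v = batch.mean(axis=0), batch.var(axis=0)
        s = self.stats[domain]
        s["mean"] = (1 - self.momentum) * s["mean"] + self.momentum * m
        s["var"] = (1 - self.momentum) * s["var"] + self.momentum * v

    @staticmethod
    def embed(mean, var):
        # a domain's latent embedding: its concatenated (mean, std) statistics
        return np.concatenate([mean, np.sqrt(var)])

    def domain_weights(self, batch):
        # project the test batch into the same space via its own statistics,
        # then score membership to each known domain by Euclidean distance
        e = self.embed(batch.mean(axis=0), batch.var(axis=0))
        names = list(self.stats)
        dists = np.array([np.linalg.norm(e - self.embed(s["mean"], s["var"]))
                          for s in self.stats.values()])
        # softmax over negative distances -> convex combination weights
        z = np.exp(-dists)
        return dict(zip(names, z / z.sum()))
```

A model ensemble would then combine per-domain predictions using these weights; batches whose statistics resemble a known domain receive most of that domain's weight.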

Paper

BibTeX

@article{segu2020batch,
  title={Batch Normalization Embeddings for Deep Domain Generalization},
  author={Seg{\`u}, Mattia and Tonioni, Alessio and Tombari, Federico},
  journal={arXiv preprint arXiv:2011.12672},
  year={2020}
}