“The Many Faces of Robustness: A Critical Analysis of Out-Of-Distribution Generalization”, Dan Hendrycks, Steven Basart, Norman Mu, Saurav Kadavath, Frank Wang, Evan Dorundo, Rahul Desai, Tyler Zhu, Samyak Parajuli, Mike Guo, Dawn Song, Jacob Steinhardt, Justin Gilmer; 2020-06-29:

We introduce four new real-world distribution shift datasets, covering changes in image style (ImageNet-Renditions, ImageNet-R), image blurriness (Real Blurry Images), geographic location (Street View StoreFronts, SVSF), camera operation (DeepFashion Remixed, DFR), and more.

With our new datasets, we take stock of previously proposed methods for improving out-of-distribution robustness and put them to the test.

We find that using larger models and artificial data augmentations can improve robustness on real-world distribution shifts, contrary to claims in prior work. We also find that improvements on artificial robustness benchmarks can transfer to real-world distribution shifts, again contrary to prior claims.

Motivated by our observation that data augmentations can help with real-world distribution shifts, we also introduce a new data augmentation method which advances the state of the art and outperforms models pretrained with 1,000× more labeled data. Overall, we find that some methods consistently help with distribution shifts in texture and local image statistics, but offer no help with other distribution shifts, such as changes in geographic location.
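To make the augmentation claim concrete, the following is a minimal illustrative sketch of stochastic augmentation mixing in the spirit of AugMix-style pipelines: several randomly composed image operations are blended with Dirichlet weights, then interpolated with the clean image. This is NOT the paper's new augmentation method; every operation and parameter here (`posterize`, `invert`, `jitter`, `k`, `alpha`) is a toy assumption chosen for self-containedness.

```python
import numpy as np

rng = np.random.default_rng(0)

def posterize(img, bits=4):
    # Drop the low-order bits of each channel (reduces color depth).
    shift = 8 - bits
    return (img >> shift) << shift

def invert(img):
    # Flip pixel intensities.
    return 255 - img

def jitter(img, scale=16):
    # Add bounded uniform integer noise.
    noise = rng.integers(-scale, scale + 1, size=img.shape)
    return np.clip(img.astype(int) + noise, 0, 255).astype(np.uint8)

OPS = [posterize, invert, jitter]  # toy operation pool

def augment_mix(img, k=3, alpha=1.0):
    """Blend k randomly augmented copies of img with Dirichlet weights,
    then interpolate the blend with the original image."""
    weights = rng.dirichlet([alpha] * k)
    mix = np.zeros(img.shape, dtype=float)
    for w in weights:
        op = OPS[rng.integers(len(OPS))]
        mix += w * op(img).astype(float)
    m = rng.beta(alpha, alpha)  # interpolation weight toward the mixture
    out = (1 - m) * img.astype(float) + m * mix
    return np.clip(out, 0, 255).astype(np.uint8)

# Apply to a random 32x32 RGB image.
img = rng.integers(0, 256, size=(32, 32, 3), dtype=np.uint8)
aug = augment_mix(img)
```

The mixed output stays a valid `uint8` image of the same shape, so such augmentations can be dropped into a standard training loop without changing the data format.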

Since no evaluated method consistently improves robustness, our results indicate that future research must study multiple distribution shifts simultaneously.