Many machine learning methods assume that the training and test data follow the same distribution. However, this assumption is often violated in real-world applications. In particular, a change in the marginal distribution of the inputs, known as covariate shift, is one of the most important research topics in machine learning. We show that a well-known family of covariate shift adaptation methods can be unified within the framework of information geometry. Furthermore, we show that the parameter search for a geometrically generalized covariate shift adaptation method can be performed efficiently. Numerical experiments show that our generalization achieves better performance than the existing methods it encompasses.
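To fix ideas, the sketch below illustrates the classic importance-weighting scheme that this family of adaptation methods builds on: training losses are reweighted by the density ratio p_test(x)/p_train(x) raised to a flattening exponent lambda, which interpolates between ordinary empirical risk minimization (lambda = 0) and fully importance-weighted estimation (lambda = 1). This is only a generic illustration under assumed Gaussian training and test input distributions with a known density ratio, not the paper's geometrically generalized method; the function names and the choice of weighted least squares are hypothetical.

```python
# Minimal sketch of importance-weighted least squares under covariate shift.
# Assumptions (for illustration only): Gaussian train/test inputs, the true
# density ratio is available in closed form, and a linear model is fit by
# weighted least squares with weights w(x)**lam for a flattening exponent lam.
import numpy as np

rng = np.random.default_rng(0)

# Training and test input distributions differ (covariate shift).
x_train = rng.normal(loc=0.0, scale=1.0, size=200)
x_test = rng.normal(loc=1.0, scale=0.5, size=200)

def f_true(x):
    return np.sinc(x)

y_train = f_true(x_train) + 0.1 * rng.normal(size=x_train.shape)

def density_ratio(x):
    # True ratio p_test(x) / p_train(x) for the Gaussians above.
    # In practice this must be estimated from samples.
    p_te = np.exp(-0.5 * ((x - 1.0) / 0.5) ** 2) / (0.5 * np.sqrt(2 * np.pi))
    p_tr = np.exp(-0.5 * x ** 2) / np.sqrt(2 * np.pi)
    return p_te / p_tr

def weighted_linear_fit(x, y, w):
    # Solve min_theta sum_i w_i * (y_i - [x_i, 1] @ theta)^2 in closed form.
    X = np.vander(x, 2)          # columns: [x, 1]
    W = np.diag(w)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

for lam in (0.0, 0.5, 1.0):      # flattening exponent
    w = density_ratio(x_train) ** lam
    theta = weighted_linear_fit(x_train, y_train, w)
    y_pred = np.vander(x_test, 2) @ theta
    test_mse = np.mean((y_pred - f_true(x_test)) ** 2)
    print(f"lambda = {lam:.1f}: test MSE = {test_mse:.4f}")
```

Intermediate values of the exponent typically trade off the bias of unweighted estimation against the variance of full importance weighting, which is the kind of parameter search the abstract refers to.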
