Abstract
In binary classification, there are situations where only biased observations are available for one of the classes. In this letter, we propose a new method for the positive and biased negative (PbN) classification problem, a weakly supervised learning setting in which a binary classifier is learned from positive data and negative data with biased observations. Our method incorporates a mechanism to correct the harmful influence of skewed confidence, where the confidence is represented by the posterior probability that an observed example is positive. This correction reduces the distortion of the posterior probability that the data are labeled, a quantity required for empirical risk minimization in the PbN classification problem. We verify the effectiveness of the proposed method through experiments on synthetic and benchmark data.
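As a rough illustration of the risk-minimization viewpoint summarized above, the display below sketches a generic weighted empirical risk; the classifier $g$, loss $\ell$, weight function $w$, and samples $(x_i, y_i)$ are illustrative placeholders rather than the notation or estimator used in this letter. In such a form, a correction of the posterior probability that a datum is labeled would enter through the weights $w(x_i)$.
\[
\hat{R}(g) \;=\; \frac{1}{n} \sum_{i=1}^{n} w(x_i)\, \ell\bigl(y_i\, g(x_i)\bigr),
\qquad
\hat{g} \;=\; \operatorname*{arg\,min}_{g \in \mathcal{G}} \hat{R}(g).
\]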