This letter considers Bayesian binary classification where the data consist of multiple time series (panel data) with binary class labels (binary choice). The observed data can be represented as {y_it, x_it}, t = 1, …, T, i = 1, …, n, where y_it ∈ {0, 1} is the binary choice and x_it collects the exogenous variables. We consider prediction of y_it by its own lags as well as by the exogenous components. The prediction is based on a Bayesian treatment using a Gibbs posterior constructed directly from the empirical classification error. This approach is therefore less sensitive to misspecification of the probability model than the usual likelihood-based posterior, as confirmed by Monte Carlo simulations. We also study the effects of various choices of n and T both numerically (by simulations) and theoretically (by considering two alternative asymptotic regimes: large n and large T). We find that increasing T reduces the prediction error more effectively than increasing n. We also illustrate the method in a real-data application to brand choice in yogurt purchases.
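The abstract does not give the letter's exact construction, but the general recipe it names, a Gibbs posterior proportional to exp(−λ · N · empirical error) times a prior, can be sketched as follows. This is a minimal illustration, not the authors' method: the simulated panel, the linear score in the lagged y_it and x_it, the N(0, I) prior, the learning rate λ, and the random-walk Metropolis sampler are all assumptions introduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulated panel: n units, T periods, one exogenous regressor.
# y_it is generated from its own lag and x_it via a linear score (assumption).
n, T = 50, 20
x = rng.normal(size=(n, T))
theta_true = np.array([1.0, -0.5])          # (coef on lagged y, coef on x)
y = np.zeros((n, T), dtype=int)
for t in range(1, T):
    score = theta_true[0] * y[:, t - 1] + theta_true[1] * x[:, t]
    y[:, t] = (score + rng.logistic(size=n) > 0).astype(int)

def empirical_error(theta):
    """Fraction of misclassified y_it under the sign rule on the linear score."""
    score = theta[0] * y[:, :-1] + theta[1] * x[:, 1:]
    pred = (score > 0).astype(int)
    return np.mean(pred != y[:, 1:])

# Gibbs posterior: pi(theta | data) ∝ exp(-lam * N * R_N(theta)) * prior(theta),
# where R_N is the empirical error above and lam is a user-chosen learning rate.
lam, N = 1.0, n * (T - 1)
def log_gibbs(theta):
    return -lam * N * empirical_error(theta) - 0.5 * theta @ theta  # N(0, I) prior

# Random-walk Metropolis over theta (a generic sampler, assumed here).
theta = np.zeros(2)
draws = []
for _ in range(2000):
    prop = theta + 0.2 * rng.normal(size=2)
    if np.log(rng.uniform()) < log_gibbs(prop) - log_gibbs(theta):
        theta = prop
    draws.append(theta)
draws = np.array(draws)
theta_hat = draws[500:].mean(axis=0)        # posterior-mean point predictor
```

Because the target involves only the 0–1 classification loss, no likelihood for y_it is ever specified, which is the source of the robustness to model misspecification that the abstract emphasizes.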
