Abstract

Learning can be made more efficient if we can actively select particularly salient data points. Within a Bayesian learning framework, objective functions are discussed that measure the expected informativeness of candidate measurements. Three alternative specifications of what we want to gain information about lead to three different criteria for data selection. All these criteria depend on the assumption that the hypothesis space is correct, which may prove to be their main weakness.
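The abstract describes criteria that score candidate measurements by their expected informativeness. As an illustrative sketch only (not the paper's own derivation), one common instance of such a criterion is the expected reduction in posterior entropy for a Bayesian linear model with Gaussian prior and noise, where measuring at input x yields an information gain of ½ log(1 + φ(x)ᵀΣφ(x)/σ²); the basis functions, prior covariance, and noise level below are assumed for demonstration:

```python
import numpy as np

def features(x):
    # Illustrative polynomial basis; any feature map could be used.
    return np.array([1.0, x, x**2])

def expected_info_gain(x, Sigma, noise_var):
    # For a Gaussian posterior over weights with covariance Sigma,
    # a noisy measurement at x reduces the parameter entropy by
    # 0.5 * log(1 + phi^T Sigma phi / sigma^2).
    phi = features(x)
    return 0.5 * np.log(1.0 + phi @ Sigma @ phi / noise_var)

# Assumed prior covariance and noise variance for the sketch.
Sigma = np.eye(3)
noise_var = 0.1

# Score a grid of candidate measurement locations and pick the
# most informative one (here, the extremes of the interval win
# because the basis grows with |x|).
candidates = np.linspace(-2.0, 2.0, 41)
gains = [expected_info_gain(x, Sigma, noise_var) for x in candidates]
best = candidates[int(np.argmax(gains))]
```

This corresponds to selecting data where the model is most uncertain about its parameters; the paper's three criteria differ in *what* the information gain is measured about.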
Author notes

*Present address: Cavendish Laboratory, Madingley Road, Cambridge, CB3 0HE, United Kingdom.