Kernel machines traditionally arise from an elegant formulation in which the smoothness of the admissible solutions is measured by the norm of the reproducing kernel Hilbert space (RKHS) generated by the chosen kernel. It has been pointed out that they can also be formulated in a related functional framework, in which the kernel is interpreted as the Green's function of a suitable differential operator. In this letter, we give our own picture of this intriguing connection, emphasizing some relevant distinctions between these two ways of measuring the smoothness of admissible solutions. In particular, we show that for some kernels there is no associated differential operator. We place special emphasis on the crucial role of boundary conditions, which are in fact the truly distinguishing feature of the approach based on differential operators. We provide a general solution to the problem of learning from both data and boundary conditions, and we illustrate the significant role played by boundary conditions with examples. It turns out that the degree of freedom left open in the traditional formulation of kernel machines is indeed a limitation, one that is partly overcome by incorporating boundary conditions. This is likely to hold true in many real-world applications in which prior knowledge is available about the expected behavior of classifiers and regressors on the boundary.
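
A minimal sketch of the standard connection may help fix ideas; this is textbook background rather than this letter's specific construction. In the RKHS view one solves

\[
\min_{f \in \mathcal{H}_k} \; \sum_{i=1}^{n} V\bigl(y_i, f(x_i)\bigr) + \lambda \, \|f\|_{\mathcal{H}_k}^2 ,
\]

while in the differential-operator view the regularizer is written as \( \|Pf\|^2 = \langle f, P^* P f \rangle \) for a suitable (possibly pseudo-) differential operator \(P\), and the kernel is recovered as the Green's function of \(P^* P\):

\[
P^* P \, k(x, x') = \delta(x - x') .
\]

For instance, on the whole real line the Sobolev-type penalty \( \|Pf\|^2 = \int_{\mathbb{R}} \bigl( f(x)^2 + f'(x)^2 \bigr)\, dx \) corresponds to \( P^* P = I - d^2/dx^2 \), whose Green's function is the exponential kernel \( k(x, x') = \tfrac{1}{2} e^{-|x - x'|} \). On a bounded interval, the same operator yields a different kernel depending on the boundary conditions imposed, which is precisely the distinction at issue here.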
