Solving the eigenvalue problem for differential operators is a common task in many scientific fields. Classical numerical methods rely on intricate domain discretization and yield nonanalytic or nonsmooth approximations. We introduce a novel neural network–based solver for the eigenvalue problem of self-adjoint differential operators, where the eigenpairs are learned in an unsupervised, end-to-end fashion. We propose several training procedures for solving increasingly challenging tasks toward the general eigenvalue problem. The proposed solver is capable of finding the M smallest eigenpairs of a general differential operator. We demonstrate the method on the Laplacian operator, which is of particular interest in image processing, computer vision, and shape analysis, among many other applications. In addition, we solve the Legendre differential equation. Our proposed method solves several eigenpairs simultaneously and can easily be applied to free-form domains; we exemplify it on L-shaped and circular-cut domains. A significant contribution of this work is an analysis of the numerical error of the method. In particular, an upper bound on the (unknown) solution error is given in terms of the (measured) truncation error of the partial differential equation and the network structure.
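To make the unsupervised setting concrete, the following is a minimal sketch of the kind of residual-based loss such a solver could minimize for the 1D Laplacian on (0, 1) with Dirichlet boundary conditions, whose first eigenpair is u(x) = sin(πx), λ = π². The use of a mean-squared PDE residual with a Rayleigh-quotient eigenvalue estimate is an illustrative assumption, not the paper's exact training objective, and the candidate functions stand in for a neural network ansatz.

```python
import numpy as np

def residual_loss(u, x, h=1e-4):
    """Mean-squared PDE residual ||u'' + lam * u||^2 on collocation points x.

    lam is estimated from u by the Rayleigh quotient (an illustrative choice;
    the paper's actual loss and training procedure may differ).
    """
    # Central finite-difference approximation of the 1D Laplacian u''.
    u_xx = (u(x + h) - 2.0 * u(x) + u(x - h)) / h**2
    # Rayleigh-quotient estimate of the eigenvalue: lam = -<u'', u> / <u, u>.
    lam = -np.mean(u_xx * u(x)) / np.mean(u(x) ** 2)
    return np.mean((u_xx + lam * u(x)) ** 2), lam

# Interior collocation points (boundary conditions handled separately in practice).
x = np.linspace(0.1, 0.9, 64)

# The true eigenfunction drives the residual to (numerically) zero ...
loss_true, lam_true = residual_loss(lambda t: np.sin(np.pi * t), x)
# ... while an arbitrary smooth function does not.
loss_bad, _ = residual_loss(lambda t: t * (1.0 - t) ** 2, x)
```

In the actual method, `u` would be a trainable network and this loss would be minimized by gradient descent over collocation points, with extra terms to enforce boundary conditions and to separate the M smallest eigenpairs.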
