In this thesis, we reveal that supervised learning and inverse problems share common mathematical foundations. Consequently, we are able to present a unified variational view of these tasks, which we formulate as optimization problems posed over infinite-dimensional Banach spaces. Throughout the thesis, we study this class of optimization problems from a mathematical perspective. We start by specifying adequate search spaces and loss functionals derived from applications. Next, we identify conditions that guarantee the existence of a solution, and we provide a (finite) parametric form for the optimal solution. Finally, we utilize these theoretical characterizations to derive numerical solvers.

The thesis is divided into five parts. The first part is devoted to the theory of splines, a large class of continuous-domain models that are optimal in many of the studied frameworks. Our contributions in this part include the introduction of the notion of multi-splines, the study of their theoretical properties, and the construction of shortest-support generators. In the second part, we study a broad class of optimization problems over Banach spaces and prove a general representer theorem that characterizes their solution sets. The third and fourth parts of the thesis demonstrate the applicability of our general framework to supervised learning and inverse problems, respectively. Specifically, we derive various learning schemes from our variational framework that inherit a certain notion of "sparsity", and we establish the connection between our theory and deep neural networks, which are state of the art in supervised learning. Moreover, we deploy our general theory to study continuous-domain inverse problems with multicomponent models, which can be applied to various signal- and image-processing tasks, in particular curve fitting. Finally, we revisit the notions of splines and sparsity in the last part of the thesis, this time from a stochastic perspective.
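For concreteness, the class of problems described above can be sketched in the following generic form; the symbols used here (the search space \(\mathcal{X}\), measurement functionals \(\nu_m\), loss \(E\), regularization operator \(\mathrm{L}\), and norm \(\|\cdot\|_{\mathcal{Y}}\)) are placeholders chosen for illustration rather than the thesis' exact notation:

\[
\min_{f \in \mathcal{X}} \; \sum_{m=1}^{M} E\bigl(y_m, \nu_m(f)\bigr) \;+\; \lambda \,\bigl\|\mathrm{L} f\bigr\|_{\mathcal{Y}},
\]

where \(\mathcal{X}\) is an infinite-dimensional Banach space, the \(\nu_m\) are measurement functionals relating the unknown function \(f\) to the data \(y_m\), \(E\) penalizes the data misfit, \(\mathrm{L}\) encodes the regularity prior, and \(\lambda > 0\) balances the two terms. A representer theorem for such a problem states that certain (extreme-point) solutions admit a finite parameterization, schematically \(f = \sum_{k=1}^{K} a_k \varphi_k + p\) with finitely many atoms \(\varphi_k\) and a component \(p\) in the null space of \(\mathrm{L}\); spline-like atoms arise in many of the frameworks studied here.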