|Neural-Network Learning by Optimization on Differentiable Manifolds|
In neural-network learning, an objective function describes the performance of an adaptive system with respect to the solution of a posed scientific problem, while constraints arise from the physical or formal nature of the problem itself. When the constraints are expressed by equalities and form feasible spaces that possess the structure of a smooth manifold, the optimization methods may take advantage of techniques based on differential geometry, with special emphasis on numerical integration methods for solving differential equations over smooth manifolds and Lie groups. A particularly interesting example is the computation of ensemble averages over metrizable curved spaces.

The best-known optimization method in the literature for finding the points of local minimum of a regular function over a Euclidean space is gradient steepest descent, which gives rise to a first-order dynamical system. By utilizing the instruments of differential geometry, it is possible to extend such a method to find the points of local minimum of a regular function over a smooth manifold. Moreover, it has been envisaged that second-order dynamical systems, which retrace the structure of mechanical-type dynamical systems, yield optimization methods with different features with respect to first-order ones over Euclidean spaces. The dynamical systems related to such optimization methods, of the first order as well as of the second order, are represented by differential equations over smooth manifolds and Lie groups, which need to be properly integrated via numerical calculus techniques.
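As an illustration of the first-order case, the following minimal sketch (not drawn from the project itself; the function names and the choice of manifold are assumptions for illustration) implements Riemannian gradient descent on the unit sphere. The feasible space is the equality-constrained set {x : ||x|| = 1}, the Euclidean gradient is projected onto the tangent space, and a normalization step plays the role of a retraction back onto the manifold. Minimizing the Rayleigh quotient f(x) = xᵀAx this way recovers an eigenvector associated with the smallest eigenvalue of A:

```python
import numpy as np

def sphere_gradient_descent(A, x0, step=0.1, iters=500):
    """Riemannian gradient descent on the unit sphere S^{n-1}
    for f(x) = x^T A x (illustrative sketch, not the project's code)."""
    x = x0 / np.linalg.norm(x0)          # project initial point onto the sphere
    for _ in range(iters):
        egrad = 2.0 * A @ x              # Euclidean gradient of f
        rgrad = egrad - (x @ egrad) * x  # projection onto the tangent space at x
        x = x - step * rgrad             # descent step in the ambient space
        x = x / np.linalg.norm(x)        # retraction back onto the manifold
    return x

A = np.diag([3.0, 2.0, 1.0])
x = sphere_gradient_descent(A, np.array([1.0, 1.0, 1.0]))
# x converges (up to sign) to the eigenvector of the smallest eigenvalue of A
```

The normalization used here is the simplest retraction; replacing it with the exact geodesic step, or replacing the first-order flow with a second-order (mechanical-type) dynamical system, yields the manifold-valued differential equations that the project's numerical integration techniques address.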
|Home Page: https://web.dibet.univpm.it/fiori|
|Contact Person: Simone Fiori|
Ongoing University Projects (FinAte), JSPS (Japan Society for the Promotion of Science) Exchange Projects