Global optimization in Hilbert space.

Boris Houska, Benoît Chachuat
Published in: Mathematical Programming (2017)
We propose a complete-search algorithm for solving a class of non-convex, possibly infinite-dimensional, optimization problems to global optimality. We assume that the optimization variables lie in a bounded subset of a Hilbert space, and we determine worst-case run-time bounds for the algorithm under certain regularity conditions on the cost functional and the constraint set. Because these run-time bounds are independent of the number of optimization variables and, in particular, remain valid for optimization problems with infinitely many optimization variables, we prove that the algorithm converges to an ε-suboptimal global solution within finite run-time for any given termination tolerance ε > 0. Finally, we illustrate these results on a problem from the calculus of variations.