« We consider sparse convex optimization problems under general affine constraints, in which variables are coupled through both the cost function and the constraints. We propose a distributed Jacobi algorithm that solves this problem cooperatively. Because each local update is a convex combination of new local solutions and old iterates, the Jacobi algorithm is guaranteed to produce feasible solutions at every iteration. We provide an a posteriori certificate of centralized optimality for the distributed solutions, as well as a priori conditions that guarantee convergence to optimality in several problem settings. The proposed approach is well suited to distributed model predictive control applications, where feasibility is an important requirement. It is particularly effective at distributing the computations in settings with a large number of subsystems, a sparse coupling structure, and local communication. »
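To illustrate the convex-combination update described in the abstract, here is a minimal Python sketch on a toy problem. The problem data, step size, and iteration count are assumptions for demonstration only, not the speaker's actual formulation:

```python
import numpy as np

def soft_threshold(v, lam):
    """Closed-form minimizer of 0.5*(x - v)^2 + lam*|x|."""
    return np.sign(v) * max(abs(v) - lam, 0.0)

def jacobi_step(x, c, lam, b, theta):
    """One Jacobi iteration for the toy problem
         min_x  sum_i [ 0.5*(x_i - c_i)^2 + lam*|x_i| ]   s.t.  sum(x) <= b.
    Each subsystem i solves its local problem with the other variables
    frozen at the current iterate; the new iterate is then a convex
    combination of the old iterate and the local solutions."""
    n = len(x)
    x_hat = np.empty(n)
    for i in range(n):
        # Coupling constraint seen by subsystem i: x_i <= b - sum_{j != i} x_j
        u = b - (x.sum() - x[i])
        x_hat[i] = min(soft_threshold(c[i], lam), u)
    # With theta <= 1/n this is a convex combination of the old iterate and
    # n feasible points (each differing from x in one coordinate), so the
    # new iterate stays feasible for the affine constraint.
    return x + theta * (x_hat - x)

# Toy data (assumptions): three subsystems, starting from a feasible point.
c, lam, b = np.array([2.0, -0.5, 1.5]), 0.6, 1.0
x = np.zeros(3)
for _ in range(200):
    x = jacobi_step(x, c, lam, b, theta=1.0 / len(x))
    assert x.sum() <= b + 1e-9  # feasible at every iteration
print(x)  # sparse (x[1] stays 0) and feasible
```

By construction, feasibility holds at every iteration, while closeness to the centralized optimum is a separate question; that gap is exactly what the a posteriori certificates and a priori convergence conditions mentioned in the abstract address.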
This seminar is a presentation of the Groupe d'études et de recherche en analyse des décisions (GERAD).
Free admission. All are welcome!
Campus de l'Université de Montréal
2920, chemin de la Tour
Montréal QC H3T 1J4
Université de Montréal