The course continues ECE236B and covers several advanced and current topics in optimization, with an emphasis on large-scale algorithms for convex optimization. Topics include first-order methods for large-scale optimization (the gradient and subgradient methods, the conjugate gradient method, the proximal gradient method, and accelerated gradient methods), decomposition and splitting methods (dual decomposition, the augmented Lagrangian method, the alternating direction method of multipliers, and monotone operators and operator splitting), and, possibly, interior-point algorithms for conic optimization.
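As an illustration of the proximal gradient method mentioned above, here is a minimal sketch applied to l1-regularized least squares (the lasso), minimize (1/2)||Ax - b||^2 + lam*||x||_1. The problem instance, function names, and parameter choices are illustrative assumptions, not course material; the step size 1/L uses the Lipschitz constant L = ||A||_2^2 of the smooth part's gradient.

```python
import numpy as np

def soft_threshold(v, t):
    # proximal operator of t * ||.||_1 (applied componentwise)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, step, iters=2000):
    # minimize (1/2)||Ax - b||^2 + lam * ||x||_1
    # by alternating a gradient step on the smooth term
    # with the prox of the nonsmooth term
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                    # gradient of smooth part
        x = soft_threshold(x - step * grad, step * lam)
    return x

# illustrative noiseless instance with a sparse ground truth
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = np.zeros(10)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true

step = 1.0 / np.linalg.norm(A, 2) ** 2              # 1/L, L = ||A||_2^2
x_hat = proximal_gradient(A, b, lam=0.1, step=step)
```

With a fixed step size at most 1/L the objective is nonincreasing, which is why no line search appears in this sketch; accelerated variants covered in the course add a momentum term to the same iteration.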
Subgradients
Newton's method
Conic optimization and interior-point methods
First-order methods
Localization and cutting-plane methods
A collection of exercises from homework assignments and discussions.