Hierarchies, Extended Formulations and Matrix-Analytic Techniques

source: Simons Institute     November 6, 2017
An important development in the area of convex relaxations has been the introduction of systematic ways of strengthening them via lift-and-project techniques. This leads to several hierarchies of LP/SDP relaxations: Lovász-Schrijver, Sherali-Adams, and Sum of Squares (also known as the Lasserre hierarchy). The benefits and limitations of these hierarchies have been studied extensively over the last decade. Recently, strong negative results have been obtained, not only for specific hierarchies but even for the more general notion of extended formulations.
In this workshop we investigate the power and limitations of LP/SDP hierarchies as well as general extended formulations, and their ties to convex algebraic geometry. We also explore tools and concepts from matrix analysis with strong connections to SDP formulations: matrix concentration, matrix multiplicative weight updates, and various notions of matrix rank. Finally, the workshop will cover related areas where these kinds of techniques are employed: sparsification, discrepancy and hyperbolic/real stable polynomials.
For more information, please visit: https://simons.berkeley.edu/workshops...
These presentations were supported in part by an award from the Simons Foundation.

1 1:07:05 Constructing Extended Formulations
2 33:35 Small (Explicit) Extended Formulation for Knapsack Cover Inequalities from Monotone Circuits
3 27:36 Computing the Nucleolus of Weighted Cooperative Matching Games in Polynomial Time
4 29:09 A Lower Bound on the Positive Semidefinite Rank of Convex Bodies
5 31:57 Bounds for Matrix Factorization Ranks via Semidefinite Programming and...
6 29:38 Using Symmetry in Semidefinite Programming
7 30:26 Sparse Polynomial Interpolation: Compressed Sensing, Super-resolution, or Prony?
8 1:02:23 Pseudocalibration and SoS Lower Bounds
9 27:20 Weak Decoupling, Polynomial Folds, and Approximate Optimization over the Sphere
10 30:10 Spectral Aspects of Symmetric Matrix Signings
11 31:52 From Weak to Strong LP Gaps for all CSPs
12 31:55 LP, SOCP, and Optimization-Free Approaches to Polynomial Optimization
13 28:54 Matrix Completion and Sums of Squares
14 33:21 SOS and the Dreaded Bit-complexity
15 29:13 Maximizing Sub-determinants and Connections to Permanents and Inequalities on Stable Polynomials
16 32:20 Spectrahedra and Directional Derivatives of Determinants
17 1:06:43 Counting and Optimization Using Stable Polynomials
18 58:02 Independent Set Polytopes: Query vs. Communication Perspective
19 39:03 Approximating Rectangles by Juntas and Weakly-Exponential Lower Bounds for LP Relaxations of CSPs
20 1:04:06 From Proofs to Algorithms for Machine Learning Problems
21 26:59 Nonnegative Polynomials, Nonconvex Polynomial Optimization, and Applications to Learning
22 39:00 Fast Spectral Algorithms from Sum-of-Squares Analyses
23 30:15 Computing the Independence Polynomial in Shearer's Region for the LLL
24 35:40 Quantum Entanglement, Sum of Squares, and the Log Rank Conjecture