From Weak to Strong LP Gaps for All CSPs
24 Aug, 2017
We study the approximability of constraint satisfaction problems (CSPs) by linear programming (LP) relaxations.
We show that for every CSP, the approximation obtained by the basic LP
relaxation is no weaker than the approximation obtained from relaxations given by $\Omega(\log n/\log \log n)$ levels of the Sherali-Adams hierarchy on instances of size $n$.
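As a rough sketch of what is meant by the basic LP (this is the standard local-distribution formulation; the notation here is ours, not taken from the paper): given an instance with constraints $C$, each applying a predicate $P_C$ to a tuple of variables, the LP optimizes over "pseudo-distributions" $\mu_C$ on local assignments and marginals $\mu_v$ on single variables,

$$
\max \; \sum_{C} \; \mathbb{E}_{\alpha \sim \mu_C}\big[P_C(\alpha)\big]
\quad \text{s.t.} \quad
\sum_{\alpha : \alpha(v) = a} \mu_C(\alpha) \;=\; \mu_v(a)
\;\; \text{for all } C \ni v \text{ and values } a,
$$

with each $\mu_C$ and $\mu_v$ constrained to be a probability distribution. Higher Sherali-Adams levels impose the analogous consistency conditions on joint distributions over larger sets of variables.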
It was proved by Chan et al. [FOCS 2013] (and recently strengthened by Kothari et al.
[STOC 2017]) that for CSPs, any polynomial-size LP extended formulation is no stronger
than the relaxations obtained from a super-constant number of levels of the Sherali-Adams hierarchy.
Combining this with our result implies that any polynomial-size LP extended
formulation is no stronger than simply the ``basic'' LP, which can be thought of as the base
level of the Sherali-Adams hierarchy. This essentially gives a dichotomy result for the approximation of CSPs by polynomial-size LP extended formulations.
Based on joint work with Madhur Tulsiani.