Theory of Computing
-------------------
Title : From Weak to Strong Linear Programming Gaps for All Constraint Satisfaction Problems
Authors : Mrinalkanti Ghosh and Madhur Tulsiani
Volume : 14
Number : 10
Pages : 1-33
URL : https://theoryofcomputing.org/articles/v014a010
Abstract
--------
We study the approximability of constraint satisfaction problems
(CSPs) by linear programming (LP) relaxations. We show that for every
CSP, the approximation obtained by a basic LP relaxation is at least
as strong as the approximation obtained using relaxations given by $c
\cdot \log n/\log \log n$ levels of the Sherali--Adams hierarchy (for
some constant $c > 0$) on instances of size $n$.
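For readers unfamiliar with the basic LP, one standard formulation for
a Max-CSP instance over alphabet $[q]$ is the following (a sketch in
common notation; the symbols $\mu_C$, $\mu_v$, $P_C$, $w_C$ are
illustrative and may differ from the paper's exact formulation). Each
$\mu_C$ is a distribution over local assignments to the scope $V(C)$
of constraint $C$, each $\mu_v$ is a distribution over values for
variable $v$, $P_C(\alpha) \in \{0,1\}$ indicates whether $\alpha$
satisfies $C$, and the marginals are required to agree:

```latex
\begin{align*}
\text{maximize}\quad & \sum_{C} w_C \sum_{\alpha \in [q]^{V(C)}} \mu_C(\alpha)\, P_C(\alpha) \\
\text{subject to}\quad
& \sum_{\alpha \,:\, \alpha(v) = a} \mu_C(\alpha) \;=\; \mu_v(a)
  \qquad \forall\, C,\ v \in V(C),\ a \in [q], \\
& \sum_{a \in [q]} \mu_v(a) \;=\; 1
  \qquad \forall\, v, \\
& \mu_C(\alpha) \;\geq\; 0, \qquad \mu_v(a) \;\geq\; 0.
\end{align*}
```

The only consistency enforced is between each constraint's local
distribution and the single-variable marginals, which is what makes
this the "base level" of the hierarchy.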
It was proved by Chan et al. [FOCS 2013] (and recently
strengthened by Kothari et al. [STOC 2017]) that for CSPs, any
polynomial-size LP extended formulation is at most as strong as the
relaxation obtained by a constant number of levels of the
Sherali--Adams hierarchy (where the number of levels depends on the
exponent of the polynomial in the size bound). Combined with our result, this also
implies that any polynomial-size LP extended formulation is at most as
strong as the _basic_ LP, which can be thought of as the base level of
the Sherali--Adams hierarchy. This essentially gives a dichotomy
result for approximation of CSPs by polynomial-size LP extended
formulations.
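For context, the level-$t$ Sherali--Adams relaxation referred to above
can be described, in one common equivalent form (notation here is
illustrative), as requiring consistent local distributions over every
set of at most $t$ variables rather than only over constraint scopes
and single variables:

```latex
\begin{align*}
& \text{For every } S \subseteq V \text{ with } |S| \le t:\quad
  \mu_S \text{ is a distribution over } [q]^{S}, \\
& \text{Consistency:}\quad
  \mu_T(\beta) \;=\; \sum_{\alpha \in [q]^{S} \,:\, \alpha|_T = \beta} \mu_S(\alpha)
  \qquad \forall\, T \subseteq S,\ \beta \in [q]^{T}.
\end{align*}
```

Increasing $t$ adds variables and constraints, so higher levels give
relaxations that are at least as strong, at the cost of size growing
as $n^{O(t)}$.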
Using our techniques, we also simplify and strengthen the
result by Khot et al. [STOC 2014] on (strong) approximation resistance
for LPs. They provided a necessary and sufficient condition under
which $\littleoh(\log \log n)$ levels of the Sherali--Adams hierarchy
cannot achieve an approximation better than a random assignment. We
simplify their proof and strengthen the bound to
$\littleoh\inparen{\log n/\log \log n}$ levels.
A conference version of this paper appeared in the Proceedings of
the 32nd Computational Complexity Conference (CCC'17).