Revised: May 8, 2018

Published: June 6, 2018

**Keywords:** constraint satisfaction problem, convex programming, linear programming hierarchy, integrality gap

**Categories:** algorithms, approximation algorithms, constraint satisfaction, convex programming, convex optimization, linear programming, linear programming hierarchy, integrality gap, CCC, CCC 2017 special issue

**ACM Classification:** F.2.2, G.1.6

**AMS Classification:** 68Q17, 90C05

**Abstract:**

We study the approximability of constraint satisfaction problems (CSPs) by linear programming (LP) relaxations. We show that for every CSP, the approximation obtained by a basic LP relaxation is at least as strong as the approximation obtained using relaxations given by $c \cdot \log n/\log \log n$ levels of the Sherali--Adams hierarchy (for some constant $c > 0$) on instances of size $n$.

It was proved by Chan et al. [FOCS 2013] (and recently strengthened by Kothari et al. [STOC 2017]) that for CSPs, any polynomial-size LP extended formulation is at most as strong as the relaxation obtained by a constant number of levels of the Sherali--Adams hierarchy (where the number of levels depends on the exponent of the polynomial in the size bound). Combined with our result, this implies that any polynomial-size LP extended formulation is at most as strong as the *basic* LP, which can be thought of as the base level of the Sherali--Adams hierarchy. This essentially gives a dichotomy result for the approximation of CSPs by polynomial-size LP extended formulations.

Using our techniques, we also simplify and strengthen the result by Khot et al. [STOC 2014] on (strong) approximation resistance for LPs. They provided a necessary and sufficient condition under which $o(\log \log n)$ levels of the Sherali--Adams hierarchy cannot achieve an approximation better than a random assignment. We simplify their proof and strengthen the bound to $o\left(\log n/\log \log n\right)$ levels.

A conference version of this paper appeared in the Proceedings of the 32nd Computational Complexity Conference (CCC'17).