
Time-Space Tradeoffs
in Proof Complexity:
Superpolynomial Lower Bounds for
Superlinear Space
Chris Beck
Princeton University
Joint work with
Paul Beame & Russell Impagliazzo
Jakob Nordström & Bangsheng Tang
• The satisfiability problem is a central problem
in computer science, in theory and in practice.
• Terminology:
– A Clause is a boolean formula which is an OR of
variables and negations of variables.
– A CNF formula is an AND of clauses.
• Object of study for this talk:
CNF-SAT: Given a CNF formula, is it satisfiable?
Resolution Proof System
• Lines are clauses, one simple proof step
• Proof is a sequence of clauses each of which is
– an original clause or
– follows from previous ones via resolution step
• a CNF is UNSAT iff can derive empty clause ⊥
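To make the rule concrete, here is a minimal sketch (mine, not from the talk) of a single resolution step, with clauses represented as frozensets of DIMACS-style integer literals, followed by a two-step refutation of a tiny UNSAT CNF:

# Minimal sketch (mine, not from the talk) of the resolution rule, with clauses
# represented as frozensets of DIMACS-style integer literals (v or -v).

def resolve(c1, c2, var):
    """From (A or var) and (B or -var), derive (A or B)."""
    assert var in c1 and -var in c2
    return frozenset((c1 - {var}) | (c2 - {-var}))

# A two-step refutation of the UNSAT CNF {x}, {-x or y}, {-y}:
x, y = 1, 2
c1, c2, c3 = frozenset({x}), frozenset({-x, y}), frozenset({-y})
d = resolve(c2, c3, y)            # resolve on y: gives {-x}
empty = resolve(c1, d, x)         # resolve on x: gives the empty clause
print(d, empty == frozenset())    # frozenset({-1}) True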
Proof DAG
General resolution: Arbitrary DAG
Tree-like resolution: DAG is a tree
SAT Solvers
• Well-known connection between Resolution
and SAT solvers based on Backtracking
• These algorithms are very powerful
– sometimes can quickly handle CNF’s with millions
of variables.
• On UNSAT formulas, computation history
yields a Resolution proof.
– Tree-like Resolution ≈ DPLL algorithm
– General Resolution ≿ DPLL + “Clause Learning”
• Best current SAT solvers use this approach
SAT Solvers
• DPLL algorithm: backtracking search for
satisfying assignment
– Given a formula which is not constant, guess a
value for one of its variables x, and recurse on the
simplified formula.
– If we don’t find an assignment, set x the other
way and recurse again.
– If one of the recursive calls yields a satisfying
assignment, adjoin the good value of x and return;
otherwise report fail.
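A minimal sketch of the DPLL procedure just described (illustrative code, not the talk's; same clause representation as in the earlier sketch):

# Minimal DPLL sketch matching the description above (illustrative, not the
# talk's code).  A CNF is a list of clauses; clauses are frozensets of literals.

def simplify(cnf, lit):
    """Assign lit = True: drop satisfied clauses, delete the falsified literal."""
    return [c - {-lit} for c in cnf if lit not in c]

def dpll(cnf, assignment=()):
    if not cnf:
        return assignment                  # every clause satisfied
    if frozenset() in cnf:
        return None                        # empty clause: conflict, backtrack
    var = abs(next(iter(cnf[0])))          # pick a variable of some clause
    for lit in (var, -var):                # guess one value, then the other
        result = dpll(simplify(cnf, lit), assignment + (lit,))
        if result is not None:
            return result                  # adjoin the good value and return
    return None                            # both branches fail: UNSAT here

print(dpll([frozenset({1, 2}), frozenset({-1, 2}), frozenset({-2})]))  # None (UNSAT)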
SAT Solvers
• DPLL requires very little memory
• Clause learning adds a new clause to the input
CNF every time the search backtracks
– Uses lots of memory to try to beat DPLL.
– In practice, must use heuristics to guess which
clauses are “important” and store only those.
Hard to do well! Memory becomes a bottleneck.
• Question: Is this inherent? Or can the right
heuristics avoid the memory bottleneck?
Proof Complexity & SAT Solvers
• Proof Size ≤ Time for Ideal SAT Solver
• Proof Space ≤ Memory for Ideal SAT Solver
• Many explicit hard UNSAT examples known
with exponential lower bounds for Resolution
Proof Size.
• Question: Is this also true for Proof Space?
Space in Resolution
[Figure: a proof laid out over time; the clauses active at a given time step must all be in memory]
• Clause space ≔ max(# active clauses).

[Esteban, Torán ‘99]
Lower Bounds on Space?
• Considering Space alone, tight lower and upper
bounds of Θ(n), for explicit tautologies of size n.
• Lower Bounds: [ET’99, ABRW’00, T’01, AD’03]
• Upper Bound: All UNSAT formulas on n variables
have a tree-like refutation of space ≤ n + O(1).
[Esteban, Torán ‘99]
• For a tree-like proof, Space is at most the height of
the tree which is the stack height of DPLL search
• But, these tree-like proofs are typically too large
to be practical, so we should instead ask if both
small space and time are simultaneously feasible.
Size-Space Tradeoffs for Resolution
• [Ben-Sasson ‘01] Pebbling formulas with linear
size refutations, but for which all proofs have
Space ∙ log Size = Ω(n/log n).
• [Ben-Sasson, Nordström ‘10] Pebbling formulas
which can be refuted in Size O(n), Space O(n),
but Space O(n/log n) ⟹ Size exp(n^Ω(1)).
But, these are all for Space < n, and SAT solvers
generally can afford to store the input formula in
memory. Can we break the linear space barrier?
Size-Space Tradeoffs
• Informal Question: Can we formally show that
memory rather than time can be a real
bottleneck for resolution proofs and SAT solvers?
• Formal Question (Ben-Sasson):
“Does there exist a constant c such that any CNF with a
refutation of size T also has a refutation of size T^c
in space O(formula size)?”
• Our results: Families of formulas of size n having
refutations in Time, Space ≈ n^k, but all resolution
refutations have T > (n^{0.58 k}/S)^{log log n / log log log n}
Tseitin Tautologies
Given an undirected graph G = (V, E) and a
labelling χ : V → F_2, define a CSP:
Boolean variables: one variable x_e per edge e ∈ E
Parity constraints: ⊕_{e ∋ v} x_e = χ(v) for each vertex v
(linear equations)
When χ has odd total parity, the CSP is UNSAT.
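An illustrative sketch (mine) of this CSP on a tiny graph, checking by brute force that an odd total labelling makes it unsatisfiable; the names (`charge` for χ, etc.) are made up for the example:

# Illustrative sketch (mine) of the Tseitin CSP: one boolean variable per edge,
# one parity constraint per vertex; chi is called `charge` here.
from itertools import product

def tseitin_constraints(vertices, edges, charge):
    """For each vertex v: (indices of incident edges, required parity charge[v])."""
    return [([i for i, e in enumerate(edges) if v in e], charge[v]) for v in vertices]

def satisfiable(num_edges, constraints):
    for x in product([0, 1], repeat=num_edges):          # brute force: tiny graphs only
        if all(sum(x[i] for i in idxs) % 2 == chi for idxs, chi in constraints):
            return True
    return False

# Triangle with total charge 1 (odd): every edge appears in exactly two
# constraints, so the constraints sum to 0 = 1 over F_2 -- unsatisfiable.
V, E, charge = [0, 1, 2], [(0, 1), (1, 2), (0, 2)], {0: 1, 1: 0, 2: 0}
print(satisfiable(len(E), tseitin_constraints(V, E, charge)))   # False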
Tseitin Tautologies
• When χ has odd total parity and G is connected, the
corresponding CNF is called a Tseitin tautology. [Tseitin ‘68]
• Only the total parity of χ matters
• Hard when G is a constant degree expander:
[Urquhart ‘87]: Resolution size = 2^Ω(E)
[Torán ‘99]: Resolution space = Ω(E)
• This work: Tradeoffs on n × l grid, l ≫ n, and
similar graphs, using isoperimetry.
Tseitin formula on Grid
• Consider the Tseitin formula on an n × l grid, n = 4
• How can we build a resolution refutation?
Tseitin formula on Grid
• One idea: Mimic linear algebra refutation
• If we add the lin eqn’s in any order, we get 1 = 0.
• A linear equation on k variables corresponds
to 2^{k−1} clauses. Resolution is implicationally
complete, so can simulate any step of this
with 2^{O(k)} blowup.
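A sketch of the standard clause encoding referred to above: a parity constraint on k variables expands into the 2^(k−1) clauses forbidding each wrong-parity assignment (illustrative code, not from the talk):

# Sketch of the standard encoding: x_1 + ... + x_k = b (mod 2) becomes the
# 2^(k-1) clauses that each forbid one wrong-parity assignment.
from itertools import product

def parity_clauses(variables, b):
    clauses = []
    for bits in product([0, 1], repeat=len(variables)):
        if sum(bits) % 2 != b:                       # a wrong-parity assignment
            # the clause below is falsified exactly by this assignment
            clauses.append(frozenset(-v if bit else v
                                     for v, bit in zip(variables, bits)))
    return clauses

print(len(parity_clauses([1, 2, 3], 1)))             # 4 == 2**(3-1)
print(parity_clauses([1, 2], 0))                     # the two clauses forcing x1 = x2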
Tseitin formula on Grid
• One idea: Mimic linear algebra refutation
• If we add the lin eqn’s in column order:
• Never have an intermediate equation with
more than n variables. Get a proof of Size ≈
(# of columns) ∙ 2^n , Space ≈ 2^n .
Tseitin formula on Grid
• Different idea: Divide and conquer
• In DPLL, repeatedly bisect the graph
• Each time we cut, one of the components is
unsat. Done after ≈ n log l queries, tree-like
proof with Space ≈ n log l, Size 2^{O(n log l)}.
• Savitch-like savings in space, for l = poly(n).
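Filling in the arithmetic (my reading of the slide: each bisection of the long side queries the ≈ n edge variables crossing one column cut, and ≈ log₂ l bisections reach a constant-size piece):

\[
\#\text{queries per branch} \approx n \cdot \log_2 l, \qquad
\text{Space}_{\text{tree-like}} \approx \text{height} \approx n \log l, \qquad
\text{Size} \le 2^{\#\text{queries}} = 2^{O(n \log l)} = l^{O(n)}.
\]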
Tseitin formula on Grid
• “Chris, isn’t this just tree-width?” – Exactly.
• Our work is about whether this can be improved.
• The lower bound shows that a quasipolynomial
blowup in size when the space is below 2^{n/2} is
necessary for the proof systems we studied.
• For technical reasons, work with “doubled” grid.
High Level Overview of Lower Bound
• [Beame Pitassi ’96] simplified previous Proof
Size lower bounds using random restrictions.
• A restriction ρ to a formula φ is a partial
assignment of truth values to variables,
resulting in some simplification. We denote
the restricted formula by φ|ρ .
• If Π is a proof of φ, then Π|ρ is a proof of φ|ρ .
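An illustrative sketch of applying a restriction to a CNF, which is all one needs to see why Π|ρ remains a proof of φ|ρ (satisfied clauses disappear, falsified literals drop out):

# Illustrative sketch of applying a restriction (partial assignment) to a CNF:
# satisfied clauses disappear, falsified literals drop out of their clauses.

def restrict_clause(clause, rho):
    """Return None if rho satisfies the clause, else the simplified clause."""
    out = set()
    for lit in clause:
        v, want = abs(lit), (lit > 0)
        if v in rho:
            if rho[v] == want:
                return None                 # clause satisfied: vanishes from phi|rho
            # otherwise the literal is falsified: just drop it
        else:
            out.add(lit)
    return frozenset(out)

def restrict_cnf(cnf, rho):
    return [c for c in (restrict_clause(cl, rho) for cl in cnf) if c is not None]

phi = [frozenset({1, -2}), frozenset({2, 3}), frozenset({-1, 3})]
print(restrict_cnf(phi, {1: True}))   # first clause vanishes, {-1, 3} shrinks to {3}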
High Level Overview of Lower Bound
• Philosophy of [Beame, Pitassi ’96]
– Find φ such that any refutation of φ contains a
“complex clause”
– Find φ’ and a random restriction which restricts φ’
to φ, but also kills any complex clause whp.
– Modern form of “bottleneck counting” due to
Haken. Can reinterpret this as finding a large
collection of assignments, one extending each
restriction, each passing through a wide clause.
High Level Overview of Lower Bound
• This work: Multiple classes of complex clauses
– Can state main idea as a pebbling argument.
– Suppose G is a DAG with T vertices that can be
scheduled in space S, and the vertices can be
broken into 4 classes, μ : V → {1,2,3,4}.
– Assume that for arcs (u, v), μ(v) ≤ μ(u) + 1
– All sources in 1, all sinks in 4. Everything in 2 is a
“complex clause” of the first type, everything in 3 is a
“complex clause” of the second type.
High Level Overview of Lower Bound
• Suppose that there is a distribution on
source-to-sink paths π and a real number p s.t.:
– For any vertex v with μ(v) ∈ {2,3}, Pr[v ∈ π] ≤ p.
– For any two vertices v1, v2 with μ(v1) = 2, μ(v2) = 3,
Pr[v1, v2 ∈ π] ≤ p^2.
• Then, T^2 S = Ω(p^{−3}).
High Level Overview of Lower Bound
• Proof:
– Let t_1 … t_T be the sequence of pebbling steps,
using only S pebbles. Divide time into m epochs,
m to be determined later.
– Plot μ vs. time
High Level Overview of Lower Bound
• Let π be any source-to-sink path. Claim:
– Either π hits a vertex of level 2 and a vertex of
level 3 during the same epoch,
– Or π hits a vertex of level 2 or 3 that is in memory
at one of the breakpoints between epochs.
– (Why: take the first vertex w on π with μ(w) ≥ 3; then
μ(w) = 3 and its predecessor u on π has μ(u) = 2. Since u
must be in memory when w is derived, either u is derived
during w’s epoch, or u was already in memory at the
breakpoint starting that epoch.)
High Level Overview of Lower Bound
• Now choose π randomly.
– Either, π hits a vertex of level 2 and a vertex of
level 3 during the same epoch
Probability of this: By a union bound, at most
m ∙ (T/m)^2 ∙ p^2
High Level Overview of Lower Bound
• Now choose π randomly.
– Or, π hits a vertex of level 2 or 3 in memory at one of
the breakpoints between epochs.
Probability of this: By a union bound, at most
m ∙ S ∙ p
High Level Overview of Lower Bound
• Now choose π randomly.
– Conclude 1 ≤ m^{−1} T^2 p^2 + m S p
• Optimizing m yields T^2 S = Ω(p^{−3}) (calculation
spelled out at the end of this slide).
• Previous such results only for expanders, etc.
• Gets stronger when you have more levels.
Later we will see that with more levels, we get an
exponent of roughly log log n / log log log n.
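For completeness, the optimization step mentioned above, written out (my own calculation; it just balances the two bounds from the previous slides):

\[
1 \;\le\; \underbrace{m\left(\tfrac{T}{m}\right)^{2} p^{2}}_{\text{same-epoch pairs}}
\;+\; \underbrace{m S p}_{\text{breakpoints}}
\;=\; \frac{T^{2} p^{2}}{m} + m S p .
\]

Taking m = T p^{1/2} S^{−1/2} makes the two terms equal, so 1 ≤ 2 T p^{3/2} S^{1/2}, i.e. T^2 S = Ω(p^{−3}).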
High Level Overview of Lower Bound
• Goal for remainder of analysis:
– Show that any proof of Tseitin on doubled grid has
such a flow (using restrictions), for the best p and
as many complex clause classes as possible.
– It is enough that any k complex clauses from
distinct classes have collective width ≥ k·W for
some large W. (For the n × l grid, W will be ≈ n.)
– How did we get complex clauses before in [Beame
Pitassi ‘96]? A “measure of progress” argument…
Progress Measure
• The following progress measure construction
appeared in [Ben-Sasson and Wigderson ’01].
• Let A_1 … A_m be a partition of the clauses of an
unsat CNF. Assume it is minimally unsat.
• For any clause C, define
μ(C) ≔ min { |I| : I ⊆ [m], ⋀_{i∈I} A_i ⊨ C }
Progress Measure
• μ is a sub-additive complexity measure:
μ(initial clause) = 1,
μ(⊥) = m,
μ(C) ≤ μ(C_1) + μ(C_2), when C_1, C_2 ⊦ C.
• In any refutation, there must occur a clause C
with μ(C)/m ∈ [1/3, 2/3]. In previous work,
choose A_1 … A_m so that this implies C is wide.
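Why such a medium-complexity clause must occur (a standard argument, reconstructed here rather than quoted from the talk): let C be the first clause in the refutation with μ(C) ≥ m/3; it exists since μ(⊥) = m, and its premises C_1, C_2 occur earlier, so μ(C_i) < m/3 and

\[
\frac{m}{3} \;\le\; \mu(C) \;\le\; \mu(C_1) + \mu(C_2) \;<\; \frac{m}{3} + \frac{m}{3} \;=\; \frac{2m}{3}.
\]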
Progress Measure for Tseitin
• For Tseitin tautologies, the natural choice is to
have one A_v for each vertex v, containing all
associated clauses. So each A_v encodes an F_2 lin. eqn.
• Claim: Let C be any clause, and let S ⊆ V,
S ⊨ C be any subset attaining the min, |S| =
μ(C). Then, δ(S) ⊆ Vars(C)
(δ(S) = the edge boundary of S).
Progress Measure for Tseitin
• Claim: Let C be any clause, and let S ⊆ V,
S ⊨ C, |S| = μ(C). Then δ(S) ⊆ Vars(C).
• Proof:
• If S ⊨ C, then S ⋀ ¬C is an unsat linear system. So
we can derive 1 = 0 by linear combinations.
• If any equation from S has coefficient 0, S wasn’t
minimal. In the sum of the equations of S, the
noncancelling vars are exactly the boundary edge vars.
These must cancel with ¬C to get 1 = 0, so δ(S) ⊆ Vars(C).
Isoperimetry → Lower Bounds
• Consequence: Tseitin formula has minimum
refutation width at least the balanced
bisection width of graph G.
• Corollary: Tseitin on a constant degree
expander requires width Ω(V).
• Corollary: On the n × l grid, needs width ≥ n.
• Here, “Complex clause” ≔ C with μ(C)/m ∈ [1/3, 2/3]
Size Lower Bounds from Width
• We have formulas with width lower bounds,
can use XOR-substitution and restrictions to
get size lower bounds.
• For φ a CSP, φ[⨁] is the CSP over the variable set
with two copies x_1, x_2 of each var x of φ, and
each constraint of φ interpreted as
constraining the parities (x_1 ⨁ x_2) rather than
the variables x. (Then expand as CNF.)
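A sketch of one way to carry out this substitution at the clause level (my own encoding choices, not necessarily the talk's: the copies of variable v get indices 2v−1 and 2v; a clause of width w expands into 2^w clauses of width 2w):

# Sketch of one way to do the substitution at the clause level (my encoding:
# the copies of variable v get indices 2v-1 and 2v).  A clause of width w
# expands into the 2^w clauses of width 2w forbidding every assignment of the
# new variables that falsifies all of its substituted literals.
from itertools import product

def xor_substitute_clause(clause):
    vs = sorted(abs(l) for l in clause)
    new_clauses = []
    for bits in product([0, 1], repeat=len(vs)):   # candidate value of x_{v,1}
        lits = []
        for v, b1 in zip(vs, bits):
            s = 1 if v in clause else 0            # sign of v's literal in the clause
            b2 = b1 ^ (1 - s)                      # forces x_{v,1} xor x_{v,2} = 1 - s
            lits.append(-(2*v - 1) if b1 else (2*v - 1))   # literal falsified by b1
            lits.append(-(2*v) if b2 else (2*v))           # literal falsified by b2
        new_clauses.append(frozenset(lits))        # forbid this falsifying pattern
    return new_clauses

print(len(xor_substitute_clause(frozenset({1, -2}))))      # 4 == 2**2, each of width 4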
Size Lower Bounds from Width
• φ[⨁] has a natural random restriction ρ:
– For each x, independently choose one of x_1, x_2 to
restrict to {0,1} at random. Substitute either x or
¬x for the other, so that (x_1 ⨁ x_2)|ρ = x.
• Previously used in [B’01]. Nice properties:
– φ[⨁]|ρ = φ, always
– Fairly dense, kills all wide clauses whp.
– Black box, doesn’t depend on the proof Π.
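A sketch of sampling this restriction, using the same copy-numbering as the previous sketch (illustrative, not the talk's code):

# Sketch of sampling the restriction rho, with the same copy-numbering
# (x_{v,1} = 2v-1, x_{v,2} = 2v) as the previous sketch.
import random

def sample_restriction(original_vars):
    assignment = {}      # the copy that gets fixed to a random bit
    survivor = {}        # v -> (surviving copy, bit); if bit is True the survivor
                         # stands for NOT x_v, so that (x_{v,1} xor x_{v,2})|rho = x_v
    for v in original_vars:
        copies = [2*v - 1, 2*v]
        random.shuffle(copies)
        kept, killed = copies
        bit = random.random() < 0.5
        assignment[killed] = bit
        survivor[v] = (kept, bit)
    return assignment, survivor

rho, surv = sample_restriction([1, 2, 3])
print(rho, surv)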
Size Lower Bounds from Width
• When φ is a Tseitin tautology, φ[⨁] is
“multigraph-Tseitin” on the same graph but where
each edge has been doubled.
• Suppose Tseitin on the doubled grid has a proof of
size less than 2^{cn}. Then by a union bound,
some restriction kills all clauses of width ≥ cn, so
Tseitin on the grid has a proof of width < cn.
Contradiction, for small enough c.
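Spelling out that union bound (my reconstruction; the constant in the exponent is whatever the clause-killing probability of ρ provides):

\[
\Pr_{\rho}\bigl[\exists\, C \in \Pi \text{ with width}(C) \ge cn \text{ and } C|_{\rho} \neq 1\bigr]
\;\le\; |\Pi| \cdot 2^{-\Omega(cn)}
\;<\; 2^{cn} \cdot 2^{-\Omega(cn)} \;<\; 1
\]

for small enough c, so some ρ satisfies every wide clause of Π, and Π|ρ is a refutation of grid Tseitin of width < cn.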
Additional Complex Clause Levels
• Idea: Instead of just considering C with
μ(C)/m ∈ [1/3, 2/3], what about [1/6, 1/3],
[1/12, 1/6], etc.?
• Gives us the levelled property we need.
• Task: Show that clauses from k different levels
are collectively wide.
• Given the machinery we have, this is some
“extended isoperimetry” property of the grid.
Extended Isoperimetry
• Claim: Let S_1 … S_k be subsets of vertices,
|S_i| ∈ [n^2, |V|/2], of superincreasing sizes.
Then, |δ(S_1) ∪ … ∪ δ(S_k)| ≥ Ω(kn).
• Several simple combinatorial proofs of this.
• Implies clauses from distinct levels are
collectively wide.
Main Lemma
• For any set C of clauses in doubled-Tseitin,
Pr[ C|ρ contains ≥ k complex clauses
of k distinct complexity levels ]
≤ |C|^k ∙ 2^{−Ω(kn)}
Proof: For any k-tuple, Ω(kn) opportunities for
ρ to kill at least one. Union bound over k-tuples.
Complexity vs. Time
• Consider the time ordering of any proof, and
plot complexity of clauses in memory vs. time
[Figure: μ-values of clauses in memory over time; input clauses sit at μ = 1]
• Sub-additivity implies we cannot skip over any
[t, 2t] window of μ-values on the way up: a derived
clause has μ at most twice the larger μ of its premises.
Complexity vs. Time
• Consider the time ordering of any proof, and
divide time into m equal epochs (fix m later)
Two Possibilities
• Consider the time ordering of any proof, and
divide time into m equal epochs (fix m later)
• Either, a clause of medium complexity is in
memory at one of the breakpoints between epochs,
Two Possibilities
• Consider the time ordering of any proof, and
divide time into m equal epochs (fix m later)
• Or, all breakpoints have only High and Low. Then some
epoch starts Low and ends High, and so
contains clauses of all ≈ log n Medium levels.
• Consider the restricted proof. With probability
1, one of the two scenarios applies.
• For the first scenario, N = mS clauses (at most S at
each of the m breakpoints), so
Pr[1st scenario] ≤ mS ∙ exp(−Ω(n))
• For the second scenario, m epochs with N = T/m
clauses each. Union bound and main lemma:
Pr[2nd scenario] ≤ m ∙ (T/m)^k ∙ exp(−Ω(kn)),
where k ≈ log n
• Optimizing m yields something of the form
T ∙ S ≥ exp(Ω(n)).
• Can get a nontrivial tradeoff this way, but
need to do a lot of work on constants.
Better tradeoff
• Don’t just divide into epochs once
• Recursively divide proof into epochs and
sub-epochs, where each sub-epoch contains m
sub-epochs of the next smaller size
• Prove that, if an epoch does a lot of work, then either
– its breakpoints contain many complexities, or
– some sub-epoch does a lot of work
An even better tradeoff

• If an epoch contains a clause at the end of
level l, but every clause at the start is of level
≤ l − k (so the epoch makes k levels of progress),
• and the breakpoints of its m child epochs
together contain ≤ k/2 complexity levels,
• then some child epoch makes ≥ k/(2m) progress.
[Recursion-tree figure: each internal node corresponds to a set of ≈ mS
breakpoint clauses; each leaf node to a set of ≈ T/m^(h−1) clauses,
where h is the recursion depth.]
Previous slide shows: some such set has ≥ k
complexities, where k ∙ h = log n
An even better tradeoff
• Choose h = log log n, have k ∙ h = log n, so k = ω(1)
• Choose m so all sets are the same size:
T/m^(h−1) ≈ mS, so all events are rare together.
• Have ≈ m^h sets in total
Finally, a union bound yields T ≥
(2^{Ω(n)}/S)^{Ω(log log n / log log log n)}
Final Tradeoff
• Tseitin formulas on the n × l grid for l = 2^{4n}
– are of size N ≈ n ∙ 2^{4n}
– have resolution refutations in Size, Space ≈ 2^{O(n)}
– have O(n log l) space refutations of Size 2^{O(n log l)}
– and any resolution refutation for Size T and
Space S requires T > (2^{0.58 n}/S)^{ω(1)}
If space is at most 2^{n/2} then size blows up
by a super-polynomial amount
Open Questions
• More than quasi-polynomial separations?
– For Tseitin formulas upper bound for small space
is only a log n power of the unrestricted size
– Candidate formulas? Are these even possible?
• Other proof systems?
– In [BNT’12], got Polynomial Calculus Resolution.
– Cutting Planes? Frege subsystems?
• Other cases for separating search paradigms:
“dynamic programming” vs. “binary search”?
Extended Isoperimetric Inequality
If the sets aren’t essentially blocks, we’re done.
If they are blocks, reduce to the line:
Intervals on the line
• Let [a_1, b_1], … , [a_k, b_k] be intervals on the
line, such that b_i − a_i ≥ 2(b_{i−1} − a_{i−1})
• Let f(k) be the minimum number of distinct
endpoints of intervals in such a configuration.
• Then, a simple inductive proof shows f(k) = Ω(k).
Proof DAG
On every root-to-leaf path,
no variable is resolved more than once.
Tradeoffs for Regular Resolution
• Theorem: For any k, 4-CNF formulas (Tseitin
formulas on long and skinny grid graphs)
of size n with
– Regular resolution refutations in size n^{k+1},
Space n^k.
– But with Space only n^{k−ε}, for any ε > 0, any regular
resolution refutation requires size at least
n^{Ω(log log n / log log log n)}.
Regular Resolution
• Can define partial information more precisely
• Complexity is monotonic wrt proof DAG
edges. This part uses regularity assumption,
simplifies arguments with complexity plot.
• Random Adversary selects random
assignments based on proof
– No random restrictions, conceptually clean and
don’t lose constant factors here and there.
