### pptx - Grigory Yaroslavtsev

**Property Testing and Communication Complexity**
Grigory Yaroslavtsev
http://grigory.us
#### Property Testing
[Goldreich, Goldwasser, Ron, Rubinfeld, Sudan]

Randomized algorithm:
• YES instance ⇒ accept with probability ≥ 2/3
• NO instance ⇒ reject with probability ≥ 2/3

Property tester:
• YES instance ⇒ accept with probability ≥ 2/3
• ε-far from YES ⇒ reject with probability ≥ 2/3
• otherwise ⇒ don't care

ε-far: at least an ε fraction of the input has to be changed to become a YES instance.

#### Property Testing
[Goldreich, Goldwasser, Ron, Rubinfeld, Sudan]

Property P = set of YES instances.
Query complexity of ε-testing P:
• Q^1(ε) = non-adaptive complexity (all queries at once)
• Q^r(ε) = complexity with queries made in r rounds (Q(ε) = fully adaptive complexity)
For success probability 1 − δ: Q(ε, δ) = O(Q(ε) · log(1/δ))
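The error-reduction step Q(ε, δ) = O(Q(ε) · log(1/δ)) is plain majority voting over independent runs of a constant-error tester. A minimal sketch (the toy tester `noisy_yes` and the Chernoff constant 18 are illustrative choices, not from the slides):

```python
import math
import random

random.seed(0)

def amplify(tester, delta):
    """Run a tester with error <= 1/3 an odd number O(log 1/delta) of times
    and take a majority vote; by a Chernoff bound the error drops below delta."""
    reps = 18 * math.ceil(math.log(1 / delta)) + 1   # odd, so no ties
    votes = sum(tester() for _ in range(reps))
    return votes > reps // 2

# toy tester for a YES instance: answers correctly with probability 2/3
def noisy_yes():
    return random.random() < 2 / 3

# the amplified tester errs with probability <= delta = 0.01
hits = sum(amplify(noisy_yes, 0.01) for _ in range(100))
```

The query cost multiplies by the number of repetitions, which is where the log(1/δ) factor comes from.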
#### Communication Complexity [Yao'79]

Alice holds x, Bob holds y; using shared randomness they exchange messages
m_1(x), m_2(m_1, y), m_3(m_2, x), … to compute f(x, y).
• R(f) = min. communication to compute f (error 1/3)
• R^r(f) = min. r-round communication (error 1/3)
#### k/2-Disjointness ⇒ k-Linearity [Blais, Brody, Matulef'11]

• k-linear function: f: {0,1}^n → {0,1} with f(x) = ⊕_{i∈S} x_i = x_{i_1} ⊕ x_{i_2} ⊕ ⋯ ⊕ x_{i_k}, where |S| = k
• k/2-Disjointness: S, T ⊆ [n] with |S| = |T| = k/2; Disj(S, T) = 1 iff |S ∩ T| = 0

Reduction: Alice (holding S ⊆ [n], |S| = k/2) and Bob (holding T ⊆ [n], |T| = k/2) define
h = f_S ⊕ f_T, where f_S = ⊕_{i∈S} x_i and f_T = ⊕_{i∈T} x_i.
• S ∩ T = ∅ ⇒ h is k-linear
• S ∩ T ≠ ∅ ⇒ h is (< k)-linear, hence ½-far from k-linear
• Test h for k-linearity using shared randomness
• To evaluate h(x), exchange f_S(x) and f_T(x) (2 bits per query)

⇒ R(k/2-Disjointness) ≤ 2 · Q_{1/2}(k-Linearity)
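The two cases of the reduction can be checked exhaustively at a toy size. In the sketch below, the names `parity` and `h_of` and the choice n = 6, k = 4 are mine; it verifies that disjoint sets make h exactly k-linear, while intersecting sets leave h ½-far from every k-linear function:

```python
from itertools import combinations, product

n, k = 6, 4
pts = list(product([0, 1], repeat=n))

def parity(S, x):                       # f_S(x) = XOR of the bits indexed by S
    return sum(x[i] for i in S) % 2

def dist(f, g):                         # fraction of inputs where f, g disagree
    return sum(f(x) != g(x) for x in pts) / len(pts)

def h_of(S, T):                         # one query to h costs 2 exchanged bits
    return lambda x: parity(S, x) ^ parity(T, x)

# disjoint sets: h is exactly the k-linear function over S ∪ T
S, T = {0, 1}, {2, 3}
exactly_k_linear = dist(h_of(S, T), lambda x: parity(S | T, x)) == 0

# intersecting sets: h is 1/2-far from *every* k-linear function
S, T = {0, 1}, {1, 2}
gap = min(dist(h_of(S, T), lambda x, U=frozenset(U): parity(U, x))
          for U in combinations(range(n), k))
```

The ½ gap is just the fact that two distinct parities disagree on exactly half of all inputs.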
#### k-Disjointness

• R(k-Disjointness) = Θ(k) [Razborov; Hastad, Wigderson]
• R^1(k-Disjointness) = Θ(k log k) [Folklore + Dasgupta, Kumar, Sivakumar; Buhrman'12; Garcia-Soriano, Matsliah, De Wolf'12]
• R^r(k-Disjointness) = Θ(k · ilog_r k), where ilog_r k = log log … log k (r times) [Saglam, Tardos'13]

⇒ Q^r(k-Linearity) = Ω(k · ilog_r k), while Q^1(k-Linearity) = O(k log k)
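The iterated logarithm ilog_r k can be sketched directly; the point of the [Saglam, Tardos'13] bound is how quickly a few extra rounds flatten the per-element overhead:

```python
import math

def ilog(r, k):
    """ilog_r k: the base-2 logarithm applied r times to k."""
    for _ in range(r):
        k = math.log2(k)
    return k

# for k = 2^16 the overhead per element collapses round by round
vals = [ilog(r, 2 ** 16) for r in (1, 2, 3)]   # 16.0, 4.0, 2.0
```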
• R(k-Disjointness) = (2/ln 2)·k + o(k) [Braverman, Garg, Pankratov, Weinstein'13]
• R^r(k-Intersection) = Ω(k · ilog_r k) [Brody, Chakrabarti, Kondapally, Woodruff, Y.]
#### Communication Direct Sums

"Solving m copies of a communication problem requires m times more communication":
R(f^m) = Ω(m · R(f))
• Known in various forms for arbitrary f [… Braverman, Rao'10; Barak, Braverman, Chen, Rao'11, …]
• In general, one can't go beyond this: for Eq(x, y) = 1 iff x = y, where x, y ∈ {0,1}^n,
R(Eq) = O(1) and R(Eq^m) = O(m)
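The O(1) bound for Equality with shared randomness comes from random inner-product fingerprints; a sketch (parameter names and the fixed shared seed are mine):

```python
import random

def equality(x, y, t=20, shared_seed=7):
    """Public-coin Equality protocol. For t shared random vectors r, Alice
    sends <r, x> mod 2 (t bits total) and Bob compares with his own inner
    products. A mismatch proves x != y; if x != y, each round detects it
    with probability 1/2, so the error is at most 2^-t."""
    rng = random.Random(shared_seed)        # the shared random tape
    for _ in range(t):
        r = [rng.randrange(2) for _ in x]
        if sum(ri & xi for ri, xi in zip(r, x)) % 2 != \
           sum(ri & yi for ri, yi in zip(r, y)) % 2:
            return False
    return True

x = [1, 0, 1, 1, 0, 0, 1, 0]
same = equality(x, list(x))                 # always accepts when x == y
diff = equality(x, [1 - x[0]] + x[1:])      # rejects w.h.p. when x != y
```

With t fixed to a constant, the cost is O(1) bits regardless of n, which is exactly why stronger-than-m direct sums for Eq^m need the one-way/per-copy-error formulation.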
#### Specialized Communication Direct Sums

Information cost ≤ communication complexity:
• R(Disjointness) = Ω(n) [Bar-Yossef, Jayram, Kumar, Sivakumar'01], using
Disjointness(x, y) = ⋀_i (¬x_i ∨ ¬y_i)
• Stronger direct sum for Equality-type problems (a.k.a. "union bound is optimal") [Molinaro, Woodruff, Y.'13]:
R^1(f^m) = Ω(m log m · R^1(f))
• Bounds for R^1(Eq^m) and R^1(k-Set Intersection) via information theory [Brody, Chakrabarti, Kondapally, Woodruff, Y.'13]
#### Direct Sums in Property Testing [Woodruff, Y.]

• Testing linearity: f is linear if f = ⊕_{i∈S} z_i for some S
• Equality: x, y ⊆ [n]; decide whether x = y

Reduction: h = f_x ⊕ f_y, where for a set x ⊆ [n]
f_x = ⊕_{i∈x} (z_{2i−1} ∧ z_{2i}),
and likewise f_y = ⊕_{i∈y} (z_{2i−1} ∧ z_{2i}).
• x = y ⇒ h ≡ 0 is linear
• x ≠ y ⇒ h is ¼-far from linear
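For a toy n the gadget can be verified by brute force. The sketch below (helper names and n = 3 are mine, with 0-indexed bit pairs) confirms that x = y makes h linear while x ≠ y leaves it at least ¼ away from every parity:

```python
from itertools import product

n = 3                                   # sets live in [n]; h acts on 2n bits
pts = list(product([0, 1], repeat=2 * n))

def f(S, z):                            # f_S(z) = XOR over i in S of (z_{2i} AND z_{2i+1})
    return sum(z[2 * i] & z[2 * i + 1] for i in S) % 2

def dist_to_linear(h):                  # distance to the nearest parity
    best = 1.0
    for mask in pts:                    # one parity per subset of the 2n bits
        d = sum(h(z) != sum(m & b for m, b in zip(mask, z)) % 2
                for z in pts) / len(pts)
        best = min(best, d)
    return best

x, y = {0, 2}, {0, 2}                   # x == y: h is identically 0, linear
d_eq = dist_to_linear(lambda z: f(x, z) ^ f(y, z))

x, y = {0, 2}, {0, 1}                   # x != y: h is at least 1/4-far
d_neq = dist_to_linear(lambda z: f(x, z) ^ f(y, z))
```

Each surviving AND gadget z_{2i} ∧ z_{2i+1} is ¼-far from linear on its own, which is what drives the gap.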
#### Direct Sums in Property Testing [Woodruff, Y.]

• R^1_δ(Eq) = Ω(log 1/δ) ⇒ Q_{1/4, δ}(Linearity) = Ω(log 1/δ) (matching [Blum, Luby, Rubinfeld])
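The matching upper bound is the BLR test itself; a sketch (the far-from-linear example `and2` and the repetition count are my choices):

```python
import random

def blr_test(f, n, reps, rng):
    """BLR linearity test [Blum, Luby, Rubinfeld]: check
    f(x) xor f(y) == f(x xor y) on random pairs. Linear functions always
    pass; a function far from linear fails each check with probability
    related to its distance, so O(1/eps) repetitions suffice."""
    for _ in range(reps):
        x = [rng.randrange(2) for _ in range(n)]
        y = [rng.randrange(2) for _ in range(n)]
        if f(x) ^ f(y) != f([a ^ b for a, b in zip(x, y)]):
            return False
    return True

rng = random.Random(3)
parity3 = lambda z: (z[0] + z[1] + z[2]) % 2     # 3-linear: always accepted
and2 = lambda z: z[0] & z[1]                      # 1/4-far from linear

ok_linear = blr_test(parity3, 8, reps=60, rng=rng)
ok_far = blr_test(and2, 8, reps=60, rng=rng)
```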
• Strong direct sum for Equality [MWY'13] ⇒ strong direct sum for testing linearity:
Q^1(Lin^m) ≥ ½ · R^1(Eq^m) = Ω(m log m) = Ω(m log m · Q^1(Lin))
#### Property Testing Direct Sums [Goldreich'13]

• Direct sum [Woodruff, Y.]: solve all m instances of P with probability ≥ 2/3
• Direct m-sum [Goldreich'13]: solve the m instances with probability ≥ 2/3 per instance
• Direct m-product [Goldreich'13]: are all m instances in P, or is some instance ε-far from P?
#### [Goldreich'13]

For all properties P:
• Direct m-sum (solve all, w.p. 2/3 per instance):
Q_ε(P^m) = Θ(m · Q_ε(P)), and Q^1_ε(P^m) = Θ(m · Q^1_ε(P))
• Direct m-product (all in P vs. ∃ ε-far instance?):
Q_ε(P^m) = Θ(m · Q_ε(P)), and Ω(m · Q^1_ε(P)) = Q^1_ε(P^m) = O(m log m · Q^1_ε(P))
#### Reduction from Simultaneous Communication [Woodruff]

Alice sends m_A(x) and Bob sends m_B(y) to a referee, who outputs f(x, y).
• S(f) = min. simultaneous communication complexity of f
• R^{1,→}(f), R^{1,←}(f) ≤ S(f)
• GAF: {0,1}^{2n + 2 log n} → {0,1} [Babai, Kimmel, Lokam]:
GAF(x, y, i, j) = x_i ⊕ y_j if x = y, and 0 otherwise
• R^{1,→}(GAF) = O(log n), but S(GAF) = Ω(√n)
#### Property testing lower bounds via CC

• Monotonicity, juntas, low Fourier degree, small decision trees [Blais, Brody, Matulef'11]
• Small-width OBDD properties [Brody, Matulef, Wu'11]
• Codes [Goldreich'13; Gur, Rothblum'13]
• Number of relevant variables [Ron, Tsur'13]
All of these are for functions over the Boolean hypercube.
#### Functions over [n]^d

M_{n,d} = monotone functions over [n]^d:
Q^1(M_{n,d}) = Ω(d log n)

Previously, for monotonicity on the line (d = 1):
• Q^1(M_{n,1}) = Θ(log n) [Ergun, Kannan, Kumar, Rubinfeld, Viswanathan'00]
• Q(M_{n,1}) = Ω(log n) [Fischer'04]
• Thm. Any non-adaptive tester for monotonicity of f: [n] → [r] has complexity Ω(min(log n, log r)).
• Proof:
  – Reduction from Augmented Index
  – Basis of Walsh functions
• Augmented Index: Alice holds S ⊆ [n] and sends one message m(S); Bob, holding i ∈ [n] and S ∩ [i − 1], must decide whether i ∈ S.
• R^1(Augmented Index) = Ω(n) [Miltersen, Nisan, Safra, Wigderson'98]
Walsh functions: for S ⊆ [n], w_S: [2^n] → {−1, 1} is
w_S(x) = ∏_{i∈S} (−1)^{x_i},
where x_i is the i-th bit of x.
(The slide plots w_S for several choices of S.)
Step functions: for i ∈ [n], s_i: [2^n] → [2^{n−i}] is
s_i(x) = ⌈x / 2^i⌉.
(The slide plots s_2.)
• Augmented Index ⇒ monotonicity testing:
h = 2·s_i + w_{S ∩ {i, …, n}}, where Alice holds S ⊆ [n] and Bob holds i ∈ [n] and S ∩ [i − 1]
• i ∉ S ⇒ h is monotone
• i ∈ S ⇒ h is ¼-far from monotone
• Thus, Q^1(M_{n,1}) = Ω(log n)
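Both cases of the construction can be checked by brute force for small n. The sketch below uses 0-indexed bits of x ∈ {0, …, 2^n − 1} (an illustrative convention, not necessarily the slides' exact indexing) and computes the distance to monotone via the longest nondecreasing subsequence:

```python
import bisect

n = 4
X = range(2 ** n)

def walsh(S, x):                       # w_S(x) = prod over i in S of (-1)^{x_i}
    return (-1) ** sum((x >> i) & 1 for i in S)

def step(i, x):                        # constant on dyadic blocks of length 2^{i+1}
    return x >> (i + 1)

def h_vals(S, i):                      # h = 2*s_i + w_{S restricted to bits >= i}
    return [2 * step(i, x) + walsh({j for j in S if j >= i}, x) for x in X]

def dist_to_monotone(vals):
    """Min fraction of points to change = 1 - (longest nondecreasing
    subsequence) / |domain|, via patience sorting with bisect_right."""
    tails = []
    for v in vals:
        pos = bisect.bisect_right(tails, v)
        if pos == len(tails):
            tails.append(v)
        else:
            tails[pos] = v
    return 1 - len(tails) / len(vals)

d_out = dist_to_monotone(h_vals({1, 3}, 2))   # i = 2 not in S: h is monotone
d_in = dist_to_monotone(h_vals({1, 3}, 1))    # i = 1 in S: h is 1/4-far
```

Intuitively, when i ∈ S the Walsh factor flips sign inside each block on which the step term is constant, forcing a descent that no monotone function can match on a quarter of the domain.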
#### Functions over [n]^d

• M_{n,d} = monotone functions over [n]^d: Q^1(M_{n,d}) = Ω(d log n)
• Lip_{n,d} = c-Lipschitz functions over [n]^d
• SConv_{n,d} = separately convex functions over [n]^d
• Conv_{n,d} = convex functions over [n]^d

Thm. [BRY] For all of these properties, Q^1 = Ω(d log n).
These bounds are optimal for monotone and Lipschitz functions.