METRIC 2011 Trimester at Institut Henri Poincaré (Paris, France)
Workshop on Expanders and derandomization (March 21-25, 2011)
Mar 24, 15:00-16:00 - Raghu Meka (U. Texas, Austin)
Pseudorandom Generators for Combinatorial Shapes
----
We construct pseudorandom generators for combinatorial shapes, which substantially generalize combinatorial rectangles, 0/1 halfspaces, and 0/1 modular sums. A function f : [m]^n -> {0,1} is an (m,n)-combinatorial shape if there exist sets A_1, \ldots, A_n \subseteq [m] and a symmetric function h : {0,1}^n -> {0,1} such that f(x_1, \ldots, x_n) = h(1_{A_1}(x_1), \ldots, 1_{A_n}(x_n)).
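To make the definition concrete, here is a minimal Python sketch (not from the paper) of evaluating a combinatorial shape. Since h is symmetric in its 0/1 inputs, it is determined by how many indicators equal 1, so it is represented below as a function of that count; the parameters in the example are hypothetical and only illustrate the three special cases mentioned above.

# A minimal sketch (not from the paper) of evaluating an (m,n)-combinatorial shape.
# Because h is symmetric in its 0/1 inputs, it is determined by how many of the
# indicators 1_{A_i}(x_i) equal 1, so we represent h as a function of that count.
from typing import Callable, Sequence, Set

def combinatorial_shape(sets: Sequence[Set[int]],
                        h: Callable[[int], int]) -> Callable[[Sequence[int]], int]:
    """Return f with f(x_1, ..., x_n) = h(1_{A_1}(x_1), ..., 1_{A_n}(x_n))."""
    def f(x: Sequence[int]) -> int:
        count = sum(1 for xi, Ai in zip(x, sets) if xi in Ai)
        return h(count)
    return f

# Hypothetical parameters illustrating the three special cases above:
# a combinatorial rectangle, a 0/1 halfspace (threshold), and a 0/1 modular sum.
n, m = 4, 4
sets = [{0, 2}, {1}, {0, 1}, {2, 3}]                           # A_1, ..., A_n subsets of [m]
rectangle = combinatorial_shape(sets, lambda c: int(c == n))   # all indicators are 1
halfspace = combinatorial_shape(sets, lambda c: int(c >= 2))   # at least 2 indicators are 1
modsum    = combinatorial_shape(sets, lambda c: c % 2)         # parity of the indicators

print(rectangle([0, 1, 1, 3]), halfspace([0, 1, 1, 3]), modsum([0, 1, 1, 3]))  # 1 1 0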
Our generator uses seed length O(\log m + \log n + \log^2(1/\eps)) to achieve error \eps. When m = 2, this gives the first generator with seed length O(\log n) that fools all weight-based tests, meaning that the distribution of the weight of the output on any subset of coordinates is \eps-close in statistical distance to the appropriate binomial distribution.
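The following sketch spells out what a weight-based test is in the m = 2 case: for every subset S of coordinates, the weight of the generator's output on S should be \eps-close in statistical distance to Binomial(|S|, 1/2). The generator gen below is a placeholder, not the construction from the talk.

# Statistical distance between the weight distribution on a subset S under a
# (placeholder) generator gen : {0,1}^s -> {0,1}^n and the Binomial(|S|, 1/2)
# distribution that truly random bits would produce.
from itertools import product
from collections import Counter
from math import comb

def weight_test_error(gen, seed_bits: int, S: set) -> float:
    counts = Counter()
    for seed in product([0, 1], repeat=seed_bits):
        x = gen(seed)                            # output in {0,1}^n
        counts[sum(x[i] for i in S)] += 1
    total = 2 ** seed_bits
    k = len(S)
    binom = [comb(k, w) / 2 ** k for w in range(k + 1)]
    return 0.5 * sum(abs(counts[w] / total - binom[w]) for w in range(k + 1))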
For our proof we give a simple lemma that allows us to convert closeness in Kolmogorov (cdf) distance to closeness in statistical distance. As a corollary of our technique, we give an alternative proof of a variant of the classical central limit theorem showing convergence in statistical distance rather than the usual Kolmogorov distance.
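As an illustration of why such a conversion needs a lemma at all, here is a small sketch (illustrative only, not the lemma itself) computing both distances for distributions on {0, \ldots, 2k+1}; two distributions can have nearly identical cdfs while their statistical distance is as large as possible.

# Kolmogorov distance compares cdfs pointwise; statistical (total variation)
# distance is half the l1 distance between the pmfs.
from itertools import accumulate

def kolmogorov_distance(p, q):
    return max(abs(cp - cq) for cp, cq in zip(accumulate(p), accumulate(q)))

def statistical_distance(p, q):
    return 0.5 * sum(abs(pt - qt) for pt, qt in zip(p, q))

# p uniform on the even points and q uniform on the odd points of {0, ..., 2k+1}:
# the cdfs never differ by more than 1/(k+1), yet the supports are disjoint,
# so the statistical distance is 1.
k = 50
p = [1 / (k + 1) if t % 2 == 0 else 0.0 for t in range(2 * k + 2)]
q = [1 / (k + 1) if t % 2 == 1 else 0.0 for t in range(2 * k + 2)]
print(kolmogorov_distance(p, q), statistical_distance(p, q))   # ~0.0196 vs 1.0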
Joint work with Parikshit Gopalan, Omer Reingold and David Zuckerman.