§ Problem Statement
Setup
Let $p = (p_1, p_2, \dots)$ be a nonincreasing sequence with $0 \le p_j \le \tfrac12$ for all $j$ and $p_j \to 0$ as $j \to \infty$. For each $n \ge 1$, let $X_1, X_2, \dots$ be independent random variables with $X_j \sim \mathrm{Bin}(n, p_j)$. Define the centered empirical errors $X_j/n - p_j$ and
$$\Delta_n(p) = \mathbb{E}\, \sup_{j \ge 1} \left| \frac{X_j}{n} - p_j \right|.$$
Using natural logarithms, define the complexity parameters
$$\Sigma_1(p) = \sup_{j \ge 1} p_j \ln(j+1) \qquad \text{and} \qquad \Sigma_2(p) = \sup_{j \ge 1} \frac{\ln(j+1)}{\ln(1/p_j)},$$
with the convention that a term with $p_j = 0$ contributes $0$.
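As a concrete numeric sketch, the two parameters can be evaluated on a truncated sequence. The definitions $\Sigma_1 = \sup_j p_j \ln(j+1)$ and $\Sigma_2 = \sup_j \ln(j+1)/\ln(1/p_j)$, the truncation level `J`, and the example sequence are all illustrative assumptions of this sketch, not taken from the note:

```python
import numpy as np

# Illustrative truncated sequence (an assumption for this sketch):
# p_j = 1 / (10 * ln(j + 1)), j = 1..J -- nonincreasing, in (0, 1/2], -> 0.
J = 10_000
j = np.arange(1, J + 1)
p = 1.0 / (10.0 * np.log(j + 1))

# Complexity parameters with natural logarithms:
#   Sigma1 = sup_j p_j * ln(j + 1),  Sigma2 = sup_j ln(j + 1) / ln(1 / p_j).
sigma1 = float(np.max(p * np.log(j + 1)))
sigma2 = float(np.max(np.log(j + 1) / np.log(1.0 / p)))

print(sigma1)  # ~0.1 here: every term p_j * ln(j+1) equals 1/10
print(sigma2)  # attained at j = J for this slowly decaying sequence
```

Note that the supremum defining $\Sigma_2$ is attained in the tail for slowly decaying sequences, so a truncated computation only approximates it from below.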
Cohen and Kontorovich (COLT 2023) proved that for all $n \ge 2$,
$$\Delta_n(p) \le C \left( \sqrt{\frac{\Sigma_1(p)}{n}} + \frac{\Sigma_2(p) \ln n}{n} \right)$$
for an absolute constant $C$.
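The left-hand side $\Delta_n = \mathbb{E}\sup_j |X_j/n - p_j|$ can be approximated by Monte Carlo on a truncated sequence. The sequence, truncation level, and sample sizes below are illustrative assumptions of this sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative truncated instance (assumptions for this sketch):
# p_j = 1 / (5 * (j + 1)), nonincreasing, bounded by 1/10, tending to 0.
J, n, trials = 2_000, 500, 200
p = 1.0 / (5.0 * (np.arange(1, J + 1) + 1.0))

# Monte Carlo estimate of Delta_n = E sup_j |X_j / n - p_j|,
# with independent X_j ~ Bin(n, p_j), averaged over `trials` repetitions.
X = rng.binomial(n, p, size=(trials, J))
delta_n = float(np.abs(X / n - p).max(axis=1).mean())

print(delta_n)
```

Truncation only underestimates the supremum, so such an experiment can illustrate the rate but cannot by itself distinguish the $\ln n$ and $\ln$-free bounds.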
Unsolved Problem
Is the factor $\ln n$ necessary? Does there exist an absolute constant $C'$ such that for all such sequences $p$ and all $n \ge 1$,
$$\Delta_n(p) \le C' \left( \sqrt{\frac{\Sigma_1(p)}{n}} + \frac{\Sigma_2(p)}{n} \right)?$$
§ Discussion
§ Significance & Implications
This problem asks for the sharp (up to absolute constants) finite-sample behavior of the expected sup-norm estimation error when simultaneously estimating countably many Bernoulli means with nonuniform, decaying marginals. The current best general upper bound matches the known lower-bound scalings in $\Sigma_1$ and $\Sigma_2$ except for a multiplicative $\ln n$ factor in the second term; removing it would yield a rate simultaneously consistent with both asymptotic lower bounds and would sharpen sample-size guarantees in regimes where the $\Sigma_2/n$ term dominates the error.
§ Known Partial Results
Cohen and Kontorovich (2023): (Characterization of consistency) $\Delta_n(p) \to 0$ as $n \to \infty$ if and only if $\Sigma_2(p) < \infty$, i.e., if and only if $p_j$ decays at least polynomially in $j$.
Cohen and Kontorovich (2023): (Finite-sample upper bound) For all $n \ge 2$, they prove $\Delta_n(p) \le C\bigl(\sqrt{\Sigma_1(p)/n} + \Sigma_2(p)\ln n / n\bigr)$ for an absolute constant $C$.
Cohen and Kontorovich (2023): (Asymptotic lower bounds) They provide lower bounds of the form $\liminf_{n\to\infty} \sqrt{n}\,\Delta_n(p) \ge c\sqrt{\Sigma_1(p)}$ and $\liminf_{n\to\infty} n\,\Delta_n(p) \ge c\,\Sigma_2(p)$ (with a universal constant $c > 0$), showing that the dependence on $\Sigma_1$ and $\Sigma_2$ is asymptotically necessary up to absolute constants.
Cohen and Kontorovich (2023): (Statistical interpretation) The binomial formulation corresponds to the coordinate-wise empirical-mean error, in the sup norm $\|\cdot\|_\infty$, for product Bernoulli measures on $\{0,1\}^{\mathbb N}$: for $n$ i.i.d. samples $X^{(1)}, \dots, X^{(n)} \sim \mu$ with coordinate means $\mu_j = \mathbb{E} X^{(1)}_j$, one has $n\,\hat\mu_n(j) = \sum_{i=1}^n X^{(i)}_j \sim \mathrm{Bin}(n, \mu_j)$ independently across $j$, and after sorting the coordinates by decreasing $\mu_j$ this reduces to the stated setting.
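The reduction can be checked numerically: summing $n$ i.i.d. Bernoulli($p_j$) coordinates gives exactly a $\mathrm{Bin}(n, p_j)$ count, so both formulations yield the same sup-norm error on the same sample. The sizes and the mean sequence below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sizes and coordinate means (assumptions for this sketch).
J, n = 1_000, 300
p = 0.5 / np.arange(1, J + 1)  # nonincreasing means in (0, 1/2], -> 0

# Product-Bernoulli formulation: n i.i.d. samples from the product measure
# on {0,1}^J, then coordinate-wise empirical means.
samples = rng.binomial(1, p, size=(n, J))
mu_hat = samples.mean(axis=0)
err_product = float(np.abs(mu_hat - p).max())

# Binomial formulation: the coordinate-wise counts are Bin(n, p_j) draws
# and give the same empirical means on the same sample.
counts = samples.sum(axis=0)
err_binomial = float(np.abs(counts / n - p).max())

print(err_product, err_binomial)  # agree up to floating point
```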
§ References
Open problem: log(n) factor in "Local Glivenko-Cantelli"
Doron Cohen, Aryeh Kontorovich (2023)
Conference on Learning Theory (COLT), PMLR 195
📍 Open-problem note in COLT proceedings.