Remove polylogarithmic dimension factors in high-dimensional Berry--Esseen bounds for $m$-dependent sums
Sourced from the work of Heejong Bong, Arun Kumar Kuchibhotla, Alessandro Rinaldo
§ Problem Statement
Setup
Let , , and let be random vectors in . Assume:
-
are mean-zero: for all .
-
are -dependent, meaning that for every , the sigma-fields and are independent.
-
A uniform -moment bound holds: for some finite constant , .
-
For
is nondegenerate in coordinates, e.g. for some .
Let , and let be the class of axis-aligned hyper-rectangles
Define the rectangle Kolmogorov distance
Unsolved Problem
Under the assumptions above, does there exist a constant depending only on fixed model parameters (for example only on , , and ), but independent of , such that for all and all such -dependent arrays,
that is, with no additional multiplicative factor?
§ Discussion
§ Significance & Implications
Bong, Kuchibhotla, and Rinaldo’s arXiv record is currently 2306.14299v3 (latest arXiv version; revised 2025-08-29), and the work is listed as accepted at Annals of Statistics (2025). Their bounds are sharp in and up to logarithmic factors in ; removing (or proving unavoidable) these dimension-log factors would clarify optimal high-dimensional Gaussian approximation rates under -dependence.
§ Known Partial Results
Bong et al. (2025): This paper proves sharp high-dimensional bounds with only polylogarithmic dependence on and optimal / scaling . In univariate settings, matching optimal rates are known (up to logs as stated in the abstract).
§ References
Dual Induction CLT for High-dimensional m-dependent Data
Heejong Bong, Arun Kumar Kuchibhotla, Alessandro Rinaldo (2025)
Annals of Statistics (accepted, 2025)
📍 Section 3 (Discussion), item 2, p. 14 (arXiv v3 manuscript).
Source paper where this problem appears; latest arXiv version is v3 (revised 2025-08-29).