Partially Resolved

Adaptive Minimax Nonparametric Hypothesis Testing

§ Problem Statement

Setup

Fix a dimension $d \geq 1$, constants $L>0$, $\alpha,\beta \in (0,1)$, and a known baseline function $f_0 \in L_2([0,1]^d)$. In the periodic Gaussian white-noise model on $\mathbb{T}^d=[0,1]^d$,

$$dY(x)=f(x)\,dx+n^{-1/2}\,dW(x), \qquad x\in \mathbb{T}^d,$$

let

$$\mathcal W_2^s(L)=\left\{f\in L_2(\mathbb{T}^d):\sum_{k\in\mathbb{Z}^d}(1+\|k\|_2^2)^s |\theta_k(f)|^2\le L^2\right\}, \qquad s>0,$$

where $\theta_k(f)$ denotes the $k$-th Fourier coefficient of $f$,

and test

$$H_0:f=f_0 \quad\text{vs.}\quad H_1(s,\rho): f\in \mathcal W_2^s(L),\ \|f-f_0\|_{L_2}\ge \rho.$$

For fixed $s$, the non-adaptive minimax separation radius is known to satisfy

$$\rho_n^*(s)\asymp n^{-2s/(4s+d)}.$$
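This rate can be recovered from the standard bias-variance heuristic for quadratic-functional tests (a sketch with constants suppressed, not a proof). Truncate at frequency $M$ and use the centered chi-square statistic

$$T_M=\sum_{\|k\|_\infty\le M}\Big(|y_k-\theta_k(f_0)|^2-n^{-1}\Big),$$

where $y_k$ are the observed Fourier coefficients. Under $H_0$ the standard deviation of $T_M$ is of order $M^{d/2}/n$, while under the alternative

$$\mathbb{E}_f\, T_M \;\ge\; \|f-f_0\|_{L_2}^2 - L^2 M^{-2s},$$

since the ellipsoid constraint allows at most $L^2M^{-2s}$ of the squared distance to hide above frequency $M$. Detection is therefore possible once

$$\rho^2 \;\gtrsim\; \frac{M^{d/2}}{n} + L^2 M^{-2s},$$

and balancing the two terms at $M \asymp n^{2/(4s+d)}$ yields $\rho \asymp n^{-2s/(4s+d)}$.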

Classical part (substantially understood): for several compact smoothness-range formulations (typically $s\in[s_-,s_+]$ with $0<s_-<s_+<\infty$, in Gaussian sequence/white-noise settings), exact adaptation ($c_n\equiv 1$) is impossible and the optimal adaptive loss is of log-log type; in Spokoiny's normalization this appears as a factor

$$t_\varepsilon=(\log\log \varepsilon^{-2})^{1/4},$$

with $\varepsilon=n^{-1/2}$ (equivalently, a $(\log\log n)^{1/4}$-type factor in that parametrization).
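The log-log penalty originates in multiple testing: an adaptive procedure effectively scans on the order of $\log n$ candidate truncation levels, and the union bound inflates the rejection threshold by roughly $\sqrt{\log\log n}$, which translates into a $(\log\log n)^{1/4}$ loss in the separation radius. The mechanism can be sketched in a toy one-dimensional sequence model (an illustration only: thresholds use a crude Gaussian tail bound, not Spokoiny's sharp calibration, and the example signal is hypothetical):

```python
import numpy as np

def adaptive_chi2_test(y, n, alpha=0.05):
    """Multiscale chi-square test: scan dyadic truncation levels M and
    reject if any centered statistic exceeds a Bonferroni-corrected
    threshold over the ~log2(len(y)) scales."""
    N = len(y)
    scales = [2 ** j for j in range(1, int(np.log2(N)) + 1)]
    # Bonferroni over |scales| ~ log n tests: the threshold grows like
    # sqrt(2 log(log n / alpha)) -- the source of the (log log n)^{1/4}
    # loss in the separation radius.
    z = np.sqrt(2.0 * np.log(len(scales) / alpha))
    for M in scales:
        T = np.sum(y[:M] ** 2) - M / n   # centered: E_H0[T] = 0
        sd = np.sqrt(2.0 * M) / n        # approximate null std. dev. of T
        if T > z * sd:
            return True                  # reject H0 : theta = 0
    return False

# Hypothetical demo: n = 4096 observations, N = 1024 coefficients,
# alternative with Sobolev-type coefficient decay theta_k ~ k^{-3/2}.
rng = np.random.default_rng(0)
n, N = 4096, 1024
k = np.arange(1, N + 1)
theta = 0.5 * k ** (-1.5)
y_alt = theta + rng.normal(0.0, 1.0 / np.sqrt(n), size=N)
print(adaptive_chi2_test(y_alt, n))   # this alternative is easily detected
```

The point of the sketch is only the shape of the threshold `z`: a non-adaptive test at a single known scale would use a fixed constant there, while the scan forces the iterated-logarithm inflation.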

Unsolved Problem

Determine the sharp adaptive minimax rate outside those settled classical compact-range cases, especially for genuinely noncompact or otherwise broader regimes (for example $\mathcal S=(0,\infty)$, higher-dimensional/anisotropic families, or other model variations), and identify the minimal penalty $c_n(s,d,\mathcal S)$ such that a single test sequence controls type I/II errors uniformly over all $s\in\mathcal S$.
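One standard way to make "minimal penalty" precise (a common formalization, stated here as an assumption of this write-up rather than taken from a specific source): writing the adaptive separation radius as $c_n(s)\,\rho_n^*(s)$, ask for the slowest-growing $c_n$ for which some single test sequence $\psi_n$ satisfies

$$\limsup_{n\to\infty}\,\Big[\,\mathbb{P}_{f_0}(\psi_n=1)\;+\;\sup_{s\in\mathcal S}\ \sup_{\substack{f\in\mathcal W_2^s(L)\\ \|f-f_0\|_{L_2}\ge c_n(s)\,\rho_n^*(s)}} \mathbb{P}_f(\psi_n=0)\,\Big]\;\le\;\alpha+\beta,$$

while every test sequence fails this criterion once $c_n$ is replaced by a sequence of strictly smaller order.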


§ Significance & Implications

Adaptive nonparametric testing is a core question in mathematical statistics: unlike many estimation problems, testing can require a provable adaptation penalty. Classical compact-range Gaussian settings already show unavoidable log-log losses (with model-dependent parametrization), while broader regimes remain unresolved. Clarifying exactly where adaptation is fully characterized versus still open is important for goodness-of-fit, signal detection, and related inference tasks.

§ Known Partial Results

  • Spokoiny (1996): proves impossibility of full adaptation in the considered wavelet setting and derives the log-log adaptation factor $t_\varepsilon=(\ln\ln\varepsilon^{-2})^{1/4}$ (equivalently $(\log\log n)^{1/4}$ when $\varepsilon=n^{-1/2}$ in that parametrization).

  • Ingster & Suslina (2003): develop the sharp non-adaptive minimax testing theory for Gaussian models, including Sobolev-type classes.

  • Consequently, classical compact smoothness-range formulations are not a blanket open problem: the main unresolved part is the sharp adaptive frontier in broader regimes (notably noncompact smoothness ranges and related generalizations).

§ References

[1]

Adaptive hypothesis testing using wavelets

Vladimir Spokoiny (1996)

Annals of Statistics

📍 Section 2.3 (Adaptive testing), especially Theorems 2.2-2.3; the adaptation factor is stated as $t_\varepsilon=(\ln\ln\varepsilon^{-2})^{1/4}$ (not $\varepsilon^{-1}$), pp. 2481-2482.

[2]

Nonparametric Goodness-of-Fit Testing Under Gaussian Models

Yuri Ingster, Irina Suslina (2003)

Springer Series in Statistics (book)
