Unsolved

Optimal adaptation beyond compact manifolds

Sourced from the work of Tao Tang, Nan Wu, Xiuyuan Cheng, David Dunson

§ Problem Statement

Setup

Let $D\ge 1$ and let $S\subset \mathbb R^D$ be a compact predictor domain. Assume the predictors $X_1,\dots,X_n$ are i.i.d. from a probability measure $P_X$ supported on $S$, and that the responses satisfy

$$Y_i=f_0(X_i)+\varepsilon_i,\qquad \varepsilon_i\stackrel{\text{i.i.d.}}{\sim}N(0,\sigma^2),\quad \sigma^2\in(0,\infty).$$

Place a squared-exponential GP prior restricted to $S$,

$$f\mid A\sim \mathrm{GP}\!\left(0,\ K_A(x,x')\right),\qquad K_A(x,x')=\exp\!\big(-A^2\|x-x'\|^2\big),$$

with a data-driven or hierarchical prior on the bandwidth $A$ that does not use the unknown pair $(d,\beta)$.

The cited work settles the compact-manifold case: for intrinsically $\beta$-smooth targets on compact smooth manifolds, RKHS approximation bounds are established and adaptive posterior contraction rates are derived at the minimax exponent, up to logarithmic factors.

Unsolved Problem

Obtain analogous RKHS approximation conditions on genuinely non-manifold supports. Assume only low-dimensional metric complexity: for some $d\in(0,D]$ and $c_1,c_2,\delta_0>0$,

$$c_1\,\delta^{-d}\le N(S,\delta)\le c_2\,\delta^{-d}\quad\text{for all }0<\delta<\delta_0,$$

where $N(S,\delta)$ is the $\delta$-covering number of $S$. For $f_0\in\mathcal F_\beta(S)$ (an intrinsic $\beta$-smooth class on $S$), identify geometric assumptions beyond manifold structure under which one can prove, for large $A$, the existence of $h_A\in\mathbb H_A$ with

$$\|h_A-f_0\|_{L^2(P_X)}\lesssim A^{-\beta}, \qquad \|h_A\|_{\mathbb H_A}^2\lesssim A^{d},$$

allowing at most controlled logarithmic losses where necessary, and thereby obtain the adaptive contraction rate

$$\varepsilon_n\asymp n^{-\beta/(2\beta+d)}$$

up to log factors, with the prior on $A$ independent of $(d,\beta)$.

The general non-manifold characterization (necessary/sufficient geometric conditions for such RKHS bounds and rates) remains open.


§ Significance & Implications

The abstract of Tang et al. (2024) indicates optimality is obtained on compact manifolds using a novel RKHS approximation analysis, suggesting geometry is crucial for sharp rates. Extending optimal guarantees to broader intrinsic structures would substantially widen the theory's applicability to realistic data supports.

§ Known Partial Results

  • Tang et al. (2024): The paper proves RKHS approximation to intrinsically defined Hölder functions of any smoothness order on compact manifolds and derives adaptive contraction rates there at the optimal exponent, up to logarithmic factors. For more general low-dimensional non-manifold structures, comparable RKHS approximation conditions are not yet established.

§ References

[1]

Adaptive Bayesian regression on data with low intrinsic dimensionality

Tao Tang, Nan Wu, Xiuyuan Cheng, David Dunson (2024)

arXiv preprint; Annals of Statistics (to appear)

📍 arXiv v3 (2024), Section 6 (Discussion), first paragraph beginning "It would be interesting to develop RKHS approximation analysis on a more general low-dimensional domain X..." (p. 14).

Primary source for this problem. Author order follows the cited arXiv/Annals listing; year is the arXiv v3 year (2024), while final journal publication details are pending.
