Is Margin Sufficient for Non-Interactive Private Distributed Learning?
Posed by Amit Daniely and Vitaly Feldman (2019)
§ Problem Statement
Setup
Let $X$ be a domain and let $H$ be a class of Boolean functions $h\colon X \to \{-1,+1\}$. For $d \in \mathbb{N}$, let $\mathbb{B}_2^d = \{v \in \mathbb{R}^d : \|v\|_2 \le 1\}$ denote the Euclidean unit ball. The margin complexity of $H$, denoted $\mathrm{marg}(H)$, is the minimum $M$ such that there exist a dimension $d$ and an embedding $\phi\colon X \to \mathbb{B}_2^d$ with the property that for every $h \in H$ there exists a vector $w_h \in \mathbb{B}_2^d$ satisfying $h(x)\,\langle w_h, \phi(x)\rangle \ge 1/M$ for all $x \in X$.
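As a concrete sanity check of the definition, the sketch below (an illustration, not part of the original note) verifies a margin witness for the $d$ dictator functions $h_i(x) = x_i$ on the cube $\{-1,+1\}^d$: the scaled identity embedding $\phi(x) = x/\sqrt{d}$ with witnesses $w_i = e_i$ achieves margin $1/\sqrt{d}$, so $\mathrm{marg}(\text{dictators}) \le \sqrt{d}$.

```python
import itertools

import numpy as np

d = 4
# Domain: the Boolean cube {-1,+1}^d, mapped into the unit ball by phi(x) = x / sqrt(d).
points = np.array(list(itertools.product([-1, 1], repeat=d)), dtype=float)
phi = points / np.sqrt(d)                 # ||phi(x)||_2 = 1 for every x

# Class: the d dictator functions h_i(x) = x_i, with candidate witnesses w_i = e_i.
margins = []
for i in range(d):
    w = np.zeros(d)
    w[i] = 1.0                            # unit vector e_i
    labels = points[:, i]                 # h_i(x) = x_i
    margins.append(np.min(labels * (phi @ w)))

# Every point is classified with margin exactly 1/sqrt(d),
# certifying marg(dictators) <= sqrt(d) via this embedding.
print(min(margins), 1 / np.sqrt(d))       # prints 0.5 0.5 for d = 4
```

The same check applies to any candidate pair $(\phi, \{w_h\})$: the smallest signed inner product over all $(x, h)$ is the achieved margin, and its reciprocal upper-bounds $\mathrm{marg}(H)$.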
(Non-interactive local differential privacy.) Let $\varepsilon > 0$. An $\varepsilon$-local randomizer is a randomized map $R\colon Z \to Y$ such that for all $z, z' \in Z$ and all measurable events $E \subseteq Y$, $\Pr[R(z) \in E] \le e^{\varepsilon}\,\Pr[R(z') \in E]$.
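The canonical example of an $\varepsilon$-local randomizer is randomized response on a single bit, which the following sketch (illustrative, not from the original note) implements and checks: the worst-case likelihood ratio between any two inputs is exactly $e^{\varepsilon}$, so the definition holds with equality.

```python
import math
import random


def randomized_response(bit: int, eps: float) -> int:
    """eps-local randomizer for a bit in {0, 1}: report it truthfully with
    probability e^eps / (e^eps + 1), otherwise flip it."""
    p_truth = math.exp(eps) / (math.exp(eps) + 1)
    return bit if random.random() < p_truth else 1 - bit


# The definition requires Pr[R(z) = y] <= e^eps * Pr[R(z') = y] for all
# inputs z, z' and outputs y. For randomized response the extreme ratio is
# Pr[R(1) = 1] / Pr[R(0) = 1] = p / (1 - p) = e^eps, so the bound is tight.
eps = 1.0
p = math.exp(eps) / (math.exp(eps) + 1)
ratio = p / (1 - p)
print(ratio, math.exp(eps))
```

Any fixed post-processing of the randomizer's output preserves the same $\varepsilon$ guarantee, which is why the server in the protocol below can aggregate messages freely.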
A non-interactive $\varepsilon$-LDP (distribution-independent) PAC learner for $H$ works as follows: given $n$ i.i.d. labeled examples $(x_1, y_1), \dots, (x_n, y_n)$ where $x_i \sim D$ for an arbitrary distribution $D$ over $X$ and $y_i = h^*(x_i)$ for some unknown target $h^* \in H$, each example is accessed only once by applying a fixed (non-adaptive) $\varepsilon$-local randomizer $R_i$ to produce messages $m_i = R_i(x_i, y_i)$; the learner then outputs a hypothesis $\hat{h}\colon X \to \{-1,+1\}$.
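To make the one-round shape of the model concrete, here is a hypothetical toy protocol (my illustration, not the construction from the note) for learning dictator functions: each user touches their example exactly once, applying a fixed randomizer to a single correlation bit, and the server sees only the privatized messages before outputting a hypothesis.

```python
import math
import random


def randomize_bit(b: int, eps: float) -> int:
    """Randomized response on a bit in {-1, +1}; an eps-local randomizer."""
    p_truth = math.exp(eps) / (math.exp(eps) + 1)
    return b if random.random() < p_truth else -b


def one_round_protocol(samples, d, eps):
    """Non-interactive shape: each example is accessed once by a fixed
    randomizer; the server aggregates messages and outputs a hypothesis."""
    p = math.exp(eps) / (math.exp(eps) + 1)
    counts, hits = [0.0] * d, [0] * d
    for x, y in samples:
        j = random.randrange(d)              # user reports one random coordinate
        m = randomize_bit(x[j] * y, eps)     # privatized correlation bit
        counts[j] += m
        hits[j] += 1
    # Debias: E[m] = (2p - 1) * E[x_j * y], so divide the empirical mean by 2p - 1.
    est = [counts[j] / max(hits[j], 1) / (2 * p - 1) for j in range(d)]
    return max(range(d), key=lambda j: est[j])   # guess the dictator coordinate


# Toy run: target is the dictator h(x) = x[0] over the uniform cube {-1,+1}^3.
random.seed(0)
d, n, eps = 3, 20000, 1.0
samples = []
for _ in range(n):
    x = [random.choice([-1, 1]) for _ in range(d)]
    samples.append((x, x[0]))
best = one_round_protocol(samples, d, eps)
print(best)
```

The message $(j, m)$ depends on the example only through the single randomized-response bit, so each user's report is $\varepsilon$-LDP; the open problem asks whether such one-round protocols can succeed for every class with small margin complexity.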
Unsolved Problem
Problem 1. Does there exist a polynomial $p$ such that for every class $H$, every accuracy parameter $\alpha > 0$, and every distribution $D$ over $X$, there is a non-interactive $\varepsilon$-LDP PAC learner that, using
$n \le p(\mathrm{marg}(H), 1/\alpha, 1/\varepsilon)$
examples, outputs (with probability at least $2/3$ over its randomness and the sample) a hypothesis $\hat{h}$ satisfying $\Pr_{x \sim D}[\hat{h}(x) \ne h^*(x)] \le \alpha$ for every target $h^* \in H$?
§ Discussion
§ Significance & Implications
This problem asks whether a single geometric complexity parameter, $\mathrm{marg}(H)$, suffices to guarantee efficient (polynomial-sample) distribution-independent PAC learning in the strict non-interactive $\varepsilon$-LDP model, where each user sends only one privatized message and the server cannot adapt queries across users. A positive answer would give a broadly applicable, model-specific characterization of which Boolean classes admit one-round locally private learning with sample complexity controlled by $\mathrm{marg}(H)$, $1/\alpha$, and $1/\varepsilon$. A negative answer would demonstrate that polynomial margin complexity alone does not capture the information constraints of non-interactive local privacy, implying that either additional structure/complexity measures or interaction is inherently necessary for some classes even when $\mathrm{marg}(H)$ is small.
§ Known Partial Results
Daniely and Feldman (2019): Daniely and Feldman (COLT 2019) formulate the question of whether polynomial margin complexity implies efficient non-interactive locally differentially private PAC learning (distribution-independently) for Boolean concept classes.
Daniely and Feldman (2019): The open problem isolates margin complexity as the candidate controlling parameter and fixes the learning model to the strongest common non-interactivity requirement: one privatized message per example, with no adaptivity across examples.
§ References
Amit Daniely and Vitaly Feldman. Open Problem: Is Margin Sufficient for Non-Interactive Private Distributed Learning? Conference on Learning Theory (COLT), PMLR 99, 2019. Open-problem note in the COLT proceedings.