We consider the problem of testing whether an unknown low-degree polynomial $p$ over $\mathbb{R}^n$ is sparse versus far from sparse, given access to noisy evaluations of $p$ at \emph{randomly chosen} points. This is a property-testing analogue of classical problems on learning sparse low-degree polynomials with noise, extending the work of Chen, De, and Servedio (2020) from noisy \emph{linear} functions to general low-degree polynomials.
Our main result gives a \emph{precise characterization} of when sparsity testing for low-degree polynomials admits constant sample complexity independent of the dimension, together with a matching constant-sample algorithm in that regime. For any mean-zero, variance-one, finitely supported distribution $\boldsymbol{X}$ over the reals, any degree $d$, and any sparsity parameters $s \leq T$, we define a computable function $\mathrm{MSG}_{\boldsymbol{X},d}(\cdot)$, and:
Our techniques rely on generalizing the results of Dinur et al. (2007) on the Fourier tails of bounded functions over $\{0,1\}^n$ to a broad class of finitely supported distributions; this generalization may be of independent interest.