IGPM Report 267, December 2006
TITLE Universal Piecewise Polynomial Estimators for Machine Learning
AUTHORS Peter Binev, Albert Cohen, Wolfgang Dahmen, Ronald DeVore
ABSTRACT We review and expand on recent developments concerning the construction and analysis of piecewise polynomial estimators for the regression problem in Mathematical Learning Theory. The discussion centers on two issues. The first is computational efficiency, including possible online capability. The second is universality, by which we mean the capability of the estimator to give rise to optimal convergence rates for a possibly wide range of prior classes without using any a priori knowledge of the membership of the regression function in any of these classes. More precisely, the main objects of interest are estimators for which the probability of exceeding an optimal rate tends to zero as the number m of observed data increases. We focus on nonlinear methods built on piecewise polynomial approximation on adaptively refined partitions. We describe a class of schemes that are inspired by thresholding concepts for wavelet expansions. We point out obstacles to treating piecewise polynomials of degree higher than one as compared with piecewise constant estimators, and discuss several possible remedies.
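To illustrate the kind of estimator the abstract describes, the following is a minimal sketch of a piecewise constant regression estimator on an adaptively refined dyadic partition of [0, 1]. The refinement rule used here (split a cell when doing so decreases the empirical squared error by more than a threshold) is a simplified stand-in in the spirit of wavelet thresholding; the precise thresholding rule, and the higher-degree polynomial case, are what the paper actually analyzes.

```python
import numpy as np

def adaptive_pc_estimator(x, y, threshold, max_depth=10):
    """Piecewise constant least-squares fit on an adaptively refined
    dyadic partition of [0, 1].  A cell is split when the split reduces
    the local empirical squared error by more than `threshold`
    (a hypothetical refinement criterion for illustration only)."""
    leaves = []  # list of (a, b, local_mean) cells of the final partition

    def refine(a, b, idx, depth):
        if idx.size == 0:
            leaves.append((a, b, 0.0))
            return
        mean = y[idx].mean()
        err = np.sum((y[idx] - mean) ** 2)
        mid = 0.5 * (a + b)
        left = idx[x[idx] < mid]
        right = idx[x[idx] >= mid]
        # empirical squared error after splitting the cell in two
        err_split = sum(
            np.sum((y[c] - y[c].mean()) ** 2) for c in (left, right) if c.size
        )
        if depth < max_depth and err - err_split > threshold:
            refine(a, mid, left, depth + 1)
            refine(mid, b, right, depth + 1)
        else:
            leaves.append((a, b, mean))

    refine(0.0, 1.0, np.arange(len(x)), 0)

    def estimator(t):
        for a, b, mean in leaves:
            if a <= t < b or (t == 1.0 and b == 1.0):
                return mean
        return 0.0

    return estimator
```

With a threshold growing like log m, cells containing only noise are left unsplit while cells straddling a jump of the regression function are refined, which is the adaptivity the abstract refers to.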
KEYWORDS Regression, universal piecewise polynomial estimators, complexity regularization, optimal convergence rates in probability, adaptive partitioning, thresholding
PUBLICATION Curve and Surface Design: Avignon 2006 / ed. by Patrick Chenin ... (Modern Methods in Mathematics), pp. 48-77 (2007)