Online learning and modeling have attracted considerable interest due to the increasing availability of streaming data. Nonparametric models, although flexible, have seen limited use in online settings because of their data-driven nature and high computational demands. We introduce a novel online method for dynamically updating local polynomial regression estimates. Our approach decomposes kernel-type estimates into two sufficient statistics and approximates future optimal bandwidths with a dynamic candidate sequence. We establish asymptotic normality and efficiency lower bounds for online estimation, shedding light on the trade-off between accuracy and computational cost driven by the length of the candidate bandwidth sequence. The idea extends to general nonlinear optimization problems, where we propose an online smooth backfitting method for generalized additive models with local linear estimation. We investigate both statistical and algorithmic convergence and provide a framework for balancing estimation accuracy against computational cost. The dynamic candidate bandwidth method also adapts to structurally complex data, such as functional data. Simulations and real-data examples demonstrate the usefulness of the proposed method.
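To make the two ingredients named above concrete, the following is a minimal sketch of online local linear estimation at a single point: kernel-weighted sufficient statistics updated in O(1) per observation, maintained over a candidate bandwidth grid from which an estimate is selected. All names, the Epanechnikov kernel, and the rule-of-thumb selection rate c·n^{-1/5} are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def epan(u):
    """Epanechnikov kernel K(u) = 0.75 (1 - u^2)_+."""
    return 0.75 * np.maximum(1.0 - u**2, 0.0)

class OnlineLocalLinear:
    """Online local linear estimate at a fixed point x0.

    For each candidate bandwidth h_k, maintains the sufficient statistics
      S_j = sum_i K_h(X_i - x0) (X_i - x0)^j,      j = 0, 1, 2,
      T_j = sum_i K_h(X_i - x0) (X_i - x0)^j Y_i,  j = 0, 1,
    so each new observation costs O(1) per candidate and the raw data
    need not be stored.
    """

    def __init__(self, x0, bandwidths):
        self.x0 = x0
        self.h = np.asarray(bandwidths, dtype=float)  # candidate sequence
        self.S = np.zeros((len(self.h), 3))
        self.T = np.zeros((len(self.h), 2))
        self.n = 0

    def update(self, x, y):
        """Fold one streaming observation (x, y) into all candidates."""
        d = x - self.x0
        w = epan(d / self.h) / self.h  # one kernel weight per candidate
        for j in range(3):
            self.S[:, j] += w * d**j
        for j in range(2):
            self.T[:, j] += w * d**j * y
        self.n += 1

    def estimate(self, c=1.0):
        """Local linear fit using the candidate nearest c * n^(-1/5)."""
        if self.n == 0:
            return np.nan
        k = np.argmin(np.abs(self.h - c * self.n ** (-0.2)))
        S, T = self.S[k], self.T[k]
        denom = S[0] * S[2] - S[1] ** 2
        if denom <= 0:
            return np.nan
        # Intercept of the weighted least-squares line at x0.
        return (S[2] * T[0] - S[1] * T[1]) / denom
```

In this simplified sketch the candidate grid is fixed in advance, whereas the proposed method uses a dynamic candidate sequence to track the shrinking optimal bandwidth; the grid length governs the accuracy-computation trade-off discussed above.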