The paper "Balancing principle in supervised learning for a general regularization scheme" by Shuai Lu and his collaborators Peter Mathé and Sergei V. Pereverzev will be published in the journal Appl. Comput. Harmon. Anal. in 2020.
They discuss the problem of parameter choice in learning algorithms generated by a general regularization scheme. Such a scheme covers well-known algorithms such as regularized least squares and gradient descent learning. It is known that, in contrast to classical deterministic regularization methods, the performance of regularized learning algorithms is influenced not only by the smoothness of the target function but also by the capacity of the space in which regularization is performed. In the infinite-dimensional case, the latter is usually measured in terms of the effective dimension. In the context of supervised learning, both the smoothness and the effective dimension are intrinsically unknown a priori. The authors are therefore interested in an a posteriori regularization parameter choice, and they propose a new form of the balancing principle. An advantage of this strategy over known rules such as cross-validation-based adaptation is that it does not require any data splitting and allows all available labeled data to be used in the construction of the regularized approximants. They provide an analysis of the proposed rule and demonstrate its advantage in simulations.
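To give a feel for how such an a posteriori rule can work, the following is a minimal sketch of a generic Lepskii-type balancing principle for kernel ridge regression (one instance of regularized least squares). It is not the paper's exact rule: the Gaussian kernel, the noise proxy built from the effective dimension, and the constant `c` are all illustrative assumptions.

```python
import numpy as np

def balancing_choice(X, y, lambdas, sigma=0.1, c=4.0):
    """Lepskii-type balancing sketch for kernel ridge regression.

    Picks the largest regularization parameter whose estimate stays
    within a noise-level tube of every estimate at smaller parameters.
    Constants and the noise proxy are illustrative, not the paper's rule.
    """
    n = len(y)
    # Gaussian kernel matrix on 1-D inputs (bandwidth 1, arbitrary choice)
    K = np.exp(-0.5 * (X[:, None] - X[None, :]) ** 2)
    lambdas = np.sort(np.asarray(lambdas))[::-1]  # largest -> smallest
    # regularized least-squares estimates evaluated at the training points
    fits = [K @ np.linalg.solve(K + n * lam * np.eye(n), y)
            for lam in lambdas]

    def noise(lam):
        # crude variance proxy: sigma * sqrt(effective dimension / n),
        # with effective dimension trace(K (K + n*lam*I)^{-1})
        d_eff = np.trace(np.linalg.solve(K + n * lam * np.eye(n), K))
        return sigma * np.sqrt(d_eff / n)

    chosen = lambdas[-1]  # fall back to the smallest lambda
    for i, lam in enumerate(lambdas):
        ok = all(np.linalg.norm(fits[i] - fits[j]) / np.sqrt(n)
                 <= c * noise(lambdas[j])
                 for j in range(i + 1, len(lambdas)))
        if ok:
            chosen = lam
            break
    return chosen
```

Note that the rule above uses every labeled sample both to build the candidate estimators and to compare them, which is the practical advantage over cross-validation that the abstract highlights: no data is held out.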