A Paper Was Accepted by JMLR
Graduate student Fanghui Liu's paper "Generalization Properties of hyper-RKHS and its Applications" (by Fanghui Liu, Lei Shi, Xiaolin Huang, Jie Yang, Johan A.K. Suykens) was accepted by the Journal of Machine Learning Research (JMLR), one of the top journals in machine learning.
This paper generalizes regularized regression to a hyper-reproducing kernel Hilbert space (hyper-RKHS), illustrates its utility for kernel learning and out-of-sample extension, and proves asymptotic convergence results for the introduced learning models from an approximation-theoretic viewpoint. Algorithmically, it considers two regularized regression models with bivariate forms in this space, namely kernel ridge regression (KRR) and support vector regression (SVR) endowed with hyper-RKHS, and further combines divide-and-conquer with Nyström approximation for scalability to large samples. The framework is general: the underlying kernel is learned from a broad class and need not be positive definite, which adapts to various requirements in kernel learning. Theoretically, it studies the convergence behavior of regularized regression algorithms in hyper-RKHS and derives their learning rates, which goes beyond the classical analysis on RKHS due to the non-trivial dependence among pairwise samples and the characterization of hyper-RKHS. Experimentally, results on several benchmarks suggest that the employed framework is able to learn a general kernel function from an arbitrary similarity matrix, and thus achieves satisfactory performance on classification tasks.
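The scalability recipe mentioned above, Nyström approximation for kernel ridge regression, can be illustrated with a short sketch. This is not the paper's code and does not learn the kernel in a hyper-RKHS; it is a minimal standalone example with a fixed RBF kernel and hypothetical function names, showing how an n×n kernel problem is reduced to an m-dimensional one via m sampled landmarks:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between rows of X and rows of Y
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nystrom_krr_fit(X, y, m=50, lam=1e-2, gamma=1.0, rng=None):
    """Fit KRR with a rank-m Nystrom approximation.

    Instead of solving with the full n x n kernel matrix, sample m
    landmark points and solve the m x m regularized system
    (K_nm^T K_nm + lam * K_mm) a = K_nm^T y.
    """
    rng = np.random.default_rng(rng)
    idx = rng.choice(len(X), size=m, replace=False)
    Z = X[idx]                           # landmark points
    K_nm = rbf_kernel(X, Z, gamma)       # n x m cross-kernel
    K_mm = rbf_kernel(Z, Z, gamma)       # m x m landmark kernel
    A = K_nm.T @ K_nm + lam * K_mm
    # Small jitter on the diagonal for numerical stability
    a = np.linalg.solve(A + 1e-10 * np.eye(m), K_nm.T @ y)
    return Z, a

def nystrom_krr_predict(X_new, Z, a, gamma=1.0):
    # Predictions only require kernels against the m landmarks
    return rbf_kernel(X_new, Z, gamma) @ a

# Toy usage: regress a smooth function of the first coordinate
X = np.random.default_rng(0).normal(size=(200, 2))
y = np.sin(X[:, 0]) + 0.1 * np.random.default_rng(1).normal(size=200)
Z, a = nystrom_krr_fit(X, y, m=40, rng=0)
preds = nystrom_krr_predict(X, Z, a)
```

The cost drops from O(n³) for exact KRR to roughly O(nm²), which is what makes the divide-and-conquer plus Nyström combination attractive for large samples.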
(Revised: 2021-06-16 17:48)
Copyright ©2018 Institute of Image Processing and Pattern Recognition, Shanghai Jiao Tong University.
Address: 2-427 SEIEE Building, 800 Dong Chuan Rd, Shanghai, 200240, P.R.China