Enhancing Kernel Ridge Regression Models with Compactly Supported Wendland Functions

Document Type: Research Article

Authors

Department of Mathematics and Statistics, Behbahan Khatam Alanbia University of Technology, Khouzestan, Iran

Abstract

Radial Basis Functions (RBFs) have gained significant attention in various machine learning applications, including regression modeling, due to their ability to approximate complex, nonlinear relationships. RBFs offer a flexible approach to capturing intricate dependencies between input features and the target variable, making them particularly useful in high-dimensional and nonparametric settings. This paper investigates the use of a specific class of compactly supported RBFs, known as Wendland functions, within the framework of kernel ridge regression (KRR). We discuss their theoretical advantages, such as sparsity enforcement and computational efficiency, as well as practical challenges, including parameter selection and scalability. A comprehensive overview of RBFs is provided, along with their mathematical formulation and a comparison of different RBF kernels in terms of smoothness and locality. We detail the integration of Wendland functions into KRR models, emphasizing their suitability for problems requiring robustness and interpretability. Through extensive simulation studies, the performance of the proposed approach is evaluated against conventional RBF kernels and other widely used regression techniques. Our results demonstrate that Wendland-based KRR achieves competitive accuracy while offering improved stability in the presence of noise and outliers. Furthermore, real-world case studies illustrate the effectiveness of Wendland functions in handling datasets with high collinearity, where traditional kernels often struggle. The practical implications of our findings are discussed, along with guidelines for implementation and potential extensions to large-scale or sparse-data scenarios. This work contributes to the growing body of research on interpretable and efficient kernel methods, providing insights for both theoretical and applied machine learning practitioners.
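
To make the construction concrete, here is a minimal sketch of kernel ridge regression with a compactly supported Wendland kernel. This is an illustration, not the authors' implementation: it assumes the classical C^2 Wendland function phi(r) = (1 - r)^4_+ (4r + 1), which is positive definite on R^d for d <= 3, and the names wendland_c2, krr_fit, krr_predict, the support radius rho, and the regularization weight lam are hypothetical choices made for this example.

```python
# Minimal sketch of Wendland-based kernel ridge regression (illustrative only;
# `rho` and `lam` are hypothetical parameter names, not taken from the paper).
import numpy as np
from scipy.spatial.distance import cdist

def wendland_c2(X, Z, rho=1.0):
    """C^2 Wendland kernel phi(r) = (1 - r)^4_+ (4r + 1) with support radius rho."""
    r = cdist(X, Z) / rho                        # scaled pairwise Euclidean distances
    return np.where(r < 1.0, (1.0 - r) ** 4 * (4.0 * r + 1.0), 0.0)

def krr_fit(X, y, lam=1e-2, rho=0.3):
    """Solve the KRR linear system (K + lam * I) alpha = y.

    Because phi vanishes for r >= rho, most entries of K are exactly zero,
    which is the source of the sparsity discussed in the abstract.
    """
    K = wendland_c2(X, X, rho)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def krr_predict(X_train, alpha, X_new, rho=0.3):
    """Evaluate f(x) = sum_i alpha_i * phi(||x - x_i|| / rho) at the new points."""
    return wendland_c2(X_new, X_train, rho) @ alpha

# Toy usage: recover a noisy sine curve.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(100, 1))
y = np.sin(2.0 * np.pi * X[:, 0]) + 0.1 * rng.standard_normal(100)
alpha = krr_fit(X, y)
y_hat = krr_predict(X, alpha, X)
```

This dense version only zeroes entries of K, so it does not yet exploit the compact support computationally; on larger datasets one would assemble K as a sparse matrix from radius queries (e.g., via a k-d tree), which is where the efficiency benefits mentioned in the abstract would be realized.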

Volume 9, Issue 1
May 2024
Pages 34-47
  • Receive Date: 30 November 2024
  • Revise Date: 27 April 2025
  • Accept Date: 12 May 2025
  • Publish Date: 17 May 2025