Learning via variably scaled kernels.

C. Campi, F. Marchetti, E. Perracchione
Published in: Advances in Computational Mathematics (2021)
We investigate the use of the so-called variably scaled kernels (VSKs) for learning tasks, with a particular focus on support vector machine (SVM) classifiers and kernel regression networks (KRNs). Under appropriate assumptions, the VSKs used to train the models turn out to be more expressive and more stable than the standard kernels. Numerical experiments and applications to breast cancer and coronavirus disease 2019 (COVID-19) data support our claims. The practical implementation of the VSK setting requires selecting a suitable scaling function; to this end, we propose different choices, including, for SVMs, a probabilistic approach based on the naive Bayes (NB) classifier. For the classification task, we also show numerically that the VSKs offer an alternative to the sometimes computationally demanding feature extraction procedures.
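As a rough illustration of the idea described in the abstract, the sketch below applies the VSK construction in its usual form: a standard kernel is evaluated on inputs augmented with a scaling function, here taken to be the NB posterior probability of the positive class. This is only a minimal sketch under assumptions; the specific scaling function, preprocessing, and dataset (scikit-learn's breast cancer data is used as a stand-in) are illustrative choices, not the authors' exact setup.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Illustrative dataset (the paper reports experiments on breast cancer data).
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

scaler = StandardScaler().fit(X_tr)
X_tr, X_te = scaler.transform(X_tr), scaler.transform(X_te)

# Scaling function psi: here, the NB posterior probability of class 1
# (one possible probabilistic choice; the paper's exact choice may differ).
nb = GaussianNB().fit(X_tr, y_tr)

def psi(X):
    return nb.predict_proba(X)[:, [1]]

# VSK construction: evaluate a standard kernel on the augmented inputs (x, psi(x)),
# i.e., train the SVM on the original features plus one extra coordinate.
def augment(X):
    return np.hstack([X, psi(X)])

svm_std = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)
svm_vsk = SVC(kernel="rbf", gamma="scale").fit(augment(X_tr), y_tr)

print("standard RBF SVM accuracy:", svm_std.score(X_te, y_te))
print("VSK RBF SVM accuracy:     ", svm_vsk.score(augment(X_te), y_te))
```

The augmented coordinate acts as one additional, data-driven feature, which is why the abstract frames VSKs as an alternative to more expensive feature extraction pipelines.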