An examination of least squares support vector machines (LS-SVMs), which are reformulations of standard SVMs. LS-SVMs are closely related to regularization networks and Gaussian processes but additionally emphasize and exploit primal-dual interpretations from optimization theory. The authors explain the natural links between LS-SVM classifiers and kernel Fisher discriminant analysis. Bayesian inference of LS-SVM models is discussed, together with methods for imposing sparseness and employing robust statistics. The framework is further extended towards unsupervised learning by treating principal component analysis (PCA), and its kernel version, as a one-class modelling problem. This leads to new primal-dual support vector machine formulations for kernel PCA and kernel canonical correlation analysis (CCA). LS-SVM formulations are also given for recurrent networks and control. Because support vector machines can pose heavy computational challenges for large data sets, a fixed-size LS-SVM method is proposed in which the estimation is done in the primal space, using Nyström sampling with active selection of support vectors. The methods are illustrated with several examples.
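To make the reformulation concrete: in the LS-SVM classifier the inequality constraints of the standard SVM are replaced by equality constraints with a squared-error term, so training amounts to solving a single linear system in the dual variables. The sketch below is a minimal NumPy illustration of that idea, not code from the book; the RBF kernel and the values of the regularization constant `gamma` and kernel width `sigma` are illustrative assumptions.

```python
# Minimal LS-SVM classifier sketch (illustrative, not the book's code).
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    """Gaussian RBF kernel matrix K[i, j] = exp(-||x_i - x_j||^2 / sigma^2)."""
    d2 = (np.sum(X1**2, axis=1)[:, None]
          + np.sum(X2**2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return np.exp(-d2 / sigma**2)

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """Solve the LS-SVM dual system
       [[0, y^T], [y, Omega + I/gamma]] [b; alpha] = [0; 1],
       with Omega[i, j] = y_i * y_j * K(x_i, x_j)."""
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]          # alpha, b

def lssvm_predict(X_train, y_train, alpha, b, X_test, sigma=1.0):
    """Classifier y(x) = sign(sum_i alpha_i * y_i * K(x, x_i) + b)."""
    K = rbf_kernel(X_test, X_train, sigma)
    return np.sign(K @ (alpha * y_train) + b)

# Hypothetical toy usage: two Gaussian blobs labelled -1 and +1.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.5, (20, 2)), rng.normal(+1, 0.5, (20, 2))])
y = np.concatenate([-np.ones(20), np.ones(20)])
alpha, b = lssvm_train(X, y, gamma=10.0, sigma=1.0)
print(lssvm_predict(X, y, alpha, b, X))
```

Note that every training point ends up with a nonzero `alpha`, which is why the book discusses separate techniques for imposing sparseness and, for large data sets, the fixed-size LS-SVM approach based on Nyström sampling.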
Published: 2024-11-09
The author taught the course in person... the book explains things in considerable detail.