Research Article
Year 2024, Volume: 42, Issue: 2, 578-589, 30.04.2024


Comparative performance analysis of epsilon-insensitive and pruning-based algorithms for sparse least squares support vector regression


Abstract

Least Squares Support Vector Regression (LSSVR), the least squares version of Support Vector Regression (SVR), is defined by a regularized squared loss without epsilon-insensitivity. In the dual space, LSSVR is formulated as a linear equality constrained quadratic minimization, which can be transformed into the solution of a system of linear algebraic equations. Because this system involves half as many Lagrange multipliers as classical SVR, LSSVR requires far less computation time. Despite this computationally attractive feature, LSSVR lacks the sparsity that epsilon-insensitivity gives classical SVR: every training input is treated as a support vector, yielding extremely poor generalization performance. To overcome these drawbacks, this paper derives an epsilon-insensitive LSSVR, with epsilon-insensitivity built into the quadratic loss, in which sparsity is controlled directly by the epsilon parameter. Since the quadratic loss is sensitive to outliers, a weighted version (epsilon-insensitive WLSSVR) is also developed. Finally, the performances of epsilon-insensitive LSSVR and epsilon-insensitive WLSSVR are compared quantitatively and in detail with pruning-based LSSVR and weighted pruning-based LSSVR, which are commonly used in the literature. Experimental results on simulated data and eight different real-life data sets show that epsilon-insensitive LSSVR and epsilon-insensitive WLSSVR are superior in terms of computation time, generalization ability, and sparsity.
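
For orientation, the dual problem the abstract refers to is the standard LSSVR formulation of Suykens and Vandewalle; the following is a worked sketch in standard notation, not equations reproduced from this paper:

\min_{w,\,b,\,e}\ \frac{1}{2}\,w^{\top}w + \frac{\gamma}{2}\sum_{i=1}^{n} e_i^{2}
\quad \text{s.t.} \quad y_i = w^{\top}\varphi(x_i) + b + e_i, \qquad i = 1,\dots,n,

whose optimality conditions collapse into a single linear system in the bias b and the n Lagrange multipliers \alpha:

\begin{bmatrix} 0 & \mathbf{1}^{\top} \\ \mathbf{1} & \Omega + \gamma^{-1} I \end{bmatrix}
\begin{bmatrix} b \\ \alpha \end{bmatrix}
=
\begin{bmatrix} 0 \\ y \end{bmatrix},
\qquad \Omega_{ij} = K(x_i, x_j).

The epsilon-insensitive quadratic loss is commonly written as L_{\varepsilon}(e) = \bigl(\max(0,\,|e| - \varepsilon)\bigr)^{2}; residuals with |e_i| \le \varepsilon then receive zero multipliers, which is the standard mechanism by which epsilon controls sparsity (the paper's own derivation is not reproduced here).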
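
For concreteness, below is a minimal NumPy sketch of the dense LSSVR solve described above, together with a pruning loop in the spirit of the pruning-based baselines the abstract compares against. All function and parameter names are illustrative, and the sketch does not implement the paper's epsilon-insensitive derivation.

import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # K[i, j] = exp(-||a_i - b_j||^2 / (2 * sigma^2))
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma**2))

def lssvr_fit(X, y, gamma=10.0, sigma=1.0):
    # Solve the LSSVR KKT system [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y].
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[1:], sol[0]  # alpha (one multiplier per sample), bias b

def lssvr_predict(X_sv, alpha, b, X_new, sigma=1.0):
    return rbf_kernel(X_new, X_sv, sigma) @ alpha + b

def pruned_lssvr(X, y, keep_ratio=0.3, gamma=10.0, sigma=1.0):
    # Pruning-style sparsification: repeatedly drop the samples with the
    # smallest |alpha| and retrain on the survivors.
    idx = np.arange(len(y))
    target = max(2, int(keep_ratio * len(y)))
    alpha, b = lssvr_fit(X, y, gamma, sigma)
    while len(idx) > target:
        n_next = max(target, int(0.8 * len(idx)))  # drop ~20% per round
        keep = np.sort(np.argsort(np.abs(alpha))[-n_next:])
        idx = idx[keep]
        alpha, b = lssvr_fit(X[idx], y[idx], gamma, sigma)
    return idx, alpha, b

# Toy usage: noisy sinc data, keeping ~30% of the points as support vectors.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(200)
idx, alpha, b = pruned_lssvr(X, y, keep_ratio=0.3)
y_hat = lssvr_predict(X[idx], alpha, b, X)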


Details

Primary Language: English
Subjects: Clinical Chemistry
Section: Research Articles
Authors

Ömer Karal (ORCID: 0000-0001-8742-8189)

Publication Date: 30 April 2024
Submission Date: 6 May 2022
Published in Issue: Year 2024, Volume: 42, Issue: 2

Cite

Vancouver: Karal Ö. Comparative performance analysis of epsilon-insensitive and pruning-based algorithms for sparse least squares support vector regression. SIGMA. 2024;42(2):578-89.
