Deep multiple kernel least squares support vector regression using PSO
Journal of the Korean Data & Information Science Society 2019;30:671-9
Published online May 31, 2019;  https://doi.org/10.7465/jkdi.2019.30.3.671
© 2019 Korean Data and Information Science Society.

Jooyong Shim1 · Insuk Sohn2 · Kyungha Seok3

1,3Department of Statistics, Inje University; 2Statistics and Data Center, Samsung Medical Center
Correspondence to: Professor, Institute of Statistical Information, Department of Statistics, Inje University, Gyungnam 50834, Korea. statseok@inje.ac.kr
This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Science and Technology (NRF-2018R1D1A1B07042349, NRF-2017R1D1A1B03029792 and NRF-2017R1E1A1A01075541).
Received March 11, 2019; Revised April 8, 2019; Accepted April 15, 2019.
This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/3.0) which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.
Abstract
In this paper, we propose a deep multiple kernel least squares support vector regression (DMK-LSSVR) using particle swarm optimization (PSO). Unlike a multilayer neural network (MNN), each LSSVR in the DMK-LSSVR is trained to minimize its own penalized objective function, so the learning of DMK-LSSVR is fundamentally different from that of an MNN, which minimizes only the final objective function. In DMK-LSSVR, a grid search based on the generalized cross validation (GCV) function is used to find the optimal hyperparameters of each LSSVR, which has the disadvantage of requiring a large amount of computation, and the back propagation algorithm is used to estimate the weights and biases, which has the weakness of becoming trapped in local minima. In the proposed DMK-LSSVR with PSO (DMK-LSSVR-PSO), the hyperparameters of the LSSVRs and the weights and biases are optimized by PSO in a single process. With PSO, only lower and upper bounds on the hyperparameters are needed, and the weights and biases are estimated as global rather than local minimizers. Numerical studies show that DMK-LSSVR-PSO has advantages over DMK-LSSVR and other machine learning models that rely on the back propagation algorithm and grid search for regression problems.
Keywords : Back propagation algorithm, deep neural network, generalized cross validation function, grid search, least squares support vector regression, multilayer neural network, particle swarm optimization.
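
As an illustration of the kind of search described in the abstract, the following is a minimal sketch, not the authors' implementation, of PSO tuning the regularization and kernel-width hyperparameters of a single RBF-kernel LSSVR on synthetic data, with validation mean squared error as the fitness. The paper's DMK-LSSVR-PSO goes further and optimizes the hyperparameters, weights, and biases of the whole deep multiple-kernel structure in one PSO run; the RBF kernel, toy sinc data, bounds, swarm size, and PSO constants below are illustrative assumptions.

# Sketch only: PSO search over LSSVR hyperparameters (gamma, sigma) under the
# assumptions stated above; not the DMK-LSSVR-PSO procedure of the paper.
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(A, B, sigma):
    # Gaussian (RBF) kernel matrix between the rows of A and the rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvr_fit(X, y, gamma, sigma):
    # Solve the LS-SVR linear system for the bias b and dual coefficients alpha.
    n = X.shape[0]
    K = rbf_kernel(X, X, sigma)
    A = np.block([[np.zeros((1, 1)), np.ones((1, n))],
                  [np.ones((n, 1)), K + np.eye(n) / gamma]])
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]

def lssvr_predict(X, Xtr, b, alpha, sigma):
    return rbf_kernel(X, Xtr, sigma) @ alpha + b

def fitness(theta, Xtr, ytr, Xva, yva):
    # Particle position theta = (log10 gamma, log10 sigma); fitness = validation MSE.
    gamma, sigma = 10.0 ** theta
    b, alpha = lssvr_fit(Xtr, ytr, gamma, sigma)
    return np.mean((lssvr_predict(Xva, Xtr, b, alpha, sigma) - yva) ** 2)

# Toy data (assumed for illustration): y = sinc(x) + noise.
X = rng.uniform(-3, 3, (80, 1))
y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(80)
Xtr, ytr, Xva, yva = X[:60], y[:60], X[60:], y[60:]

# As noted in the abstract, PSO needs only lower and upper bounds on the hyperparameters.
lo, hi = np.array([-2.0, -2.0]), np.array([3.0, 1.0])   # bounds on log10(gamma), log10(sigma)
P = 20
pos = rng.uniform(lo, hi, (P, 2))
vel = np.zeros((P, 2))
pbest = pos.copy()
pbest_val = np.array([fitness(p, Xtr, ytr, Xva, yva) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(50):
    # Standard PSO velocity and position updates with illustrative constants.
    r1, r2 = rng.random((P, 2)), rng.random((P, 2))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    vals = np.array([fitness(p, Xtr, ytr, Xva, yva) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("best (log10 gamma, log10 sigma):", gbest, "validation MSE:", pbest_val.min())

In the paper's setting, the particle position would also carry the weights and biases combining the multiple LSSVRs, so that one swarm searches all unknowns simultaneously instead of alternating between a GCV grid search and back propagation.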