Deep multiple kernel least squares support vector regression machine
Journal of the Korean Data & Information Science Society 2018;29:895-902
Published online July 31, 2018
© 2018 Korean Data and Information Science Society.

Changha Hwang1 · Sang-Il Choi2 · Jooyong Shim3

1Department of Applied Statistics, Dankook University
2Department of Applied Computer Engineering, Dankook University
3Department of Statistics, Inje University
Correspondence to: Adjunct Professor, Department of Statistics, Institute of Statistical Information, Inje University, Kimhae, Gyeongnam 50834, Korea. E-mail: ds1631@hanmail.net
Received June 25, 2018; Revised July 12, 2018; Accepted July 17, 2018.
This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License, which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.
We propose a deep multiple kernel least squares support vector regression machine (LS-SVRM) for regression, consisting of an input layer, two hidden layers, and an output layer. In the hidden layers, LS-SVRMs with different kernels are trained on the inputs and the responses. For the final output, a neural network is trained with the outputs of the second hidden layer as inputs. Unlike the multilayer neural network (MNN), the LS-SVRMs in the deep multiple kernel LS-SVRM are trained to minimize a penalized objective function. Thus, the learning dynamics of the deep multiple kernel LS-SVRM are entirely different from those of the MNN, in which weights and biases are trained to minimize only the final cost function. The deep multiple kernel LS-SVRM trains all LS-SVRMs in the architecture and makes use of combination weights and biases, which are updated by backpropagation. Numerical studies illustrate that the deep multiple kernel LS-SVRM outperforms the standard LS-SVRM and MNN on regression problems.
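As a rough illustration of the idea described in the abstract, the sketch below builds one hidden layer of LS-SVRMs with different kernels (each solved in closed form from the standard LS-SVM linear system) and then trains a linear combination layer on their outputs by gradient descent. The kernel choices, hyperparameters, synthetic sinc data, and the plain linear output layer are all assumptions for illustration; they are not the paper's actual experimental setup.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix between row sets A and B
    d = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d)

def poly_kernel(A, B, degree=3):
    # inhomogeneous polynomial kernel
    return (A @ B.T + 1.0) ** degree

def fit_lssvr(K, y, C=10.0):
    # Solve the LS-SVM KKT system: [[0, 1^T], [1, K + I/C]] [b; alpha] = [0; y]
    n = len(y)
    M = np.zeros((n + 1, n + 1))
    M[0, 1:] = 1.0
    M[1:, 0] = 1.0
    M[1:, 1:] = K + np.eye(n) / C
    sol = np.linalg.solve(M, np.concatenate([[0.0], y]))
    return sol[0], sol[1:]          # bias b, dual coefficients alpha

# toy training data: noisy sinc function (illustrative assumption)
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (80, 1))
y = np.sinc(X).ravel() + 0.05 * rng.standard_normal(80)

# hidden layer: LS-SVRMs with different kernels, each fit to (X, y)
kernels = [lambda A, B: rbf_kernel(A, B, 1.0),
           lambda A, B: rbf_kernel(A, B, 0.2),
           lambda A, B: poly_kernel(A, B, 3)]
hidden_outputs = []
for kern in kernels:
    K = kern(X, X)
    b, alpha = fit_lssvr(K, y)
    hidden_outputs.append(K @ alpha + b)   # in-sample predictions
H = np.column_stack(hidden_outputs)        # hidden-layer output matrix

# output layer: combination weights and bias trained by gradient descent
w, b0 = np.zeros(H.shape[1]), 0.0
lr = 0.05
for _ in range(500):
    err = H @ w + b0 - y
    w -= lr * H.T @ err / len(y)
    b0 -= lr * err.mean()

pred = H @ w + b0
print("final MSE:", np.mean((pred - y) ** 2))
```

In the actual paper each hidden LS-SVRM minimizes its own penalized objective, and the combination weights and biases of the full two-hidden-layer architecture are updated by backpropagation; the single-layer, plain-gradient version above is only meant to show the structure of kernel diversity plus a learned combination.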
Keywords : Backpropagation algorithm, deep neural network, least squares support vector machine, multilayer neural network, penalized objective function, regression.