Journal of Mathematical Sciences and Applications
ISSN (Print): 2333-8784 ISSN (Online): 2333-8792 Website: https://www.sciepub.com/journal/jmsa Editor-in-chief: Prof. (Dr.) Vishwa Nath Maurya, Cenap Ozel
Journal of Mathematical Sciences and Applications. 2014, 2(2), 17-20
DOI: 10.12691/jmsa-2-2-1
Open Access Article

Comparison of Single and Ensemble Classifiers of Support Vector Machine and Classification Tree

Iut Tri Utami1, Bagus Sartono2 and Kusman Sadik2

1Department of Mathematics, Tadulako University, Palu, Indonesia

2Department of Statistics, Bogor Agricultural University, Bogor, Indonesia

Pub. Date: April 11, 2014

Cite this paper:
Iut Tri Utami, Bagus Sartono and Kusman Sadik. Comparison of Single and Ensemble Classifiers of Support Vector Machine and Classification Tree. Journal of Mathematical Sciences and Applications. 2014; 2(2):17-20. doi: 10.12691/jmsa-2-2-1

Abstract

An ensemble consists of a set of individually trained classifiers (such as support vector machines and classification trees) whose predictions are combined by an algorithm. Ensemble methods are expected to improve the predictive performance of a classifier. This research aims to assess and compare the performance of single and ensemble classifiers of the Support Vector Machine (SVM) and the Classification Tree (CT) using simulated data. The simulated data are based on three data structures: linearly separable, linearly nonseparable, and nonlinearly separable data. The simulation results show that SVM classifies the data better than CT. The ensemble methods improve classification performance and are more stable than the single classifiers; in particular, the ensemble SVM has the smallest average misclassification rate and the smallest standard deviation.
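As a rough illustration of the comparison described in the abstract (a minimal sketch, not the authors' actual simulation code), the Python snippet below uses scikit-learn to compare a single SVM and a single classification tree with bagged ensembles of each on two assumed simulated structures (linearly separable and nonlinearly separable data), reporting the mean and standard deviation of the 10-fold misclassification rate. The sample sizes, kernel, number of bootstrap replicates, and data-generating functions are assumptions for illustration only.

# Illustrative sketch (assumptions noted above): single vs. bagged SVM and CT
# on simulated two-class data, scored by misclassification rate.
import numpy as np
from sklearn.datasets import make_classification, make_moons
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

rng = 42

# Two of the three simulated structures mentioned in the abstract.
datasets = {
    "linearly separable": make_classification(
        n_samples=500, n_features=2, n_informative=2, n_redundant=0,
        class_sep=2.0, random_state=rng),
    "nonlinearly separable": make_moons(
        n_samples=500, noise=0.2, random_state=rng),
}

classifiers = {
    "single CT": DecisionTreeClassifier(random_state=rng),
    "single SVM": SVC(kernel="rbf"),
    "ensemble CT": BaggingClassifier(
        DecisionTreeClassifier(), n_estimators=50, random_state=rng),
    "ensemble SVM": BaggingClassifier(
        SVC(kernel="rbf"), n_estimators=50, random_state=rng),
}

for data_name, (X, y) in datasets.items():
    for clf_name, clf in classifiers.items():
        acc = cross_val_score(clf, X, y, cv=10)  # 10-fold accuracy
        err = 1.0 - acc                          # misclassification rate per fold
        print(f"{data_name:22s} {clf_name:12s} "
              f"mean error={err.mean():.3f}  sd={err.std():.3f}")

Under this setup, the pattern reported in the paper would correspond to the bagged SVM showing the lowest mean error and the smallest spread across folds.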

Keywords:
classification tree, support vector machine, ensemble method

This work is licensed under a Creative Commons Attribution 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/
