An Experiment to Improve Classification Accuracy Using Ensemble Methods

Authors (2) : Bhavesh Patankar, Dr. Vijay Chavda

Data mining is the practice of analyzing large quantities of data and distilling them into useful knowledge. It is an iterative process that helps uncover understandable patterns and relationships in data. Many classification techniques are available, but no single technique performs well on every dataset. Moreover, classifiers used alone often do not perform as well as they do when combined into an ensemble. Ensemble methods are well-established techniques for improving classification accuracy, with bagging and boosting the most widely used. Here, a study of classification-accuracy improvement is carried out, in which an experiment using boosting is performed on several datasets from the UCI repository.
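The boosting procedure the experiment relies on (AdaBoost) can be illustrated with a minimal from-scratch sketch. The decision-stump weak learner and the toy 1-D dataset below are illustrative assumptions for exposition, not the paper's actual UCI setup:

```python
import math

def stump_predict(feature, threshold, polarity, x):
    """Decision stump: +polarity if x[feature] <= threshold, else -polarity."""
    return polarity if x[feature] <= threshold else -polarity

def train_stump(X, y, w):
    """Exhaustively pick the (feature, threshold, polarity) with the
    lowest weighted training error."""
    best, best_err = None, float("inf")
    for f in range(len(X[0])):
        for t in sorted({x[f] for x in X}):
            for pol in (1, -1):
                err = sum(wi for xi, yi, wi in zip(X, y, w)
                          if stump_predict(f, t, pol, xi) != yi)
                if err < best_err:
                    best_err, best = err, (f, t, pol)
    return best, best_err

def adaboost(X, y, rounds=10):
    """AdaBoost.M1 with decision stumps as the weak learner (labels in {-1, +1})."""
    n = len(X)
    w = [1.0 / n] * n                      # start with uniform example weights
    ensemble = []
    for _ in range(rounds):
        (f, t, pol), err = train_stump(X, y, w)
        if err >= 0.5:                     # weak learner no better than chance
            break
        err = max(err, 1e-10)
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, f, t, pol))
        # Re-weight: misclassified examples gain weight, correct ones lose it.
        w = [wi * math.exp(-alpha * yi * stump_predict(f, t, pol, xi))
             for xi, yi, wi in zip(X, y, w)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    """Weighted vote of all stumps in the ensemble."""
    score = sum(a * stump_predict(f, t, pol, x) for a, f, t, pol in ensemble)
    return 1 if score >= 0 else -1

# Toy 1-D dataset: the best single stump gets only 4 of 6 points right,
# but a small boosted ensemble classifies the training set perfectly.
X = [[1], [2], [3], [4], [5], [6]]
y = [1, 1, -1, -1, 1, 1]
ens = adaboost(X, y, rounds=5)
acc = sum(predict(ens, xi) == yi for xi, yi in zip(X, y)) / len(X)
print(acc)  # -> 1.0
```

Decision stumps are the conventional weak learner for AdaBoost because boosting only requires each base classifier to do slightly better than chance; the re-weighting step then forces later stumps to concentrate on the examples earlier ones got wrong.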

Authors and Affiliations

Bhavesh Patankar
Department of M.Sc. (IT), Kadi SarvaVishwaVidyalaya, Gandhinagar, Gujarat, India.
Dr. Vijay Chavda
NPCCSM, Kadi SarvaVishwaVidyalaya, Gandhinagar, Gujarat, India

Data mining; classification; ensemble learning; boosting; AdaBoost

  1. Han, Jiawei, and Micheline Kamber. Data Mining: Concepts and Techniques (The Morgan Kaufmann Series in Data Management Systems). Morgan Kaufmann, 2000.
  2. Dietterich, Thomas G. "Ensemble Methods in Machine Learning." Multiple Classifier Systems. Springer Berlin Heidelberg, 2000. 1-15.
  3. Tu, My Chau, Dongil Shin, and Dongkyoo Shin. "A Comparative Study of Medical Data Classification Methods Based on Decision Tree and Bagging Algorithms." Eighth IEEE International Conference on Dependable, Autonomic and Secure Computing (DASC '09). IEEE, 2009.
  4. Kittler, Josef, et al. "On Combining Classifiers." IEEE Transactions on Pattern Analysis and Machine Intelligence 20.3 (1998): 226-239.
  5. Nguyen Thai Nghe, P. Janecek, and P. Haddawy. "A Comparative Analysis of Techniques for Predicting Academic Performance." ASEE/IEEE Frontiers in Education Conference, 2007. T2G7-T2G12.
  6. Dietterich, Thomas G. "An Experimental Comparison of Three Methods for Constructing Ensembles of Decision Trees: Bagging, Boosting, and Randomization." Machine Learning 40.2 (2000): 139-157.
  7. Freund, Yoav, and Robert E. Schapire. "A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting." Journal of Computer and System Sciences 55.1 (1997): 119-139.
  8. Oza, Nikunj C. "AveBoost2: Boosting with Noisy Data." Proceedings of the Fifth International Workshop on Multiple Classifier Systems. Springer, Berlin, 2004. 31-40.

Publication Details

Published in : Volume 1 | Issue 2 | May-June 2015
Date of Publication : 2015-07-05
License : This work is licensed under a Creative Commons Attribution 4.0 International License.
Page(s) : 94-97
Manuscript Number : IJSRST151234
Publisher : Technoscience Academy

Print ISSN : 2395-6011, Online ISSN : 2395-602X

Cite This Article :

Bhavesh Patankar, Dr. Vijay Chavda, "An Experiment to Improve Classification Accuracy Using Ensemble Methods", International Journal of Scientific Research in Science and Technology (IJSRST), Print ISSN : 2395-6011, Online ISSN : 2395-602X, Volume 1, Issue 2, pp. 94-97, May-June 2015.
Journal URL :
