Article Details

Journal of Computers (indexed in EI, MEDLINE, Scopus)

Title: Impr-Co-Forest: The Improved Co-forest Algorithm Based on Optimized Decision Tree and Dual-confidence Estimation Method
Volume/Issue: 30:6
Authors: Fei-Fei Hou, Wen-Tai Lei, Hong Li, Jing-Wei Ren, Geng-Ye Liu, Qing-Yuan Gu
Pages: 110-122
Keywords: dual-confidence estimation; Improved Co-Forest; optimized decision tree; semi-supervised collaborative training; weighted vote
Publication Date: 2019-12
DOI: 10.3966/199115992019123006009

Abstract

Co-forest is a classical semi-supervised collaborative training algorithm, but it suffers from an unavoidable mislabeling problem and from the low performance of its decision tree classifiers. In this paper, a novel Improved Co-forest (Impr-Co-Forest) algorithm is proposed to address these issues. First, for the creation and selection of optimized decision trees, a weighting scheme is applied within the AdaBoost method during training so that more attention is paid to samples that are difficult to classify; because newly labeled data are not always valid, the out-of-bag error (OOBE) of each decision tree is computed and the trees with smaller OOBE are retained. Then, to mitigate mislabeling throughout the co-labeling iterations, a dual-confidence estimation method is proposed so that only newly labeled data chosen with high confidence are used to update the classifiers. Finally, a weighted vote, which replaces the simple majority vote, produces the probability estimate for each class. Experimental results on eight UCI datasets and Pascal VOC show that the proposed Impr-Co-Forest algorithm achieves better classification performance than both supervised and semi-supervised baselines.
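The weighted vote that replaces the simple majority vote can be sketched as below. This is an illustrative assumption, not the paper's exact implementation: the tree weights are derived AdaBoost-style from each tree's out-of-bag error (OOBE), so that trees with smaller OOBE contribute more to the final class estimate.

```python
import numpy as np

def weighted_vote(tree_preds, tree_oob_errors, n_classes):
    """Combine per-tree class predictions using weights derived from each
    tree's out-of-bag error (OOBE): lower error -> higher weight.
    Illustrative sketch only; the weighting formula is an assumption."""
    # AdaBoost-style weight w = 0.5 * ln((1 - err) / err), clipped for stability
    errs = np.clip(np.asarray(tree_oob_errors, dtype=float), 1e-6, 1 - 1e-6)
    weights = 0.5 * np.log((1.0 - errs) / errs)
    weights = np.maximum(weights, 0.0)  # ignore trees no better than chance

    # Accumulate each tree's weight on the class it predicted
    scores = np.zeros(n_classes)
    for pred, w in zip(tree_preds, weights):
        scores[pred] += w

    # Normalize to a per-class probability estimate
    total = scores.sum()
    probs = scores / total if total > 0 else np.full(n_classes, 1.0 / n_classes)
    return int(np.argmax(scores)), probs

# Three trees predict classes [1, 1, 0] with OOB errors [0.40, 0.35, 0.10]:
# the single low-OOBE tree outweighs the two weak ones, so the weighted
# vote picks class 0, whereas a simple majority vote would pick class 1.
label, probs = weighted_vote([1, 1, 0], [0.40, 0.35, 0.10], n_classes=2)
```

This illustrates why the weighted vote can outperform a majority vote: an accurate tree is not outvoted by several weak ones.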
