University of Tasmania
OzAI2001.pdf (424.17 kB)

Gradient Descent Style Leveraging of Decision Trees and Stumps for Misclassification Cost Performance

Conference contribution
Posted on 2023-05-26, 07:42, authored by Cameron-Jones, RM
This paper investigates gradient descent style leveraging approaches to classifier learning in the presence of misclassification costs: Schapire and Singer's AdaBoost.MH and AdaBoost.MR [16], Collins et al.'s multi-class logistic regression method [4], and some modifications that retain the gradient descent style approach. Decision trees and stumps, learned with modified versions of Quinlan's C4.5 [15], are used as the underlying base classifiers. Experiments are reported comparing the average-cost performance of the modified methods to that of the originals, and to the previously suggested "Cost Boosting" methods of Ting and Zheng [21] and Ting [18], which also use decision trees based upon modified C4.5 code but do not have an interpretation in the gradient descent framework. While some of the modifications improve upon the originals in cost performance for both trees and stumps, the comparison with tree-based Cost Boosting suggests that, of the methods first experimented with here, a stump-based method holds the most promise.
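To make the general setting concrete, the following is a minimal sketch of cost-sensitive boosting over decision stumps. It is not the paper's exact method: it uses a simple binary AdaBoost-style loop in which the initial example weights are proportional to misclassification cost (in the spirit of Ting and Zheng's Cost Boosting), with an exhaustive stump learner standing in for the modified C4.5 base learner. All function names and the cost-weighting scheme are illustrative assumptions.

```python
import numpy as np

def stump_predict(X, feature, threshold, polarity):
    # Decision stump: +1 on one side of the threshold, -1 on the other;
    # polarity flips which side is which.
    return polarity * np.where(X[:, feature] <= threshold, 1.0, -1.0)

def fit_stump(X, y, w):
    # Exhaustive search for the (feature, threshold, polarity)
    # minimising weighted training error (stand-in for a C4.5 stump).
    best, best_err = None, np.inf
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            for pol in (1.0, -1.0):
                err = np.sum(w[stump_predict(X, f, t, pol) != y])
                if err < best_err:
                    best_err, best = err, (f, t, pol)
    return best, best_err

def cost_adaboost(X, y, costs, n_rounds=20):
    # Hypothetical cost-sensitive variant: weights start proportional to
    # each example's misclassification cost, then follow the usual
    # multiplicative AdaBoost update.
    w = costs / costs.sum()
    ensemble = []
    for _ in range(n_rounds):
        (f, t, pol), err = fit_stump(X, y, w)
        if err >= 0.5:          # base learner no better than chance: stop
            break
        err = max(err, 1e-10)   # avoid division by zero for perfect stumps
        alpha = 0.5 * np.log((1.0 - err) / err)
        pred = stump_predict(X, f, t, pol)
        w = w * np.exp(-alpha * y * pred)
        w /= w.sum()
        ensemble.append((alpha, f, t, pol))
    return ensemble

def predict(ensemble, X):
    # Weighted vote of the stumps, thresholded at zero.
    score = np.zeros(X.shape[0])
    for alpha, f, t, pol in ensemble:
        score += alpha * stump_predict(X, f, t, pol)
    return np.sign(score)
```

The cost-weighted initialisation biases the ensemble toward getting expensive examples right; the gradient-descent-style methods studied in the paper instead work the costs into the loss being minimised, which is the distinction the experiments probe.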


Publication status

  • Published

Event title

AI 2001: Advances in Artificial Intelligence, 14th Australian Joint Conference on Artificial Intelligence

Event Venue

Adelaide, Australia

Rights statement

The original publication is available at

Repository Status

  • Open
