
CORT: Classification Or Regression Trees

Files in this item

File: Sco2003Apr5CORTClass.PDF (408.9 KB, application/pdf)


Item Metadata

Title: CORT: Classification Or Regression Trees
Author: Scott, Clayton; Willett, Rebecca; Nowak, Robert David
Type: Conference Paper
Keywords: classification; multiscale; risk; CART
Citation: C. Scott, R. Willett and R. D. Nowak, "CORT: Classification Or Regression Trees," in IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2003.
Abstract: In this paper we challenge three of the underlying principles of CART, a well-known approach to the construction of classification and regression trees. Our primary concern is the penalization strategy employed to prune back an initial, overgrown tree. We reason, based on both intuitive and theoretical arguments, that the pruning rule for classification should differ from that used for regression (unlike in CART). We also argue that growing a tree-structured partition specifically fitted to the data is unnecessary. Instead, our approach to tree modeling begins with a nonadapted (fixed) dyadic tree structure and partition, much like that underlying multiscale wavelet analysis. We show that dyadic trees provide sufficient flexibility, are easy to construct, and produce near-optimal results when properly pruned. Finally, we advocate the use of a negative log-likelihood measure of empirical risk. This is a more appropriate empirical risk for non-Gaussian regression problems, in contrast to the sum-of-squared-errors criterion used in CART regression.
Date Published: 2003-04-20
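
The abstract's final point, that a negative log-likelihood empirical risk suits non-Gaussian regression better than CART's sum-of-squared-errors criterion, can be illustrated with a small sketch. The code below is not the authors' implementation; it is a minimal example with illustrative names throughout. It fits piecewise-constant estimates on a fixed dyadic partition of [0, 1), the kind of nonadapted partition the abstract describes, and evaluates both empirical risks on Poisson-distributed responses, one example of a non-Gaussian regression setting.

```python
import numpy as np

rng = np.random.default_rng(0)

# Samples on [0, 1) with Poisson responses (a non-Gaussian regression setting).
n = 512
x = rng.random(n)
rate = 5.0 + 20.0 * (x > 0.5)          # piecewise-constant true intensity
y = rng.poisson(rate)

def dyadic_fit(x, y, depth):
    """Fit a piecewise-constant estimate on the fixed dyadic partition of
    [0, 1) at the given depth (2**depth equal-width cells)."""
    cells = np.minimum((x * 2**depth).astype(int), 2**depth - 1)
    means = np.array([y[cells == c].mean() if np.any(cells == c) else 0.0
                      for c in range(2**depth)])
    return means[cells]

def squared_error_risk(y, yhat):
    # CART-style sum-of-squared-errors criterion, averaged per sample.
    return np.mean((y - yhat) ** 2)

def poisson_nll_risk(y, yhat):
    # Negative log-likelihood empirical risk for Poisson regression
    # (constant log(y!) terms dropped; they do not affect comparisons).
    yhat = np.maximum(yhat, 1e-9)       # guard against log(0)
    return np.mean(yhat - y * np.log(yhat))

for depth in range(1, 6):
    yhat = dyadic_fit(x, y, depth)
    print(f"depth={depth}: SSE risk={squared_error_risk(y, yhat):8.3f}, "
          f"Poisson NLL risk={poisson_nll_risk(y, yhat):8.3f}")
```

Deepening the fixed partition lowers both empirical risks on the training data, but only the negative log-likelihood is matched to the Poisson noise model; the abstract's argument is that pruning such a dyadic tree under a likelihood-based risk, with a penalty chosen for classification or regression as appropriate, yields near-optimal estimates.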

This item appears in the following Collection(s)

  • ECE Publications [1032 items]
    Publications by Rice University Electrical and Computer Engineering faculty and graduate students
  • DSP Publications [508 items]
    Publications by Rice faculty and graduate students in digital signal processing.