Tuning support vector machines for minimax and Neyman-Pearson classification

Files in this item

ecsvm_preprint.pdf (554.9 KB, application/pdf)

Item Metadata

Title: Tuning support vector machines for minimax and Neyman-Pearson classification
Author: Scott, Clayton D.; Baraniuk, Richard G.; Davenport, Mark A.
Type: Report
Citation: C. D. Scott, R. G. Baraniuk and M. A. Davenport, "Tuning support vector machines for minimax and Neyman-Pearson classification," Rice University ECE Technical Report, no. TREE 0804, 2008.
Abstract: This paper studies the training of support vector machine (SVM) classifiers with respect to the minimax and Neyman-Pearson criteria. In principle, these criteria can be optimized in a straightforward way using a cost-sensitive SVM. In practice, however, because these criteria require especially accurate error estimation, standard techniques for tuning SVM parameters, such as cross-validation, can lead to poor classifier performance. To address this issue, we first prove that the usual cost-sensitive SVM, here called the 2C-SVM, is equivalent to another formulation called the 2nu-SVM. We then exploit a characterization of the 2nu-SVM parameter space to develop a simple yet powerful approach to error estimation based on smoothing. In an extensive experimental study we demonstrate that smoothing significantly improves the accuracy of cross-validation error estimates, leading to dramatic performance gains. Furthermore, we propose coordinate descent strategies that offer significant gains in computational efficiency, with little to no loss in performance.
Date Published: 2008-08-19
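The abstract's starting point, a cost-sensitive SVM tuned against a Neyman-Pearson constraint via cross-validated error estimates, can be sketched roughly as below. This is a minimal illustration using scikit-learn's per-class weights (analogous in spirit to the 2C-SVM's two cost parameters), not the report's 2nu-SVM parameterization or its smoothing procedure; the dataset, weight grid, and target false-alarm rate alpha are all illustrative assumptions.

```python
# Hedged sketch: sweep a class-0 cost weight, estimate false-alarm and miss
# rates by cross-validation, then keep the weight that meets a Neyman-Pearson
# false-alarm constraint with the smallest miss rate. All settings here are
# illustrative, not taken from the report.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_predict
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, n_features=5, random_state=0)
alpha = 0.1  # target false-alarm rate (error on class 0), an assumed value

results = []
for w in [0.25, 0.5, 1.0, 2.0, 4.0]:
    # Heavier weight on class 0 penalizes false alarms more strongly,
    # playing the role of the cost asymmetry in a cost-sensitive SVM.
    clf = SVC(kernel="rbf", C=1.0, class_weight={0: w, 1: 1.0})
    pred = cross_val_predict(clf, X, y, cv=5)
    pfa = np.mean(pred[y == 0] == 1)    # cross-validated false-alarm estimate
    pmiss = np.mean(pred[y == 1] == 0)  # cross-validated miss estimate
    results.append((w, pfa, pmiss))

# Among weights whose estimated false-alarm rate meets the constraint,
# pick the one minimizing the estimated miss rate.
feasible = [r for r in results if r[1] <= alpha]
best = min(feasible, key=lambda r: r[2]) if feasible else None
print(best)
```

The paper's point is that raw cross-validation estimates like `pfa` above can be too noisy for constraint-sensitive criteria, which motivates its smoothing of error estimates over the 2nu-SVM parameter space.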

This item appears in the following Collection(s)

  • DSP Publications [508 items]
    Publications by Rice faculty and graduate students in digital signal processing.