  • The 2nu-SVM: A Cost-Sensitive Extension of the nu-SVM 

    Davenport, Mark A. (2005-12-01)
    Standard classification algorithms aim to minimize the probability of making an incorrect classification. In many important applications, however, some kinds of errors are more important than others. In this report we ...
  • Controlling False Alarms with Support Vector Machines 

    Davenport, Mark A.; Baraniuk, Richard G.; Scott, Clayton D. (2006-05-01)
    We study the problem of designing support vector classifiers with respect to a Neyman-Pearson criterion. Specifically, given a user-specified level alpha, 0 < alpha < 1, how can we ensure a false alarm rate no greater than ...
  • Detection and estimation with compressive measurements 

    Baraniuk, Richard G.; Davenport, Mark A.; Wakin, Michael B. (2006-11-01)
    The recently introduced theory of compressed sensing enables the reconstruction of sparse or compressible signals from a small set of nonadaptive, linear measurements. If properly chosen, the number of measurements can ... (the standard measurement model is sketched after this list)
  • Error control for support vector machines 

    Davenport, Mark A. (2007)
    In binary classification there are two types of errors, and in many applications these may have very different costs. We consider two learning frameworks that address this issue: minimax classification, where we seek to ...
  • Learning minimum volume sets with support vector machines 

    Davenport, Mark A.; Baraniuk, Richard G.; Scott, Clayton D. (2006-09-01)
    Given a probability law P on d-dimensional Euclidean space, the minimum volume set (MV-set) with mass beta, 0 < beta < 1, is the set with smallest volume enclosing a probability mass of at least beta. We examine the use ... (the MV-set criterion is restated after this list)
  • Minimax support vector machines 

    Davenport, Mark A.; Baraniuk, Richard G.; Scott, Clayton D. (2007-08-01)
    We study the problem of designing support vector machine (SVM) classifiers that minimize the maximum of the false alarm and miss rates. This is a natural classification setting in the absence of prior information regarding ...
  • Multiscale random projections for compressive classification 

    Duarte, Marco F.; Davenport, Mark A.; Wakin, Michael B.; Laska, Jason N.; Takhar, Dharmpal; Kelly, Kevin F.; Baraniuk, Richard G. (2007-09-01)
    We propose a framework for exploiting dimension-reducing random projections in detection and classification problems. Our approach is based on the generalized likelihood ratio test; in the case of image classification, ...
  • Random observations on random observations: Sparse signal acquisition and processing 

    Davenport, Mark A. (2010)
    In recent years, signal processing has come under mounting pressure to accommodate the increasingly high-dimensional raw data generated by modern sensing systems. Despite extraordinary advances in computational power, ...
  • Regression level set estimation via cost-sensitive classification 

    Scott, Clayton D.; Davenport, Mark A. (2007-06-01)
    Regression level set estimation is an important yet understudied learning task. It lies somewhere between regression function estimation and traditional binary classification, and in many cases is a more appropriate setting ...
  • A simple proof of the restricted isometry property for random matrices 

    Baraniuk, Richard G.; Davenport, Mark A.; DeVore, Ronald A.; Wakin, Michael B. (2007-01-18)
    We give a simple technique for verifying the Restricted Isometry Property (as introduced by Candès and Tao) for random matrices that underlies Compressed Sensing. Our approach has two main ingredients: (i) concentration ... (a standard statement of the RIP follows this list)
  • Single-pixel imaging via compressive sampling 

    Duarte, Marco F.; Davenport, Mark A.; Takhar, Dharmpal; Laska, Jason N.; Sun, Ting; Kelly, Kevin F.; Baraniuk, Richard G. (2008-03-01)
  • The smashed filter for compressive classification and target recognition 

    Davenport, Mark A.; Duarte, Marco F.; Wakin, Michael B.; Laska, Jason N.; Takhar, Dharmpal; Kelly, Kevin F.; Baraniuk, Richard G. (2007-01-01)
    The theory of compressive sensing (CS) enables the reconstruction of a sparse or compressible image or signal from a small set of linear, non-adaptive (even random) projections. However, in many applications, including ...
  • Sparse Signal Detection from Incoherent Projections 

    Davenport, Mark A.; Wakin, Michael B.; Duarte, Marco F.; Baraniuk, Richard G. (2006-05-01)
    The recently introduced theory of Compressed Sensing (CS) enables the reconstruction or approximation of sparse or compressible signals from a small set of incoherent projections; often the number of projections can be ...
  • A Theoretical Analysis of Joint Manifolds 

    Davenport, Mark A.; Hegde, Chinmay; Duarte, Marco F.; Baraniuk, Richard G. (2009-01)
    The emergence of low-cost sensor architectures for diverse modalities has made it possible to deploy sensor arrays that capture a single event from a large number of vantage points and using multiple modalities. In many ...
  • Tuning support vector machines for minimax and Neyman-Pearson classification 

    Scott, Clayton D.; Baraniuk, Richard G.; Davenport, Mark A. (2008-08-19)
    This paper studies the training of support vector machine (SVM) classifiers with respect to the minimax and Neyman-Pearson criteria. In principle, these criteria can be optimized in a straightforward way using a cost-sensitive ...
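
A note on the two error-control criteria that recur in the SVM entries above ("Controlling False Alarms with Support Vector Machines", "Minimax support vector machines", and "Tuning support vector machines for minimax and Neyman-Pearson classification"). The formulations below are the standard ones, stated here for reference rather than quoted from the truncated abstracts; P_F(f) and P_M(f) denote the false alarm and miss rates of a classifier f.

    \text{Neyman--Pearson:}\quad \min_{f}\; P_M(f) \quad \text{subject to} \quad P_F(f) \le \alpha, \qquad 0 < \alpha < 1

    \text{Minimax:}\quad \min_{f}\; \max\bigl\{\, P_F(f),\; P_M(f) \,\bigr\}

The constraint level alpha corresponds to the user-specified level mentioned in the false-alarm entry, and the minimax objective corresponds to the "maximum of the false alarm and miss rates" described in the minimax entry.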
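
The minimum volume set criterion, restated from the abstract of "Learning minimum volume sets with support vector machines" above (the symbol mu for d-dimensional volume, i.e. Lebesgue measure, is notation added here, not taken from the listing):

    G^{*}_{\beta} \;=\; \arg\min_{G \subset \mathbb{R}^d} \bigl\{\, \mu(G) \;:\; P(G) \ge \beta \,\bigr\}, \qquad 0 < \beta < 1

That is, among all sets enclosing probability mass at least beta under the law P, the MV-set is the one of smallest volume.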
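
The measurement model behind the compressive-sensing entries above ("Detection and estimation with compressive measurements", "Sparse Signal Detection from Incoherent Projections", "The smashed filter for compressive classification and target recognition", and "Single-pixel imaging via compressive sampling"), in its standard form and as a sketch rather than text from the abstracts: a sparse or compressible signal x in R^N is observed through M non-adaptive linear projections, with M much smaller than N,

    y = \Phi x, \qquad \Phi \in \mathbb{R}^{M \times N}, \quad M \ll N,

and one common recovery route is the ell-1 program \min_{z} \|z\|_1 \ \text{subject to} \ \Phi z = y.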
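
The Restricted Isometry Property invoked in "A simple proof of the restricted isometry property for random matrices" above is, in its standard Candès-Tao form (stated here for reference, not quoted from the abstract): the matrix \Phi satisfies the RIP of order k with constant \delta_k \in (0,1) if

    (1 - \delta_k)\, \|x\|_2^2 \;\le\; \|\Phi x\|_2^2 \;\le\; (1 + \delta_k)\, \|x\|_2^2

for every x with at most k nonzero entries.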