Performance evaluation and optimization of stochastic systems via importance sampling

Item Metadata

Title: Performance evaluation and optimization of stochastic systems via importance sampling
Author: Orsak, Geoffrey Charles
Advisor: Aazhang, Behnaam
Degree: Doctor of Philosophy
Abstract: Analytic solutions for the optimal set of system parameters and the associated performance of systems with random inputs are typically intractable. One often has access only to the system's output under a variety of inputs, requiring the optimization routine to be insensitive to the noise inherent in estimating the performance. The well-known algorithms of Robbins-Monro and Kiefer-Wolfowitz asymptotically eliminate the effects of noise by effectively averaging the estimated performance over a sequence of system parameters. The convergence rate of the parameters is shown to depend on the variance of the performance estimator. Consequently, variance reduction techniques greatly enhance the convergence properties of these algorithms. Importance sampling is employed to minimize the variance in estimating the system performance. A class of importance sampling biasing distributions is derived in this thesis specifically for the analysis of single-user communication systems; this class is then extended to estimating the performance of multiple-access systems. Additionally, a more general method of determining biasing densities for estimating arbitrary functionals of random vectors is obtained using ideas from robust statistics. All of the above-mentioned methods yield substantial improvements over standard Monte Carlo simulation when estimating system performance. By incorporating these techniques into both the Robbins-Monro and Kiefer-Wolfowitz algorithms, we show that the stopping times of these algorithms can be significantly reduced. Moreover, the computational savings over the traditional implementation of these algorithms are unbounded when optimizing systems whose performance criteria are diminishing probabilities of a set of events. These techniques can be applied directly to enhance more specific optimization routines, such as training algorithms for neural networks and recursive algorithms in system identification.
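
The abstract's central point is that replacing the nominal sampling density with a suitably biased one can reduce the variance of rare-event performance estimates by orders of magnitude compared with standard Monte Carlo. The sketch below is not taken from the thesis; it is a minimal illustration of the general idea, assuming a standard Gaussian tail probability as the performance measure and a mean-shifted Gaussian as the biasing density, with all numerical values chosen purely for demonstration.

import numpy as np

# Minimal illustrative sketch (not from the thesis): estimate a small tail
# probability P(X > gamma) for X ~ N(0, 1), first by standard Monte Carlo,
# then by importance sampling with a mean-shifted Gaussian biasing density.
# The shift-to-gamma choice and all numerical values are assumptions made
# only for this demonstration.

rng = np.random.default_rng(0)
gamma = 4.0          # threshold; the true tail probability is about 3.2e-5
n = 100_000          # number of simulation trials

# Standard Monte Carlo: very few samples exceed gamma, so the estimate is noisy.
x = rng.standard_normal(n)
p_mc = np.mean(x > gamma)

# Importance sampling: draw from the biasing density N(gamma, 1) and reweight
# each sample by the likelihood ratio f(y)/g(y) = exp(gamma**2/2 - gamma*y).
y = rng.normal(loc=gamma, scale=1.0, size=n)
weights = np.exp(0.5 * gamma**2 - gamma * y)
p_is = np.mean((y > gamma) * weights)

print(f"standard Monte Carlo estimate: {p_mc:.3e}")
print(f"importance sampling estimate:  {p_is:.3e}")

At gamma = 4 the event occurs only a handful of times in 100,000 nominal samples, so the plain Monte Carlo estimate is dominated by noise, while the reweighted estimate concentrates near the true value. The same reduction in estimator variance is what shortens the stopping times when such estimates drive a Robbins-Monro or Kiefer-Wolfowitz iteration.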
Citation: Orsak, Geoffrey Charles. (1990) "Performance evaluation and optimization of stochastic systems via importance sampling." Doctoral Thesis, Rice University. http://hdl.handle.net/1911/16379.
URI: http://hdl.handle.net/1911/16379
Date: 1990