Classical Gaussian, Markov, and Poisson models have played a vital role in the remarkable success of statistical signal processing. However, a host of signals---images, network traffic, financial time series, seismic measurements, wind turbulence, and others---exhibit properties beyond the scope of classical models, properties that are crucial to the analysis and processing of these signals. These properties include a heavy-tailed marginal probability distribution, a nonlinear dependency structure, and a slowly decaying or nonstationary correlation function. Fourier, wavelet, and related transforms have demonstrated a remarkable ability to decorrelate and simplify signals with these properties. Although useful transform-domain algorithms have been developed for signal analysis and processing, realistic transform-domain statistical models have not.
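The decorrelation claim can be illustrated with a toy experiment (a sketch for intuition, not drawn from the thesis): for a strongly correlated AR(1) signal, the detail coefficients of a single Haar transform stage are nearly uncorrelated even though adjacent signal samples are highly correlated. The helper names `haar_step` and `lag1_corr` are illustrative, not part of any described method.

```python
import random

def haar_step(x):
    # One level of the (unnormalized) Haar transform:
    # approximation = pairwise averages, detail = pairwise half-differences.
    approx = [(x[2 * i] + x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    detail = [(x[2 * i] - x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    return approx, detail

def lag1_corr(x):
    # Sample lag-1 autocorrelation coefficient.
    n = len(x)
    m = sum(x) / n
    num = sum((x[i] - m) * (x[i + 1] - m) for i in range(n - 1))
    den = sum((xi - m) ** 2 for xi in x)
    return num / den

random.seed(0)
# Strongly correlated AR(1) signal: x[n] = 0.95 * x[n-1] + noise.
x = [0.0]
for _ in range(4095):
    x.append(0.95 * x[-1] + random.gauss(0.0, 1.0))

_, detail = haar_step(x)
print(lag1_corr(x))       # large: the signal is strongly correlated
print(lag1_corr(detail))  # small: the details are nearly decorrelated
```

The same qualitative effect, only stronger, holds for longer wavelet filters and deeper decompositions, which is what makes simple independent or tree-structured models in the wavelet domain plausible.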
In this thesis, we develop several new statistical models for signals in the transform domain, with an eye toward improved algorithms for tasks such as noise removal, synthesis, classification, segmentation, and compression. We focus primarily on the wavelet transform, with its efficient multiresolution tree structure, and on the Fourier transform. However, the theory, which is rooted in topics such as probabilistic graphs, hidden Markov models, and fractals, applies in a much more general setting. Our models have led to new algorithms for signal estimation, segmentation, and synthesis, as well as to new insights into the behavior of data network traffic---insights potentially useful for network design and control.