The work of the group is generally organized around theoretical and algorithmic themes, and we work with great collaborators to have technological impact. Three signal processing concepts permeate a majority of the work: quantization, sampling, and sparsity.
A prevailing view of quantization is as a necessary evil in digital information acquisition and representation. We see beauty in quantization, especially in its interplay with communication, computation, inference, and overcomplete representations. In source coding for lossy communication channels (e.g., multiple description coding), the best quantizers are not regular, meaning that their partition cells are not convex sets; two examples are shown on the right. When quantized variables will be used in a computation, it is not necessarily best to simply minimize the squared error introduced by quantization. When faced with solving an ensemble of hypothesis testing problems with a small number of decision rules, the optimal categorization of the problems can be seen as quantization of the prior probability distribution over the hypotheses. Quantization can be achieved in rather unusual ways, such as storing only the permutation that puts a vector in descending order. Interesting questions involving quantization are everywhere.
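The permutation idea above can be sketched in a few lines. This is a hedged illustration of quantization via sorting order (a permutation source code): the encoder transmits only the permutation that sorts the vector, and the decoder rearranges a fixed, shared codebook accordingly. The codebook values below are arbitrary placeholders, not optimized order statistics.

```python
# Sketch of quantization by sorting permutation.
# Codebook values are illustrative placeholders, not optimized.

def encode(x):
    # Transmit only the permutation that puts x in descending order.
    return sorted(range(len(x)), key=lambda i: -x[i])

def decode(perm, codebook):
    # Rearrange a fixed descending codebook according to the permutation.
    y = [0.0] * len(perm)
    for rank, idx in enumerate(perm):
        y[idx] = codebook[rank]
    return y

x = [0.3, 1.7, -0.4, 0.9]
perm = encode(x)                    # [1, 3, 0, 2]
codebook = [1.5, 0.75, 0.0, -0.75]  # fixed, shared by encoder and decoder
xhat = decode(perm, codebook)       # [0.0, 1.5, -0.75, 0.75]
```

Note that the reconstruction preserves the rank ordering of the original vector, which is exactly the information the permutation encodes.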
Sampling — Imaging by Spatiotemporal Sampling of Light Fields
Light fields and diffusion fields are spatiotemporal processes that can be described through linear partial differential equations or linear state-space models. We have recently demonstrated that capturing transient light field properties, beyond mere time of flight, can produce dramatic effects such as forming an image of a surface that is in the line of sight of neither the illumination source nor the sensor, without a mirror. This opens many avenues for new imaging technologies. Establishing theoretical foundations for this nascent field not only has aesthetic appeal and the potential for lasting impact; it is needed to achieve anything beyond coarse reconstructions. Sampling of diffusion processes has some existing theory to leverage, but much more can be done to exploit the structure of these processes.
Exploiting sparsity has become a central theme in signal processing over the past two decades. A signature example is the advantage of wavelet representations over Fourier representations in image processing. Fourier and wavelet representations are very similar in how well they decorrelate coefficients, so the advantages of wavelet representations for approximation, compression, and estimation come from the same root: approximate sparsity. We have contributed fundamental results on the use of sparse signal models in compression and estimation, and we have used sparsity extensively in several applications, including communication, MRI excitation design, and MR image formation.
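A minimal numerical sketch of this point, under the simplifying assumption of a piecewise-constant test signal: an orthonormal Haar wavelet transform concentrates the signal into far fewer nonzero coefficients than the discrete Fourier transform does, even though both are orthonormal transforms.

```python
# Hedged sketch: sparsity, not decorrelation, is what favors wavelets.
# Compare an orthonormal Haar transform against a naive DFT on a
# piecewise-constant signal and count the nonzero coefficients of each.
import cmath

def haar(x):
    """Full orthonormal Haar transform (len(x) must be a power of 2)."""
    a, coeffs = list(x), []
    while len(a) > 1:
        s = [(a[2*i] + a[2*i+1]) / 2**0.5 for i in range(len(a) // 2)]
        d = [(a[2*i] - a[2*i+1]) / 2**0.5 for i in range(len(a) // 2)]
        coeffs = d + coeffs   # finer-scale details go toward the end
        a = s
    return a + coeffs

def dft(x):
    """Naive discrete Fourier transform."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                for n in range(N)) for k in range(N)]

def nnz(c, tol=1e-9):
    """Count coefficients that are nonzero up to numerical tolerance."""
    return sum(abs(v) > tol for v in c)

x = [1, 1, 1, 1, 5, 5, 5, 5]      # piecewise-constant test signal
print(nnz(haar(x)), nnz(dft(x)))  # Haar needs 2 coefficients; the DFT needs 5
```

The Haar representation is exactly sparse here because each detail coefficient differences two equal neighbors, while the step edge spreads energy across several Fourier harmonics.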
Technology and Pedagogy
The goal in any engineering research should be to aid good engineering, specifically the design of objects and processes for the betterment of the human condition. While we strive to advance technology, we also embrace the additional opportunities that come from being at an educational institution. We make some of our contribution by illuminating important topics for non-specialists, and we take the time to work beyond the point of having a mathematical proof to also provide clear, intuitive, and visual demonstrations.
Generalizations of lattice quantization
Pseudolinear labelings for entropy coding and multiple description coding
Acquisition of hidden scene properties by using time-varying light sources and time-sampling detectors
Packets sent in waves for multicast congestion control