Multiterminal Source Coding under Logarithmic Loss
Electrical and Computer Engineering
By: Thomas Courtade
From: University of California, Los Angeles
When: Tuesday, May 1, 2012, 4:00 PM - 5:00 PM
Where: Duncan Hall
Abstract: The fundamental tradeoff between the size to which data can be compressed and the fidelity to which it can be reproduced is known as rate distortion theory. Claude Shannon gave a precise characterization of this tradeoff in a 1959 paper for the case in which compression is performed by a single encoder. The multiterminal source coding problem extends this setting to two encoders: if correlated sources are available at two separate encoders, what is the tradeoff between the compression rate at each encoder and the fidelity to which each source can be reproduced? This problem has remained largely open since it was posed nearly four decades ago.
In this talk, we characterize the rate distortion tradeoff for any pair of finite-alphabet sources when the reproduction fidelity is measured under logarithmic loss. In doing so, we prove that a natural extension of Shannon’s original compression scheme to the two-encoder setting is optimal. Notably, this constitutes the first such solution to the multiterminal source coding problem where finite-alphabet sources are subjected to nontrivial distortion constraints. In addition to the main result, applications to machine learning, estimation, and combinatorics will be discussed.
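Under logarithmic loss, the reproduction of a source symbol is a probability distribution over the source alphabet, and the distortion is the negative log of the probability assigned to the true symbol. The following minimal Python sketch (the function name and example distribution are illustrative, not from the talk) shows how this distortion measure behaves:

```python
import math

def log_loss(x, q):
    """Logarithmic loss: the reproduction q is a probability
    distribution over the source alphabet, and the distortion on
    symbol x is the negative log of the mass q assigns to x."""
    return -math.log(q[x])

# Hypothetical example: a binary source symbol reproduced by a
# "soft" estimate q that concentrates most of its mass on 0.
q = {0: 0.9, 1: 0.1}
print(log_loss(0, q))  # small loss: q nearly agrees with x = 0
print(log_loss(1, q))  # large loss: q assigns little mass to x = 1
```

The expected logarithmic loss is then the cross-entropy between the source distribution and the reproduction, which is what makes this distortion measure tractable and connects it to problems in machine learning and estimation.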
Host: Ashutosh Sabharwal
Thomas Courtade received the B.S. degree in Electrical Engineering from Michigan Technological University in 2007 and the M.S. degree in Electrical Engineering from UCLA in 2008, and is currently pursuing his Ph.D. at UCLA. While at UCLA, he has been the recipient of the UCLA Dean's Fellowship, the UCLA University Fellowship, and the UCLA Dissertation Year Fellowship. He also received an Excellence in Teaching Award from the Department of Electrical Engineering in 2011. His research activities are presently in the area of multiuser information theory, with a particular emphasis on distributed source coding and information exchange.