Title
Wyner’s Common Information for Continuous Random Variables - A Lossy Source Coding Interpretation
Document Type
Report
Date
4-11-2011
Keywords
Wyner's common information, lossy source coding, continuous random variables, Gaussian random variables
Language
English
Disciplines
Computer Sciences
Description/Abstract
Wyner’s common information can be readily generalized to continuous random variables. We provide an operational meaning for this generalization using the Gray-Wyner network with lossy source coding. Specifically, a Gray-Wyner network consists of one encoder and two decoders. A sequence of independent copies of a pair of random variables (X, Y) ~ p(x, y) is encoded into three messages, one of which is a common input to both decoders. The two decoders attempt to reconstruct the two sequences, respectively, subject to individual distortion constraints. We show that Wyner’s common information equals the smallest common message rate when the total rate is arbitrarily close to the rate-distortion function with joint decoding. A surprising observation is that this equality holds independently of the values of the distortion constraints, as long as the distortions are below certain thresholds. An interpretation of these thresholds is given for the symmetric case.
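For orientation, a brief sketch of the quantity the abstract refers to, stated in standard notation rather than quoted from the report; the Gaussian closed form is the commonly cited value for a bivariate Gaussian pair with correlation coefficient rho:
% Wyner's common information: infimum over auxiliary variables W that make
% X and Y conditionally independent (the Markov chain X - W - Y):
\[
  C(X;Y) \;=\; \inf_{W:\, X - W - Y} I(X,Y;W).
\]
% For a bivariate Gaussian pair with correlation coefficient \rho, the value
% commonly cited in the literature is
\[
  C(X;Y) \;=\; \frac{1}{2}\log\frac{1+\rho}{1-\rho}.
\]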
Recommended Citation
Xu, Ge; Liu, Wei; and Chen, Biao, "Wyner’s Common Information for Continuous Random Variables - A Lossy Source Coding Interpretation" (2011). Electrical Engineering and Computer Science - Technical Reports. 48.
https://surface.syr.edu/eecs_techreports/48