Document Type

Report

Date

4-11-2011

Keywords

Wyner's common information, lossy source coding, continuous random variables, Gaussian random variables

Language

English

Disciplines

Computer Sciences

Description/Abstract

Wyner’s common information can be easily generalized to continuous random variables. We provide an operational meaning for this generalization using the Gray-Wyner network with lossy source coding. Specifically, a Gray-Wyner network consists of one encoder and two decoders. A sequence of independent copies of a pair of random variables (X, Y) ~ p(x, y) is encoded into three messages, one of which is a common input to both decoders. The two decoders attempt to reconstruct the two sequences, respectively, subject to individual distortion constraints. We show that Wyner’s common information equals the smallest common message rate when the total rate is arbitrarily close to the rate-distortion function with joint decoding. A surprising observation is that this equality holds independently of the values of the distortion constraints, as long as the distortions are below certain thresholds. An interpretation of these thresholds is given for the symmetric case.
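For the Gaussian case mentioned in the keywords, Wyner's common information of a bivariate Gaussian pair with correlation coefficient ρ has the known closed form C(X; Y) = ½ ln((1 + |ρ|)/(1 − |ρ|)) nats, which always dominates the mutual information I(X; Y) = −½ ln(1 − ρ²). A minimal sketch of both quantities (the function names are illustrative, not from the paper):

```python
import math

def wyner_common_information_gaussian(rho):
    """Wyner's common information C(X; Y) in nats for a bivariate
    Gaussian pair with correlation coefficient rho, |rho| < 1.

    Closed form: C(X; Y) = (1/2) * ln((1 + |rho|) / (1 - |rho|)).
    """
    r = abs(rho)
    if r >= 1:
        raise ValueError("correlation must satisfy |rho| < 1")
    return 0.5 * math.log((1 + r) / (1 - r))

def mutual_information_gaussian(rho):
    """Mutual information I(X; Y) in nats for the same Gaussian pair:
    I(X; Y) = -(1/2) * ln(1 - rho**2).
    """
    if abs(rho) >= 1:
        raise ValueError("correlation must satisfy |rho| < 1")
    return -0.5 * math.log(1 - rho ** 2)

# Example: at rho = 0.5, C = (1/2) ln 3, and C - I = ln(1 + |rho|) >= 0,
# consistent with common information never being below mutual information.
```

Here C(X; Y) − I(X; Y) = ln(1 + |ρ|), so the gap vanishes only when X and Y are independent.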

Source

local
