Wyner's common information, Gray-Wyner source coding network, distribution approximation, circularly symmetric binary source
This paper generalizes Wyner's definition of the common information of a pair of random variables to that of N random variables. We prove coding theorems showing that the operational meanings of the common information of two random variables carry over to N random variables. As a byproduct of our proof, we show that the Gray-Wyner source coding network can be generalized to N source sequences with N decoders. We also establish a monotone property of Wyner's common information that contrasts with other notions of common information, specifically Shannon's mutual information and Gács and Körner's common randomness. Examples of computing Wyner's common information for N random variables are also given.
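As background for the two-variable case that this paper generalizes, Wyner's common information of a doubly symmetric binary source with crossover probability p ≤ 1/2 has the closed form C(X;Y) = 1 + h(p) − 2h(a₀), where h is the binary entropy function and a₀ = (1 − √(1 − 2p))/2 is the noise parameter of the optimal common variable W. The sketch below evaluates this formula; the function name is our own, not from the paper.

```python
import math

def h(x: float) -> float:
    """Binary entropy in bits; h(0) = h(1) = 0 by convention."""
    if x <= 0.0 or x >= 1.0:
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def wyner_common_info_dsbs(p: float) -> float:
    """Wyner's common information of a doubly symmetric binary source.

    X = W + N1, Y = W + N2 (mod 2), with W ~ Bern(1/2) and
    N1, N2 ~ Bern(a0) independent, so that P(X != Y) = 2*a0*(1 - a0) = p.
    Then C(X;Y) = I(X,Y; W) = H(X,Y) - H(X,Y|W) = 1 + h(p) - 2*h(a0).
    """
    if not 0.0 <= p <= 0.5:
        raise ValueError("p must lie in [0, 1/2]")
    a0 = (1 - math.sqrt(1 - 2 * p)) / 2
    return 1 + h(p) - 2 * h(a0)
```

The endpoints sanity-check the formula: at p = 0 the sources are identical and C = 1 bit, while at p = 1/2 they are independent and C = 0.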
Liu, Wei and Xu, Ge, "The Common Information for N Dependent Random Variables" (2011). Electrical Engineering and Computer Science Technical Reports. 49.