Author

Ge Xu

Date of Award

8-2013

Degree Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

Department

Electrical Engineering and Computer Science

Advisor(s)

Biao Chen

Keywords

Common information, Data reduction, Decentralized inference, Dependent observations, Sufficient statistics

Subject Categories

Electrical and Computer Engineering

Abstract

Wyner's common information was originally defined for a pair of dependent discrete random variables. This thesis generalizes the definition in two directions: the number of dependent variables can be arbitrary, and so can their alphabets. New properties are established for the generalized Wyner's common information of multiple dependent variables. More importantly, a lossy source coding interpretation of Wyner's common information is developed using the Gray-Wyner network. It is established that the common information equals the smallest common message rate for which the total rate is arbitrarily close to the rate-distortion function with joint decoding, provided the distortions lie within some distortion region.
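For reference, Wyner's common information of a pair of discrete random variables is

C(X_1; X_2) = \inf_{W : X_1 - W - X_2} I(X_1, X_2; W),

where the infimum is over auxiliary variables W that render X_1 and X_2 conditionally independent (the Markov chain X_1 - W - X_2). The natural multivariate generalization, sketched here under the assumption that the thesis adopts the analogous formulation, replaces the pairwise Markov constraint with conditional independence of all N variables:

C(X_1, \ldots, X_N) = \inf I(X_1, \ldots, X_N; W),

the infimum now taken over all W given which X_1, \ldots, X_N are conditionally independent.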

The application of Wyner's common information to inference problems is also explored in the thesis. A central question is under what conditions Wyner's common information captures the entire information about the inference object. Under a simple Bayesian model, it is established that, for infinitely exchangeable random variables, the common information is asymptotically equal to the information of the inference object. For finitely exchangeable random variables, connections between the common information and inference performance metrics are also established.
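To make the Bayesian model concrete (the notation here is illustrative and not necessarily that of the thesis): by de Finetti's theorem, an infinitely exchangeable sequence X_1, X_2, \ldots is conditionally i.i.d. given some latent inference object \theta, i.e.,

p(x_1, \ldots, x_n) = \int \prod_{i=1}^{n} p(x_i \mid \theta) \, d\pi(\theta).

Under this reading, the asymptotic result says that C(X_1, \ldots, X_n) approaches the information carried by \theta itself as n grows.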

The problem of decentralized inference is generally intractable with conditionally dependent observations. A promising approach to this problem is to utilize a hierarchical conditional independence model. Using this model, we identify a more general condition under which the distributed detection problem becomes tractable, thereby broadening the classes of distributed detection problems with dependent observations that can be readily solved.
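As a sketch of one common formulation of the hierarchical conditional independence model (the precise conditions used in the thesis may differ): the observations X_1, \ldots, X_N at the sensors are conditionally dependent given the hypothesis H, but there exists a hidden variable Y such that H - Y - (X_1, \ldots, X_N) forms a Markov chain and the observations are conditionally independent given Y:

p(x_1, \ldots, x_N \mid H) = \sum_{y} p(y \mid H) \prod_{i=1}^{N} p(x_i \mid y).

The dependence among the observations is thus routed entirely through Y, so likelihood computations again factorize node by node.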

We then develop the sufficiency principle for data reduction in decentralized inference. For parallel networks, the hierarchical conditional independence model is used to obtain conditions under which local sufficiency implies global sufficiency. For tandem networks, the notion of conditional sufficiency is introduced and the related theory and tools are developed. Connections between the sufficiency principle and distributed source coding problems are also explored. Furthermore, we examine the impact of quantization on decentralized data reduction, and identify conditions under which sufficiency-based data reduction with quantization constraints is optimal. These include the case in which the data at the decentralized nodes are conditionally independent, as well as a class of problems with conditionally dependent observations that admit a conditional independence structure through the hierarchical conditional independence model.
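A minimal illustration of why local sufficiency implies global sufficiency in the conditionally independent case (the thesis's conditions via the hierarchical model generalize this): if the node observations are conditionally independent given \theta and each local statistic T_i(X_i) is sufficient for \theta at node i, then by the factorization theorem p(x_i \mid \theta) = g_i(T_i(x_i), \theta) \, h_i(x_i), and hence

p(x_1, \ldots, x_N \mid \theta) = \prod_{i=1}^{N} g_i(T_i(x_i), \theta) \prod_{i=1}^{N} h_i(x_i),

which factorizes through (T_1(X_1), \ldots, T_N(X_N)); the collection of local statistics is therefore globally sufficient.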

Access

Open Access
