#### Title

Quantized Consensus by the Alternating Direction Method of Multipliers: Algorithms and Applications

#### Date of Award

August 2017

#### Degree Type

Dissertation

#### Degree Name

Doctor of Philosophy (PhD)

#### Department

Electrical Engineering and Computer Science

#### Advisor(s)

Biao Chen

#### Keywords

alternating direction method of multipliers (ADMM), distributed average consensus, distributed detection, finite-bit bounded quantizer, quantized consensus

#### Subject Categories

Engineering

#### Abstract

Collaborative in-network processing is a major tenet in the fields of control, signal processing, information theory, and computer science. Agents operating in a coordinated fashion can achieve greater efficiency and operational capability than agents performing solo missions. In many such applications, the central task is to compute the global average of the agents' data in a distributed manner. Much recent attention has been devoted to quantized consensus, where, due to practical constraints, only quantized communications are allowed between neighboring nodes in order to achieve average consensus. This dissertation aims to develop efficient quantized consensus algorithms based on the alternating direction method of multipliers (ADMM) for networked applications and, in particular, consensus-based detection in large-scale sensor networks.

We study the effects of two commonly used uniform quantization schemes, dithered and deterministic quantization, on an ADMM-based distributed averaging algorithm. With dithered quantization, the algorithm converges linearly in the mean to the desired average with bounded variance. With deterministic quantization, the distributed ADMM either converges to a consensus or, after finitely many iterations, cycles with a finite period. In the cyclic case, the local quantized variables share the same sample mean over one period, so each node can still reach a consensus. We then derive an upper bound on the consensus error that depends only on the quantization resolution and the average degree of the network. Such a bound is preferable in large-scale networks, where the range of the agents' data and the size of the network may be large.
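To convey the flavor of the approach, the following is a minimal sketch only, not the dissertation's exact algorithm: it applies the standard decentralized ADMM updates for quadratic average consensus, and the choice to place a dithered quantizer on the variables each node broadcasts is an assumption made here for illustration.

```python
import numpy as np

def dithered_quantize(x, delta, rng):
    # Unbiased dithered uniform quantizer with resolution delta: E[Q(x)] = x.
    return delta * np.floor(np.asarray(x, float) / delta + rng.random(len(x)))

def admm_average_consensus(r, edges, rho=1.0, delta=1e-3, iters=1000, seed=0):
    # Decentralized ADMM for min sum_i (x_i - r_i)^2 / 2 subject to
    # x_i = x_j on every edge of an undirected connected graph, where
    # nodes exchange only quantized values with their neighbors.
    rng = np.random.default_rng(seed)
    n = len(r)
    nbrs = [[] for _ in range(n)]
    for i, j in edges:
        nbrs[i].append(j)
        nbrs[j].append(i)
    deg = np.array([len(nb) for nb in nbrs], dtype=float)

    r = np.asarray(r, dtype=float)
    x = r.copy()            # primal variables, initialized at local data
    alpha = np.zeros(n)     # aggregated dual variables (their sum stays zero)
    q = dithered_quantize(x, delta, rng)   # quantized values broadcast to neighbors
    for _ in range(iters):
        # Primal update: each node combines its data, its dual, and the
        # quantized values of itself and its neighbors.
        nbr_sum = np.array([sum(q[j] for j in nbrs[i]) for i in range(n)])
        x = (r - alpha + 0.5 * rho * (deg * q + nbr_sum)) / (1.0 + rho * deg)
        # Broadcast the new quantized values, then update the duals with
        # the local disagreement measured on quantized variables.
        q = dithered_quantize(x, delta, rng)
        nbr_sum = np.array([sum(q[j] for j in nbrs[i]) for i in range(n)])
        alpha = alpha + 0.5 * rho * (deg * q - nbr_sum)
    return x
```

As `delta` shrinks this reduces to exact decentralized ADMM, which drives every `x[i]` to the average of `r` at a linear rate; with dithered quantization the iterates hover near the average with a spread governed by the resolution, consistent with the mean-convergence result stated above.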

Noting that existing quantized consensus algorithms, including the two above, require infinite-bit quantizers unless a bound on the agents' data is known a priori, we further develop an ADMM-based quantized consensus algorithm that uses finite-bit bounded quantizers and accommodates possibly unbounded agents' data. With a sufficiently small ADMM step size, this algorithm attains the same consensus result as the unbounded deterministic quantizer. We then apply it to distributed detection in connected sensor networks where each node can exchange information only with its direct neighbors. We establish that, with every node employing an identical one-bit quantizer for local information exchange, our approach achieves the optimal asymptotic performance of centralized detection. This holds under three detection frameworks: the Bayesian criterion, where the maximum a posteriori detector is optimal; the Neyman-Pearson criterion with a constant type-I error constraint; and the Neyman-Pearson criterion with an exponential type-I error constraint. The key to achieving the optimal asymptotic performance is a one-bit deterministic quantizer with a controllable threshold that yields the desired consensus error bounds.
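One plausible shape for a finite-bit bounded quantizer, and for its one-bit special case with a controllable threshold, can be sketched as follows. This is a hypothetical construction for illustration only: the symmetric range `[-x_max, x_max]`, the threshold shift `tau`, and the function name are assumptions, not the dissertation's exact design.

```python
import numpy as np

def bounded_uniform_quantizer(x, n_bits, x_max, tau=0.0):
    # Hypothetical finite-bit bounded quantizer: shift the input by the
    # threshold tau, clip to [-x_max, x_max], round to one of 2**n_bits
    # uniformly spaced levels, then shift back. The output always lies in
    # [tau - x_max, tau + x_max], so only n_bits are needed per message
    # even when the input is unbounded.
    levels = 2 ** n_bits
    step = 2.0 * x_max / (levels - 1)
    xs = np.clip(np.asarray(x, float) - tau, -x_max, x_max)
    return tau - x_max + step * np.round((xs + x_max) / step)
```

With `n_bits=1` there are only two output levels, `tau - x_max` and `tau + x_max`, so the function degenerates into a one-bit deterministic quantizer whose decision threshold `tau` can be tuned, mirroring the role the controllable-threshold one-bit quantizer plays in the detection results above.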

#### Access

Open Access

#### Recommended Citation

Zhu, Shengyu, "Quantized Consensus by the Alternating Direction Method of Multipliers: Algorithms and Applications" (2017). *Dissertations - ALL*. 781.

https://surface.syr.edu/etd/781