Date of Award

12-24-2025

Date Published

January 2026

Degree Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

Department

Electrical Engineering and Computer Science

Advisor(s)

Chilukuri Mohan

Keywords

Bioinformatics; Evolutionary Algorithms; Gene Regulatory Network; Sparse Network Inference

Subject Categories

Computer Sciences | Physical Sciences and Mathematics

Abstract

This thesis explores sparse network inference from high-dimensional, noisy, and underdetermined data, a fundamental challenge across many scientific domains. We develop evolutionary computation methods for discovering underlying network structures that are both interpretable and biologically plausible, and apply them to Gene Regulatory Network (GRN) inference, where sparsity, indirect interactions, and limited observations pose significant hurdles. The framework models both steady-state and time-series gene expression data, with particular emphasis on biological sparsity and regulatory dynamics. We approach the problem from three perspectives: (1) edge-level analysis using transitive reduction to distinguish direct from indirect regulation; (2) differential equation modeling to learn sparse adjacency matrices; and (3) temporal modeling via simple and damped harmonic motion to capture oscillatory behavior in regulatory updates. Within this framework, we introduce the Veto Inhibition Theory, which posits that strong self-inhibitory mechanisms underlie many of the observed periodic patterns: each activation (up) is followed by an immediate suppression (down), enforcing rhythmic balance in gene expression. The framework incorporates both single- and multi-objective optimization using Evolution Strategies, Genetic Algorithms, and NSGA-II, equipped with custom recombination and mutation operators. We demonstrate improvements of up to 17% over leading methods on benchmark datasets from the DREAM3 and DREAM4 challenges, and achieve high accuracy on a real Drosophila dataset with over 5,000 genes and sparse temporal measurements. Finally, we study the structural evolution of sparse matrices during optimization, identifying patterns such as discontinuous sparsity shifts and non-monotonic quality transitions. These insights offer broader contributions to the understanding of sparse network learning in complex systems.
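To illustrate the first perspective mentioned above, the following is a minimal Python sketch of transitive reduction applied to a thresholded, inferred GRN adjacency matrix. It is not the thesis's actual method; the confidence threshold, the toy 4-gene weight matrix, and the function names are illustrative assumptions, and the reduction is exact only for acyclic networks (a pruning heuristic otherwise).

    # Minimal sketch (not the thesis's method): prune edges that can be
    # explained as indirect regulation via transitive reduction.
    import numpy as np

    def transitive_closure(adj):
        """Boolean reachability matrix via a Floyd-Warshall-style closure."""
        n = adj.shape[0]
        reach = adj.astype(bool).copy()
        for k in range(n):
            reach |= np.outer(reach[:, k], reach[k, :])
        return reach

    def transitive_reduction(adj):
        """Drop edge u->v if v is also reachable from another direct target w
        of u, i.e. the u->v link is explainable as indirect regulation."""
        direct = adj.astype(bool)
        reach = transitive_closure(adj)
        reduced = direct.copy()
        n = adj.shape[0]
        for u in range(n):
            for v in range(n):
                if not direct[u, v]:
                    continue
                for w in range(n):
                    if w != u and w != v and direct[u, w] and reach[w, v]:
                        reduced[u, v] = False
                        break
        return reduced

    # Toy example (illustrative values): gene 0 regulates 1, 1 regulates 2,
    # and a weak inferred edge 0->2 is flagged as indirect and removed.
    weights = np.array([[0.0, 0.9, 0.3, 0.0],
                        [0.0, 0.0, 0.8, 0.0],
                        [0.0, 0.0, 0.0, 0.7],
                        [0.0, 0.0, 0.0, 0.0]])
    adj = weights > 0.2  # assumed confidence threshold
    print(transitive_reduction(adj).astype(int))

On this toy input the weak 0->2 edge is removed because gene 2 remains reachable from gene 0 through gene 1, leaving only the direct chain 0->1->2->3.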

Access

Open Access
