Date of Award
8-22-2025
Date Published
September 2025
Degree Type
Thesis
Degree Name
Master of Science (MS)
Department
Electrical Engineering and Computer Science
Advisor(s)
Biao Chen
Keywords
Kullback-Leibler Divergence;Supervised dimensionality reduction;Undersampled applications
Abstract
Supervised dimensionality reduction (SDR) is a critical field in machine learning. Compressing high-dimensional data can lead to better data visualization, pattern recognition, and classification performance. To develop SDR algorithms with the widest possible applicability, it is imperative to formulate approaches based on simple, generalizable models. The general Gaussian assumption provides a model that admits closed-form expressions for information-theoretic criteria, which can lead to analytical SDR solutions. Since Gaussian populations are completely defined by their first and second moments, solutions for this model can also yield simple projection schemes with interpretable results that generalize well beyond their assumed framework.

The first part of this thesis develops two linear projection methods that maximize the Kullback-Leibler divergence (KLD) under the general Gaussian model for the binary classification problem. Each method caters to a different parameter regime: one is devised for the case in which the class means dominate the differentiation between the distributions, and the other focuses on the situation where the covariance dissimilarity provides most of the discrimination information.

The second part of this thesis addresses the current limitations of existing SDR methods for binary classification. In particular, the asymmetric treatment of the two classes and the inability to handle undersampled datasets are addressed by two novel algorithms developed under a max-min framework. The first algorithm optimizes over the KLD, balancing the forward and reverse KLD to improve performance. The second algorithm seeks to extremize the class variances and is shown to be well suited to undersampled scenarios.
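As background for the criterion the abstract describes, the sketch below (assumptions, not the thesis's algorithms) evaluates the standard closed-form KLD between two multivariate Gaussian classes after a linear projection x → Ax; the projection matrix A here is an arbitrary illustration, not an optimized solution:

```python
import numpy as np

def gaussian_kld(mu0, S0, mu1, S1):
    """Closed-form KL( N(mu0, S0) || N(mu1, S1) ) for multivariate Gaussians."""
    d = mu0.shape[0]
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0)
                  + diff @ S1_inv @ diff
                  - d
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

def projected_kld(A, mu0, S0, mu1, S1):
    """KLD between the two class distributions after projecting x -> A @ x."""
    return gaussian_kld(A @ mu0, A @ S0 @ A.T, A @ mu1, A @ S1 @ A.T)

# Illustration: project 5-D Gaussian classes onto 2 dimensions with a
# random (not optimized) projection and evaluate the resulting divergence.
rng = np.random.default_rng(0)
mu0, mu1 = np.zeros(5), np.ones(5)
S0 = np.eye(5)
S1 = 2.0 * np.eye(5)
A = rng.standard_normal((2, 5))
print(projected_kld(A, mu0, S0, mu1, S1))
```

A KLD-maximizing method would search over A to make this quantity as large as possible; the closed form is what makes such an analytical search tractable under the Gaussian model.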
Access
Open Access
Recommended Citation
Kortje, Joshua, "SUPERVISED DIMENSIONALITY REDUCTION TECHNIQUES FOR UNDERSAMPLED APPLICATIONS" (2025). Theses - ALL. 990.
https://surface.syr.edu/thesis/990
