Date of Award

6-27-2025

Date Published

August 2025

Degree Type

Thesis

Degree Name

Master of Science (MS)

Department

Civil and Environmental Engineering

Advisor(s)

Elizabeth Carter

Keywords

Edge-AI Computing, Flood Monitoring, Image Co-Registration, Multispectral and Thermal Imaging

Subject Categories

Environmental Sciences | Physical Sciences and Mathematics | Water Resource Management

Abstract

Urban pluvial flooding occurs when local precipitation intensity exceeds the capacity of natural and engineered drainage systems, resulting in dangerous and highly localized inundation that is not captured by traditional stream gaging networks. While satellite imagery and flood maps support regional-scale flood management, a critical gap remains in real-time, spatially continuous, in-situ monitoring of urban surface flooding. To address this gap, we develop the Urban Flood Observation Network (UFO-Net), a low-power, camera-based distributed sensor platform for flood mapping, featuring two co-mounted sensors: a multispectral optical camera (capturing RGB and near-infrared) and a long-wave infrared (thermal) camera. A five-band image (RGB, NIR, LWIR) is constructed by aligning and fusing the optical and thermal images with an optimized registration pipeline implemented on a Raspberry Pi 3B. This iterative method optimizes a similarity transform using the Mattes Mutual Information metric in SimpleITK to find the best alignment between images. This step is vital because a well-calibrated, co-registered flood detection system improves performance, reliability, and utility, enabling precise interpretation and fusion of the data to monitor and manage flood events effectively. Data-driven image co-registration is computationally expensive, increasing data-processing latency and shortening the lifetime of battery-dependent sensor nodes. To reduce the edge-computing cost of camera co-registration, we introduce the concept of a safety margin: a threshold, monitored with a low-power inertial measurement unit (IMU), that defines the maximum allowable camera movement before re-registration is required. This margin is derived empirically by analyzing alignment error (offset and RMSE) under controlled pitch, yaw, and roll variations. The IMU tracks the camera's orientation in real time and triggers transformation updates when motion exceeds the safety margin, minimizing computational load while preserving registration accuracy. Overall, this thesis presents a robust and replicable co-registration framework that enables real-time, edge-based flood monitoring, laying the groundwork for future research on distributed camera networks to enhance urban resilience.
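
A minimal Python sketch of the registration step described above, using a 2D similarity transform optimized against the Mattes Mutual Information metric in SimpleITK. The file names, histogram bins, sampling rate, and optimizer settings are illustrative assumptions, not the thesis's published values.

# Sketch only: aligns a thermal (LWIR) frame to an optical band so the
# five-band stack (RGB, NIR, LWIR) can be fused on-device.
import SimpleITK as sitk

# Fixed image: one band from the multispectral optical camera;
# moving image: the co-mounted LWIR frame. Paths are hypothetical.
fixed = sitk.ReadImage("optical_nir.tif", sitk.sitkFloat32)
moving = sitk.ReadImage("thermal_lwir.tif", sitk.sitkFloat32)

reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
reg.SetMetricSamplingStrategy(reg.RANDOM)
reg.SetMetricSamplingPercentage(0.20)
reg.SetInterpolator(sitk.sitkLinear)
reg.SetOptimizerAsRegularStepGradientDescent(
    learningRate=1.0, minStep=1e-4, numberOfIterations=200)
reg.SetOptimizerScalesFromPhysicalShift()

# Initialize a similarity transform (rotation + isotropic scale + translation)
# from the image geometry, then refine it iteratively.
initial = sitk.CenteredTransformInitializer(
    fixed, moving, sitk.Similarity2DTransform(),
    sitk.CenteredTransformInitializerFilter.GEOMETRY)
reg.SetInitialTransform(initial, inPlace=False)

final_transform = reg.Execute(fixed, moving)

# Resample the thermal frame onto the optical grid so the bands stack pixel-for-pixel.
aligned_lwir = sitk.Resample(moving, fixed, final_transform,
                             sitk.sitkLinear, 0.0, moving.GetPixelID())
sitk.WriteImage(aligned_lwir, "thermal_lwir_aligned.tif")

In the safety-margin scheme the abstract describes, a transform obtained this way would be cached and reapplied to subsequent frames, with the expensive optimization re-run only when IMU-measured pitch, yaw, or roll drift exceeds the empirically derived threshold.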

Access

Open Access
