PhD Dissertation Defense: Chiao-Yi Wang

Thursday, April 9, 2026
10:00 a.m.
AJC 4104 (4th floor conference room)
Debbie Chu
301 405 8268
dgchu@umd.edu


Title: Deep Learning and Computational Methods for Automated Biomedical Imaging, Motion Analysis, and Precision Aquaculture

Committee members:
Dr. Yang Tao, Chair
Dr. Li-Qun Zhang
Dr. Giuliano Scarcelli
Dr. Osamah Saeedi
Dr. Miao Yu, Dean's Representative

Abstract:

Medical care and aquaculture are essential to human well-being and economic sustainability, yet both domains rely heavily on labor-intensive processes that limit efficiency and scalability. In healthcare, the demand for precise and automated analysis has become increasingly critical, particularly highlighted by resource constraints during the COVID-19 pandemic. In aquaculture, labor shortages and inefficient harvesting practices continue to hinder productivity and increase operational costs. These challenges motivate the development of advanced, data-driven automation frameworks.


This dissertation presents a series of computer vision and deep learning–based methods designed to address key challenges in biomedical imaging, human motion analysis, and precision aquaculture. First, to enable accurate quantification of capillary-level retinal blood flow, we introduce MEMO, the first multimodal retinal image dataset combining erythrocyte-mediated angiography (EMA) and optical coherence tomography angiography (OCTA). To address the unique challenge of large vessel density discrepancies across modalities, we propose a robust segmentation-based registration framework that achieves accurate multimodal alignment with minimal annotation requirements. Building upon this, we develop EMTrack, an automated erythrocyte detection and tracking framework that enables reliable blood flow quantification by incorporating flow-aware detection and topology-aware tracking strategies.

Second, we present EgoFall, a real-time, privacy-preserving fall risk assessment system using a single chest-mounted camera. By leveraging a lightweight CNN-Transformer architecture and a carefully designed motion representation pipeline, the system accurately classifies multiple walking instability patterns while maintaining low computational complexity. The proposed EgoWalk dataset further supports the development and evaluation of such wearable vision-based healthcare systems.

Finally, we introduce ShellCollect, a smart precision aquaculture framework for optimizing shellfish harvesting. By formulating harvesting as a data collection path planning problem, we propose a Variable Neighborhood Search–based solution with an efficient merge-and-filtering strategy. We also develop ShellMapSim, a dedicated simulation platform, and validate the proposed approach through both simulation and real-world field experiments.

Collectively, this dissertation demonstrates that integrating computer vision and deep learning with domain-specific system design enables robust, data-efficient, and practical automation solutions. These contributions advance automated analysis in biomedical imaging and precision aquaculture, and illustrate the potential of intelligent automation for real-world applications.
