Collaborative Research: SaTC: CORE: Small: SHIELD: Enabling Multi-modal Distributed Learning over Aerial Networks
NSF
About This Grant
Unmanned Aerial Vehicles (UAVs) are becoming increasingly vital for applications such as disaster response, environmental monitoring, infrastructure inspection, and cybersecurity. These airborne platforms can collect diverse types of data in real time, offering valuable input for training high-performance machine learning models. However, conventional machine learning techniques often rely on centralized training paradigms that require transmitting all raw data from UAVs to a central server: an approach that is often impractical due to privacy concerns, limited bandwidth, and latency constraints. To address these challenges, this project introduces a novel distributed learning framework based on federated learning, which enables UAVs to collaboratively train models while keeping their raw data local. In particular, the project's novelties are centered on enabling multi-modal federated learning across UAV networks, where each UAV may observe a distinct combination of data modalities/types (e.g., imagery, environmental readings, or network traffic). This data modality imbalance introduces new challenges in learning coordination, model convergence, and system-level optimization. To this end, the proposed research develops a unified approach that addresses modality imbalance, adversarial threats, and system-level heterogeneity in computation, communication, and storage. The project's broader significance and importance will thus lie in advancing the resilience and reliability of distributed AI systems in airborne platforms, particularly in time-sensitive, resource-constrained, and adversarial environments.
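The collaborative-training-without-raw-data-sharing pattern described above can be illustrated with a minimal federated-averaging (FedAvg) sketch. Everything below is an illustrative assumption, not the project's actual design: the logistic-regression task, the learning rate, the number of UAVs, and the synthetic data all stand in for whatever models and sensor streams the real system would use. Only model weights leave each simulated UAV; raw data stays local.

```python
import numpy as np

def local_update(weights, data, labels, lr=0.1, epochs=5):
    """One UAV's local training: plain logistic-regression gradient
    descent on its private data, which is never transmitted."""
    w = weights.copy()
    for _ in range(epochs):
        preds = 1.0 / (1.0 + np.exp(-data @ w))
        grad = data.T @ (preds - labels) / len(labels)
        w -= lr * grad
    return w

def fedavg(client_updates, client_sizes):
    """Server-side federated averaging: weight each UAV's returned
    model by the size of its local dataset."""
    total = sum(client_sizes)
    return sum(n / total * w for w, n in zip(client_updates, client_sizes))

# Simulate ten rounds with three UAVs, each holding fresh private data.
rng = np.random.default_rng(0)
dim = 4
true_w = np.array([1.0, -1.0, 0.5, 0.0])  # hypothetical ground truth
global_w = np.zeros(dim)
for _ in range(10):
    updates, sizes = [], []
    for _ in range(3):
        X = rng.normal(size=(50, dim))
        y = (X @ true_w > 0).astype(float)
        updates.append(local_update(global_w, X, y))
        sizes.append(len(y))
    global_w = fedavg(updates, sizes)
```

After a few rounds the aggregated model aligns with the underlying signal even though the server never sees any UAV's raw samples, which is the property motivating the federated approach here.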
Beyond its technical impact, the project supports national workforce development by integrating its findings into university curricula, hosting public workshops and seminars, and providing hands-on research opportunities for undergraduate and graduate students in UAV-based sensing and communication, multi-modal federated learning, and cyber-physical system security.

This project pioneers the paradigm of secure multi-modal federated learning over UAV networks. First, it establishes mechanisms to mitigate modality-level heterogeneity in multi-modal federated learning, where UAVs possess different combinations of data types, by tuning local learning rates, optimizing modality scheduling, and designing convergence-aware local model gradient adjustment techniques. Along this direction, the project introduces adaptive resource allocation strategies that account for the storage, computational, and communication disparities across UAVs, including energy-aware multi-modal learning schedules, data migration protocols for non-private data, and fine-grained modality-aware batch size selection. Second, the project introduces modality-aware differential privacy and robust model aggregation schemes to defend UAV-enabled multi-modal federated learning against privacy leakage and model poisoning, incorporating wavelet-based noise calibration and modality-aware attack detection scores. Third, the project proposes jamming-resilient UAV-enabled multi-modal federated learning approaches through channel-hopping and trajectory design strategies that respond to active jamming and passive eavesdropping threats by leveraging UAV mobility and spectral awareness. These three research thrusts collectively form the pillars of the Secure, Heterogeneous, Intelligent, Efficient, and Learning-driven Design (SHIELD) framework for UAV-based federated learning. The research is validated through simulations, a real-world UAV testbed, and collaboration with national labs.
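Two ingredients of the second thrust can be sketched with standard baselines. This is a hypothetical illustration, not the proposal's method: update clipping plus Gaussian noise is the textbook differential-privacy recipe (the proposal's wavelet-based, modality-aware calibration is more sophisticated), and the coordinate-wise median stands in for its attack-detection-based robust aggregation. The noise scale, clip norm, and the single-attacker scenario are all assumptions.

```python
import numpy as np

def clip_update(update, max_norm=1.0):
    """Clip a UAV's model update to bound per-client sensitivity,
    the usual precursor to differentially private aggregation."""
    norm = np.linalg.norm(update)
    return update * min(1.0, max_norm / norm) if norm > 0 else update

def dp_aggregate(updates, sigma, rng):
    """Average clipped updates and add Gaussian noise; sigma would be
    calibrated per data modality in a modality-aware scheme."""
    clipped = np.stack([clip_update(u) for u in updates])
    mean = clipped.mean(axis=0)
    return mean + rng.normal(scale=sigma / len(updates), size=mean.shape)

def robust_median_aggregate(updates):
    """Coordinate-wise median: a simple poisoning-robust aggregator
    used here as a stand-in for attack-score-based defenses."""
    return np.median(np.stack(updates), axis=0)

rng = np.random.default_rng(1)
honest = [rng.normal(0.5, 0.05, size=8) for _ in range(9)]
poisoned = honest + [np.full(8, 100.0)]          # one malicious UAV
robust = robust_median_aggregate(poisoned)        # stays near 0.5
naive = np.mean(np.stack(poisoned), axis=0)       # dragged toward 100
private = dp_aggregate(honest, sigma=0.1, rng=rng)
```

The contrast between `robust` and `naive` shows why plain averaging is fragile under model poisoning: a single malicious update shifts the naive mean by an order of magnitude, while the median aggregate is essentially unaffected.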
This project will enable safe and efficient deployment of distributed multi-modal intelligence over UAV systems, enhancing national capabilities in security, disaster response, and autonomous sensing and surveillance. This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
Up to $600K
2028-09-30