NSF AI Disclosure Required
NSF requires disclosure of AI tool usage in proposal preparation. Ensure you disclose the use of FindGrants' AI drafting in your application.
ERI: Towards Efficient and Robust Federated Neuromorphic Learning in Wireless Edge Networks
NSF
About This Grant
Current distributed learning systems predominantly rely on artificial neural networks, which are generally energy-intensive, an issue exacerbated by the use of ever larger and more advanced models. In contrast, brain-inspired neuromorphic learning algorithms, such as spiking neural networks (SNNs), are renowned for their energy efficiency, making them particularly promising for low-power edge applications. However, research on integrating SNNs with distributed learning remains scarce, and these paradigms are not yet optimized for wireless edge environments.

This project aims to advance the fundamental understanding of distributed SNN learning by addressing several unique and challenging questions: 1) How can distributed learning overcome the constraints of system memory and communication bandwidth, together with the system and adversarial perturbations that arise from channel instability and openness in wireless edge environments? Can distributed SNNs provide advantageous solutions in those cases? 2) How can efficiency, robustness, and utility be jointly optimized for distributed SNN learning over wireless edge environments?

The anticipated outcomes of this project contribute to unleashing the full potential of wireless edge artificial intelligence systems in applications including power systems and environmental monitoring, intelligent healthcare and manufacturing, and collaborative robotics. Additionally, the investigator is committed to immersing graduate, undergraduate, and K-12 students in hands-on neuromorphic edge computing and to strengthening the future workforce in sustainable artificial intelligence.

This project investigates federated SNNs, distributed learning systems in which each edge node trains an SNN locally and exchanges only compact model updates with a server, under severe limits on memory, bandwidth, and link stability. Three integrated objectives structure the work.
(1) During local model training, shrink memory consumption while boosting fault and attack tolerance by quantizing synaptic weights and exploiting the resulting noise within spiking dynamics.

(2) In the model aggregation phase, jointly reduce communication traffic and resist channel errors and attacks by blending update sparsification with quantization.

(3) Offer system designers a multi-objective framework that balances efficiency, robustness, and utility through joint tuning of network architecture and system parameters.

Together, these advances establish a scalable, energy-efficient, and secure foundation for distributed edge intelligence. This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
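Objective (1) pairs weight quantization with spiking dynamics. A minimal sketch of the idea, assuming a leaky integrate-and-fire (LIF) layer with uniformly quantized synaptic weights; all function names, parameter values, and the quantization scheme here are illustrative assumptions, not the project's actual design:

```python
import numpy as np

def quantize_weights(w, n_bits=4):
    """Uniform symmetric quantization of synaptic weights (illustrative);
    the rounding noise it introduces is what the spiking dynamics
    must tolerate during local training."""
    scale = np.abs(w).max() / (2 ** (n_bits - 1) - 1)
    return np.round(w / scale) * scale

def lif_layer(spike_in, w, tau=20.0, v_th=1.0, dt=1.0):
    """Leaky integrate-and-fire layer: membrane potentials integrate the
    weighted input spikes, decay toward rest, and emit binary output
    spikes on threshold crossing, with a hard reset after each spike."""
    T, _ = spike_in.shape
    v = np.zeros(w.shape[1])
    out = np.zeros((T, w.shape[1]))
    for t in range(T):
        v += dt / tau * (-v + spike_in[t] @ w)  # leaky integration
        out[t] = (v >= v_th).astype(float)      # spike where threshold crossed
        v = np.where(out[t] > 0, 0.0, v)        # reset spiking neurons
    return out
```

Because outputs are binary spike trains rather than dense activations, computation is event-driven, which is the source of the energy savings the abstract highlights.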
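Objective (2) blends update sparsification with quantization before transmission. A sketch under illustrative assumptions (top-k magnitude sparsification followed by symmetric uniform quantization; the names and defaults are hypothetical, not the project's method):

```python
import numpy as np

def compress_update(update, k=0.1, n_bits=8):
    """Compress a local model update for federated aggregation:
    keep only the top-k fraction of entries by magnitude (sparsification),
    then uniformly quantize the survivors to n_bits integers."""
    flat = update.ravel()
    n_keep = max(1, int(k * flat.size))
    idx = np.argpartition(np.abs(flat), -n_keep)[-n_keep:]  # top-k indices
    values = flat[idx]
    scale = np.abs(values).max() / (2 ** (n_bits - 1) - 1)  # symmetric range
    q = np.round(values / scale).astype(np.int8)
    return idx, q, scale

def decompress_update(idx, q, scale, shape):
    """Rebuild a dense update from (indices, quantized values, scale);
    the server averages these reconstructions across clients."""
    flat = np.zeros(int(np.prod(shape)))
    flat[idx] = q.astype(np.float64) * scale
    return flat.reshape(shape)
```

With k=0.1 and 8-bit values, each update shrinks to roughly a tenth of its entries at a quarter of the bits each, which is the kind of traffic reduction the aggregation phase targets.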
Up to $200K
2027-09-30