
Collaborative Research: Endowing Aerial Robots with Ground Mobility and Multimodal Perception for Autonomous Roof Inspection

NSF


About This Grant

This project addresses urgent safety, efficiency, and cost challenges in the roof inspection industry by developing a robotic platform that combines the mobility of drones with the terrain adaptability of legged robots. Roof inspection remains one of the most hazardous construction tasks in the United States, with a significant portion of injuries and fatalities attributed to falls and unstable surfaces. By equipping aerial robots with legged mobility and advanced perception capabilities, this project enables safe, detailed inspection of sloped roofs, reducing risk to human inspectors while improving access and inspection quality in challenging environments. The hybrid aerial-hexapod robot autonomously conducts detailed inspections by integrating visual, tactile, and light detection and ranging (LiDAR) data to detect structural anomalies, surface degradation, and moisture intrusion. The resulting robot can seamlessly switch between flight mode and legged mode to navigate multi-layered and irregular roof structures, supporting scalable and task-specific operations. The project also offers impactful educational and outreach opportunities, including summer STEM workshops for K-12 students and teachers, as well as open-access datasets for robotics and artificial intelligence education. The research team collaborates with industry partners to ensure the system addresses real-world operational needs and facilitates technology transfer.

This research addresses the scientific challenge of enabling detailed, autonomous roof inspection using a hybrid robotic platform capable of operating both in flight and on the ground. The project’s goals are threefold: (1) to develop an integrated robot with dual-mode mobility and multimodal perception capabilities; (2) to design algorithms that interpret sensory data in real time for autonomous navigation and condition assessment; and (3) to validate the system’s performance through extensive experimental evaluation in both laboratory and real-world settings. To achieve these goals, the research team designs a lightweight legged mobility system that attaches to a quadrotor platform, enabling the robot to transition seamlessly between flight and stable ground locomotion. A modular sensor suite (an RGB-D camera, a LiDAR scanner, and footpad-embedded tactile sensors) is developed to enable multimodal perception. Each sensing modality is selectively activated based on the complexity of the inspection task, enabling energy-efficient operation across diverse inspection scenarios. The team develops artificial intelligence (AI)-based algorithms to fuse data across modalities, build a unified representation of the inspection environment, and extract high-level semantic and geometric features for roof condition assessment. The research further explores intelligent control strategies that leverage these features for real-time decision-making, along with affordance-driven control, to enable safe and efficient navigation and inspection on complex roof structures. Experimental evaluation follows a multi-phase strategy that includes high-fidelity simulations, controlled laboratory tests, and field deployments on residential and commercial roof structures. Finally, the project aims to advance foundational knowledge in robotics by pushing the state of the art in sensor fusion, multimodal perception, and robotic mobility. This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
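The selective sensor-activation idea above can be illustrated with a minimal sketch: each modality is powered on only when the inspection task's complexity warrants it, conserving energy on simple surveys. All names, thresholds, and complexity levels here are illustrative assumptions, not the project's actual implementation.

```python
# Hypothetical sketch: activate sensing modalities by task complexity.
# Names and thresholds are illustrative, not from the funded project.
from enum import IntEnum

class Complexity(IntEnum):
    LOW = 1      # e.g., flat, single-layer roof survey
    MEDIUM = 2   # e.g., sloped roof needing geometric mapping
    HIGH = 3     # e.g., multi-layered, irregular structure in legged mode

# Minimum task complexity at which each modality is switched on.
ACTIVATION_THRESHOLDS = {
    "rgbd_camera": Complexity.LOW,        # baseline visual inspection
    "lidar": Complexity.MEDIUM,           # geometry of sloped/irregular roofs
    "tactile_footpads": Complexity.HIGH,  # contact sensing during locomotion
}

def active_modalities(task: Complexity) -> list[str]:
    """Return the sensing modalities to power on for a given task."""
    return [name for name, threshold in ACTIVATION_THRESHOLDS.items()
            if task >= threshold]
```

For a low-complexity survey only the camera would run, while a high-complexity task would enable all three modalities; the actual system would presumably make this decision online from its own task assessment rather than from a fixed table.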

Focus Areas

Education

Eligibility

University · Nonprofit · Small business


Funding Range

Up to $350K

Deadline

2028-09-30

Complexity
Medium
