NSF
This project seeks to make it easier for designers to add realistic and expressive touch sensations (i.e., haptic feedback), such as vibrations and textures, to digital experiences including virtual reality, gaming, smartphones, and medical training systems. Currently, creating effective haptic feedback requires specialized knowledge in areas such as human perception, data processing, and haptic hardware, which limits who can design these experiences. This research aims to lower those barriers by developing tools that let designers use everyday language, such as the descriptions "a rough brick wall" or "a gentle heartbeat", to guide the creation and refinement of haptic effects. By simplifying this process, the work could expand the use of virtual touch across many applications and make designing interactive systems accessible to a broader population of designers. The project will also produce open datasets and software tools that can be used by other researchers and developers.

This research will develop language-guided, iterative methods for generating haptic models, such as surface textures and vibrotactile signals, to enable human-AI co-creation of touch experiences. Building on prior work in iterative haptic texture modeling through evolutionary search, the project will integrate natural language processing techniques that allow designers to specify initial physical descriptions (e.g., "a smooth ceramic tile") and provide corrective language feedback (e.g., "make it rougher") to refine the models.
The technical scope includes: (1) creating large, open datasets that pair physical object texture models and vibration signals with natural language annotations describing their properties or intended emotional valence; (2) designing a language-conditioned, interactive, generative framework that combines pretrained language models with haptic texture generation and search; and (3) conducting human-subject experiments to evaluate the efficiency, expressiveness, and perceptual fidelity of the resulting authoring tools. Expected outcomes include new algorithms for language-conditioned haptic content generation, two publicly available datasets to accelerate future research, and empirical insights into language-guided sensory design. This work advances the state of the art in haptics and AI, and lays a foundation for broader applications in human-computer interaction where intuitive and accessible multimodal authoring is critical. This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
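The core loop described above, evolutionary search over haptic texture parameters steered by corrective language feedback, can be sketched as follows. Everything in this sketch is an illustrative assumption rather than the project's actual method: the parameter names (roughness, stiffness, frequency), the FEEDBACK_RULES table standing in for a pretrained language model, and the simple distance-based fitness function are all hypothetical.

```python
import random

# Hypothetical sketch: a texture "model" is a normalized parameter vector,
# and language feedback is mapped to a target shift by a toy rule table
# standing in for a pretrained language model.
FEEDBACK_RULES = {
    "make it rougher":  {"roughness": +0.2},
    "make it smoother": {"roughness": -0.2},
    "softer":           {"stiffness": -0.2},
}

def apply_feedback(target, feedback):
    """Shift the target parameters according to a language feedback phrase."""
    new_target = dict(target)
    for key, delta in FEEDBACK_RULES.get(feedback, {}).items():
        new_target[key] = min(1.0, max(0.0, new_target[key] + delta))
    return new_target

def fitness(candidate, target):
    """Negative squared distance to the (language-derived) target."""
    return -sum((candidate[k] - target[k]) ** 2 for k in target)

def evolve(target, generations=50, pop_size=20, seed=0):
    """One round of evolutionary search toward the current target."""
    rng = random.Random(seed)
    pop = [{k: rng.random() for k in target} for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda c: fitness(c, target), reverse=True)
        parents = pop[: pop_size // 2]  # keep the best half (elitism)
        children = [
            {k: min(1.0, max(0.0, v + rng.gauss(0, 0.05)))  # mutate a parent
             for k, v in rng.choice(parents).items()}
            for _ in range(pop_size - len(parents))
        ]
        pop = parents + children
    return max(pop, key=lambda c: fitness(c, target))

# Initial description sets a starting target; corrective feedback refines it.
target = {"roughness": 0.3, "stiffness": 0.7, "frequency": 0.5}  # e.g. "smooth ceramic tile"
best = evolve(target)
target = apply_feedback(target, "make it rougher")  # iterative refinement step
best = evolve(target, seed=1)
```

In the proposed framework, a pretrained language model would replace the rule table and the generated models would drive real haptic rendering; the sketch only shows how a search loop and a language-derived objective might interleave.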
Up to $850K
2029-09-30