Ambiguous language is a common part of communication: vague words or phrases can be interpreted in multiple ways depending on context. This project addresses how a question-answering system might handle ambiguous questions about images, where it is unclear which part of an image a question refers to. For example, if someone asks “What is the medicine?” while looking at an image showing several pill bottles, a system should identify all relevant parts of the image and provide an answer for each, so that the person receives the full picture and can resolve the ambiguity later. Instead, current visual question answering (VQA) services typically provide one answer per question and do not explain how that answer was chosen, which limits a person’s ability to verify whether the desired interpretation was made. The repercussions of VQA services providing incomplete information can be grave, inflicting adverse personal, social, professional, legal, and financial consequences on VQA service users.

The researchers will develop a socio-technical solution that empowers people to recognize when a question is ambiguous and then resolve the ambiguity. The project introduces the first back-end AI model that can identify every plausible image region a question’s language could refer to, paired with natural-language answers derived from those regions. The project will also establish effective interaction designs within a user-facing tool that empowers people to recognize and resolve focus ambiguity in visual questions. Progress will be measured by evaluating the proposed AI model on a benchmark dataset and by examining real users’ experiences with the model embedded within a larger VQA system. User studies will focus on blind individuals, who are currently the dominant end-users of VQA services.
More generally, project success will benefit all VQA service users, whether visually impaired or sighted. This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
Up to $285K
2027-08-31
AI Institute for Student AI Teaming
NSF — up to $8.0M
GSS: The General Social Survey, Data Platform, Methodological Innovation, and Dissemination Enhancement
NSF — up to $8.0M
ANES: Data Products, Instrumentation, and Methodological Innovations
NSF — up to $5.2M
Institute for Pure and Applied Mathematics
NSF — up to $5.0M
STEM STARs: A Partnership to Build Persistence to Math-Intensive Degrees in Low-Income Students
NSF — up to $5.0M
Support for the International Institute for Applied Systems Analysis (IIASA)
NSF — up to $4.2M