CAREER: Optimism in Causal Reasoning via Information-theoretic Methods
NSF
About This Grant
Reasoning about the causes and effects of phenomena is a fundamental problem in the development of artificial intelligence. Causal reasoning from data also plays a key role in several disciplines, from engineering and computer science to medical research. A formal mathematical theory of probabilistic causation has been developed in the last few decades by Pearl (1995). Several algorithms that illustrate how much qualitative and quantitative causal knowledge can be extracted from data under well-defined assumptions have been proposed within this formalism. These algorithms employ a worst-case view: if the answer to a causal question is not unique, they return that the result is not identifiable. However, such an approach is unsuitable for many real-world systems that violate these crucial assumptions to varying degrees. The investigator argues that it is possible to significantly expand the applicability of causality theory by identifying simple causal explanations in the data that are unlikely to occur by chance. This project will extend the theory of causation to a much wider set of real-world instances by enabling causal reasoning for most models rather than in the worst case.

To expand the scope of the state-of-the-art causal reasoning formalism, the investigator will develop novel algorithms that identify information-theoretically simple explanations of the underlying causal system from data. The first thrust seeks to develop methods to learn causal relations from observational data via an information-theoretic interpretation of Occam's razor based on the entropy of the causal system. A second thrust will analyze how information-theoretically simple explanations can help approximately compute causal effects that are not identifiable in the worst case. A third thrust will leverage the results of the first two thrusts to develop experimental design algorithms for efficiently learning causal structures and causal effects via interventions.
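The abstract describes the first thrust only at a high level. As an illustration of the general idea (not the investigator's actual method), the toy sketch below orients a discrete cause-effect pair by an entropic Occam's razor: for each candidate direction, brute-force search over structural equations of the assumed form effect = f(cause) + noise (mod k) and prefer the direction whose best-fitting residual (the implied exogenous noise) has lower empirical entropy. The data-generating setup, variable names, and the modular-additive model class are all invented for this example.

```python
import itertools
import random
from collections import Counter
from math import log2

def entropy(samples):
    """Empirical Shannon entropy (bits) of a list of discrete samples."""
    n = len(samples)
    return -sum(c / n * log2(c / n) for c in Counter(samples).values())

def min_residual_entropy(cause, effect, k):
    """Min entropy of (effect - f(cause)) mod k over all f: {0..k-1} -> {0..k-1}.

    A low value means the assumed direction admits an information-theoretically
    simple explanation: a deterministic map plus low-entropy exogenous noise.
    """
    best = float("inf")
    for f in itertools.product(range(k), repeat=k):  # k**k candidate functions
        resid = [(y - f[x]) % k for x, y in zip(cause, effect)]
        best = min(best, entropy(resid))
    return best

# Synthetic ground truth: X -> Y with a non-injective map and sparse noise.
random.seed(0)
k = 4
f_true = [0, 0, 2, 2]
xs = [random.randrange(k) for _ in range(2000)]
es = [0 if random.random() < 0.9 else random.randrange(k) for _ in range(2000)]
ys = [(f_true[x] + e) % k for x, e in zip(xs, es)]

h_xy = min_residual_entropy(xs, ys, k)  # hypothesis: X causes Y
h_yx = min_residual_entropy(ys, xs, k)  # hypothesis: Y causes X
print("inferred direction:", "X->Y" if h_xy < h_yx else "Y->X")
```

Because f_true collapses four cause values onto two effect values, no function in the reverse direction can absorb the spread of X given Y, so the residual entropy for the true direction comes out lower. This is only a two-variable caricature; the project concerns general causal systems, where such entropy-minimizing searches are far from brute-forceable.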
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
Up to $207K
2027-12-31