NSF
This project will establish a unified framework for adaptive sampling that enhances scientific machine learning algorithms. Scientific machine learning has proven to be a transformative force in advancing science and engineering: it blends the predictive capabilities of artificial intelligence (AI) with the precision of scientific models to address complex challenges beyond the reach of traditional numerical methods, enabling breakthroughs in fields as varied as healthcare and infrastructure development. A critical aspect of its importance lies in solving high-dimensional partial differential equations (PDEs), the mathematical models central to describing phenomena such as fluid dynamics, heat transfer, and electromagnetic fields. Traditional numerical methods struggle with the computational complexity of high-dimensional PDEs, but scientific machine learning can dramatically reduce computation time while maintaining accuracy. This capability unlocks advances in engineering design and medical simulation, where such equations are prevalent. By improving the efficiency and affordability of research through better adaptive sampling techniques, scientific machine learning can continue to drive innovation while delivering practical solutions to pressing global issues such as public health and energy, benefiting society at large.

The project also includes a significant educational plan with three major components: (1) developing an introductory course on scientific machine learning; (2) training undergraduate and graduate students in research; and (3) conducting outreach to teach high school students the basics of scientific computing and deep learning. The goal of this project is to establish a unified framework, built on adaptive sampling, that simultaneously optimizes both the training set and the loss of deep-learning-based techniques for solving high-dimensional (parametric) PDEs.
While deep learning has achieved remarkable success in numerous AI applications as a data-driven approach, applying it to solve high-dimensional PDEs introduces an additional challenge: performance deteriorates significantly if the chosen training set does not align well with the properties of the PDE solution. This is analogous to solving a low-regularity problem with a finite element method on a uniform mesh. We must therefore optimize not only the loss function but also the selection of the random samples in the training set; from a numerical perspective, this means balancing the statistical error induced by random sampling against the approximation error induced by the neural network model. Optimizing the selection of random samples requires a generic density model capable of approximating arbitrary distributions and generating samples efficiently, and a deep generative model is a good candidate. In this project, we will further develop two normalizing-flow models: KRnet, suited to distributions with dimensions on the order of 10, and VAE-KRnet, designed for distributions with dimensions on the order of 1000. Using these deep generative models, we will develop adaptive sampling strategies that reduce the statistical error when solving (parametric) PDEs with physics-informed neural networks (PINNs) or the Deep Ritz method. In particular, we will address two important problems in physics and chemistry: the simulation of viscoelastic flow and the approximation of the committor function. Owing to the curse of dimensionality, these problems are traditionally tackled with stochastic approaches; deep learning offers a promising alternative in which the estimated physical quantities do not suffer from stochastic fluctuations. Adaptive sampling, enabled by deep generative models, will play a critical role in the algorithms developed for these problems.
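The adaptive-sampling idea described above can be illustrated with a minimal residual-driven resampling loop. This is a generic sketch, not the project's KRnet-based method: a hypothetical closed-form function peaked at x = 0.5 stands in for the PDE residual of the current network approximation, and a simple importance resampler stands in for the trained normalizing flow.

```python
import numpy as np

rng = np.random.default_rng(0)

def residual(x):
    # Placeholder for the PDE residual magnitude |N[u_theta](x)|.
    # A sharp bump near x = 0.5 mimics a localized solution feature
    # (e.g. a boundary layer) that a uniform training set would miss.
    return np.exp(-200.0 * (x - 0.5) ** 2) + 0.01

def adaptive_resample(n_samples, n_candidates=10_000):
    # Draw uniform candidates, then resample them with probability
    # proportional to the residual. This approximates drawing from a
    # residual-induced density, which is the role a trainable sampler
    # such as KRnet would play in higher dimensions.
    cand = rng.uniform(0.0, 1.0, n_candidates)
    weights = residual(cand)
    probs = weights / weights.sum()
    idx = rng.choice(n_candidates, size=n_samples, replace=True, p=probs)
    return cand[idx]

train_x = adaptive_resample(1000)
# Most new training points concentrate where the residual is large.
frac_near_peak = np.mean(np.abs(train_x - 0.5) < 0.1)
```

In the actual algorithm, the residual would be evaluated on the current PINN (or Deep Ritz) approximation, and the resampler would be a trained KRnet or VAE-KRnet, so that new training points can be generated efficiently from the residual-induced density even when the domain is high-dimensional.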
The educational objectives focus on training young scientists to tackle interdisciplinary problems spanning scientific computing and deep learning. This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
Up to $286K
2028-06-30
Research Infrastructure: National Geophysical Facility (NGF): Advancing Earth Science Capabilities through Innovation - EAR Scope
NSF — up to $26.6M
AmLight: The Next Frontier Towards Discovery in the Americas and Africa
NSF — up to $9M
EPSCoR CREST Phase I: Center for Energy Technologies
NSF — up to $7.5M
CREST Phase II Center for Complex Materials Design
NSF — up to $7.5M
EPSCoR CREST Phase I: Center for Post-Transcriptional Regulation
NSF — up to $7.5M
EPSCoR CREST Phase I: Center for Semiconductors Research
NSF — up to $7.5M