Physics-Informed Neural Networks: Unlocking Next-Gen Scientific Discovery and Real-World AI
Latest 50 papers on physics-informed neural networks: Nov. 30, 2025
Physics-Informed Neural Networks (PINNs) are rapidly evolving, bridging the gap between data-driven machine learning and the immutable laws of physics. By embedding physical equations directly into neural network architectures, PINNs promise to deliver more robust, interpretable, and data-efficient solutions to complex scientific and engineering problems. Recent research showcases a burgeoning field, tackling everything from micro-scale material properties to global climate modeling and even the human body. Let’s dive into some of the latest breakthroughs that are pushing the boundaries of what PINNs can achieve.
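To make the core idea concrete, here is a minimal sketch (not drawn from any specific paper above) of the "physics loss" that defines a PINN: a small tanh network `u(t)` whose derivative is available in closed form, trained against the residual of the toy ODE u'(t) = -u(t) plus a boundary penalty at t = 0. All names here are illustrative.

```python
import math
import random

# Minimal illustrative PINN loss (a sketch, not any paper's method):
# one-hidden-layer network u(t) = sum_i v_i * tanh(w_i * t + b_i), whose
# derivative u'(t) = sum_i v_i * w_i * (1 - tanh(w_i * t + b_i)**2) is
# available analytically, so the ODE residual u' + u can be evaluated exactly.
random.seed(0)
N = 8
w = [random.gauss(0, 1) for _ in range(N)]
b = [random.gauss(0, 1) for _ in range(N)]
v = [random.gauss(0, 1) for _ in range(N)]

def u(t):
    return sum(vi * math.tanh(wi * t + bi) for vi, wi, bi in zip(v, w, b))

def du(t):
    return sum(vi * wi * (1 - math.tanh(wi * t + bi) ** 2)
               for vi, wi, bi in zip(v, w, b))

def physics_loss(ts):
    # Mean squared ODE residual at collocation points, plus a soft penalty
    # enforcing the initial condition u(0) = 1.
    residual = sum((du(t) + u(t)) ** 2 for t in ts) / len(ts)
    boundary = (u(0.0) - 1.0) ** 2
    return residual + boundary

ts = [i / 20 for i in range(21)]
loss = physics_loss(ts)
```

Minimizing this loss over the network parameters is what "embedding physical equations into the architecture" amounts to in the simplest setting; the papers below refine each ingredient (where to place the collocation points, how to weight the terms, how to guarantee conservation).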
The Big Idea(s) & Core Innovations
The overarching theme across recent PINN research is the relentless pursuit of greater accuracy, efficiency, and robustness when modeling complex systems, particularly those governed by Partial Differential Equations (PDEs). A significant hurdle has been the curse of dimensionality, as highlighted by researchers from Université de Lorraine in their paper, “The curse of dimensionality: what lies beyond the capabilities of physics-informed neural networks”, which notes the limitations of PINNs in high-dimensional settings. To overcome this, many innovative strategies are emerging.
One promising avenue involves domain decomposition and adaptive sampling. For instance, Jiaqi Luo et al. from Soochow University in “Efficient Global-Local Fusion Sampling for Physics-Informed Neural Networks” propose a Global–Local Fusion (GLF) sampling strategy that allocates more points to high-residual regions while maintaining broader exploration, boosting accuracy and efficiency. Similarly, Qiumei Huang et al. from Beijing University of Technology in “The modified Physics-Informed Hybrid Parallel Kolmogorov–Arnold and Multilayer Perceptron Architecture with domain decomposition” introduce a hybrid KAN-MLP architecture with overlapping domain decomposition to handle high-frequency and multiscale PDE problems more efficiently. Victor Dolean et al. from Eindhoven University of Technology further demonstrate this in “Neural network-driven domain decomposition for efficient solutions to the Helmholtz equation” with Finite Basis PINNs (FBPINNs) for wave propagation.
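The sampling idea can be sketched in a few lines. This is a simplified schematic in the spirit of global-local fusion, not the exact GLF algorithm: mix uniform "global" points with "local" points drawn where a residual proxy is largest.

```python
import math
import random

# Schematic residual-weighted sampler (a sketch of the global-local fusion
# idea, not the authors' GLF algorithm): half the collocation budget goes to
# uniform exploration, half to candidates with the largest residual.
def fusion_sample(residual, n_total, local_frac=0.5, n_candidates=1000, seed=0):
    rng = random.Random(seed)
    n_local = int(local_frac * n_total)
    # Global exploration: uniform samples over [0, 1].
    global_pts = [rng.random() for _ in range(n_total - n_local)]
    # Local exploitation: keep the candidates with the largest residual proxy.
    candidates = [rng.random() for _ in range(n_candidates)]
    candidates.sort(key=residual, reverse=True)
    return global_pts + candidates[:n_local]

# Toy residual sharply peaked near x = 0.8: local points cluster there.
pts = fusion_sample(lambda x: math.exp(-((x - 0.8) ** 2) / 0.001), 100)
```

The same skeleton accommodates other adaptive schemes: swap the residual proxy for an error indicator per subdomain and you recover the flavor of domain-decomposed training.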
Another critical area of innovation focuses on improving PINN stability, robustness, and physical consistency. Nanxi Chen et al. from Tongji University in “Enforcing hidden physics in physics-informed neural networks” introduce an irreversibility-regularized approach to enforce hidden physical laws as soft constraints, significantly reducing predictive errors. For guaranteeing fundamental conservation laws, Anthony Baez et al. from MIT propose a novel projection method in “Guaranteeing Conservation of Integrals with Projection in Physics-Informed Neural Networks” that explicitly enforces linear and quadratic integral conservation. Obiekev and Oguadime from Oregon State University extend this line of work in “Structure-Preserving Physics-Informed Neural Network for the Korteweg–de Vries (KdV) Equation”, using sinusoidal activations and L-BFGS to maintain Hamiltonian conservation for the KdV equation. The stability of PINNs themselves is a challenge addressed by Thonn Homsnit et al. from Saitama University in “Investigation of PINN Stability and Robustness for the Euler-Bernoulli Beam Problem”, where they identify failure mechanisms and propose a boundary-condition-handling method for improved robustness.
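For the simplest case of a linear integral constraint, the projection idea can be illustrated directly. This is a hedged schematic, not the exact method of the MIT paper: given network outputs on a uniform grid, shift them so the trapezoidal-rule integral matches the conserved value exactly.

```python
# Schematic projection enforcing a linear integral constraint (an
# illustration of the idea, not the exact method of Baez et al.): given
# values u_i on a uniform grid with spacing h, add the constant shift that
# makes the trapezoidal-rule integral equal the conserved quantity Q.
# For a linear constraint this correction is exact, not approximate.
def project_to_integral(u, h, Q):
    weights = [h / 2] + [h] * (len(u) - 2) + [h / 2]  # trapezoidal weights
    current = sum(wi * ui for wi, ui in zip(weights, u))
    total_w = sum(weights)
    shift = (Q - current) / total_w
    return [ui + shift for ui in u]

u_raw = [0.1, 0.4, 0.9, 0.4, 0.1]   # hypothetical network outputs
u_proj = project_to_integral(u_raw, h=0.25, Q=1.0)
```

Applied after each forward pass, such a projection makes conservation a hard guarantee rather than a soft penalty, which is the qualitative distinction the paper draws against residual-only training.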
Addressing the challenge of complex PDE features like sharp gradients, Vikas Dwivedi et al. from INSA, CNRS UMR 5220 in “Kernel-Adaptive PI-ELMs for Forward and Inverse Problems in PDEs with Sharp Gradients” introduce KAPI-ELM, which optimizes RBF kernel distribution via Bayesian optimization. For high-frequency components, J. Zheng et al. from Xiangtan University introduce FG-PINNs in “FG-PINNs: A neural network method for solving nonhomogeneous PDEs with high frequency components”, utilizing dual subnetworks and frequency-guided training.
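The extreme-learning-machine idea underlying KAPI-ELM can be sketched as follows. This is an illustration of the general PI-ELM setup, not the KAPI-ELM algorithm itself: hidden RBF features are fixed at random and only the linear output weights are trained (KAPI-ELM would additionally tune the kernel centers and widths via Bayesian optimization).

```python
import math
import random

# Illustrative ELM-style model (a sketch, not KAPI-ELM): fixed random RBF
# features phi_j(x) = exp(-(x - c_j)^2 / W^2); only the output weights beta
# are trained, here by gradient descent on the convex least-squares loss.
# A real ELM would solve this linear problem directly in one shot.
random.seed(1)
CENTERS = [random.random() for _ in range(10)]
WIDTH = 0.2

def features(x):
    return [math.exp(-((x - c) ** 2) / WIDTH ** 2) for c in CENTERS]

def mse(beta, xs, ys):
    return sum((sum(b * p for b, p in zip(beta, features(x))) - y) ** 2
               for x, y in zip(xs, ys)) / len(xs)

def fit_output_weights(xs, ys, steps=2000, lr=0.02):
    beta = [0.0] * len(CENTERS)
    for _ in range(steps):
        grad = [0.0] * len(beta)
        for x, y in zip(xs, ys):
            phi = features(x)
            err = sum(b * p for b, p in zip(beta, phi)) - y
            for j, p in enumerate(phi):
                grad[j] += 2 * err * p / len(xs)
        beta = [b - lr * g for b, g in zip(beta, grad)]
    return beta

xs = [i / 19 for i in range(20)]
ys = [math.sin(2 * math.pi * x) for x in xs]
beta = fit_output_weights(xs, ys)
```

Because only the output layer is trained, fitting is fast and deterministic; the catch, which KAPI-ELM addresses, is that badly placed kernels cannot resolve sharp gradients, hence the value of optimizing the kernel distribution itself.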
Finally, several papers focus on computational efficiency and practical applicability. Marta Grześkiewicz from University of Cambridge shows in “Solving Heterogeneous Agent Models with Physics-informed Neural Networks” that PINNs can replace traditional grid-based solvers in macroeconomic models for improved scalability. Akshay Sai Banderwaar and Abhishek Gupta from IIT Goa achieve up to 500x speedups for eigenvalue problems with a biconvex reformulation and alternating convex search in “Fast PINN Eigensolvers via Biconvex Reformulation”. For inverse problems, Shota Deguchi and Mitsuteru Asai introduce a framework in “Reliable and efficient inverse analysis using physics-informed neural networks with normalized distance functions and adaptive weight tuning” that uses normalized distance functions and adaptive weight tuning for complex geometries.
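Adaptive weight tuning for multi-term PINN losses follows a common pattern that can be sketched briefly. This is a generic heuristic for illustration, not the specific scheme of Deguchi and Asai: rescale each loss term so its recent magnitude matches the PDE residual term, preventing any one term from dominating training.

```python
# Schematic adaptive loss weighting (a common heuristic, not the specific
# scheme of Deguchi & Asai): scale each term so its weighted magnitude
# matches the residual term's, keeping the loss terms balanced.
def adaptive_weights(loss_values):
    # loss_values: dict mapping term name -> most recent loss magnitude
    ref = loss_values["residual"]
    return {name: ref / max(val, 1e-12) for name, val in loss_values.items()}

w = adaptive_weights({"residual": 0.5, "boundary": 0.01, "data": 2.0})
```

In practice such weights are recomputed every few hundred iterations and smoothed with a running average so that the balance adapts without destabilizing training.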
Under the Hood: Models, Datasets, & Benchmarks
Recent advancements highlight not just new methodologies, but also tailored architectures and tools that enhance PINN performance across diverse applications.
- RRaPINNs: Introduced by Ange-Clément Akazan et al. from University of KwaZulu Natal in “RRaPINNs: Residual Risk-Aware Physics Informed Neural Networks”, this framework incorporates risk-aware optimization like Conditional Value-at-Risk (CVaR) and Mean-Excess penalties to control tail residuals in PDE solutions. Code is available at https://github.com/RRaPINNs and https://github.com/GeorgeEmKarniadakis/deepxde.
- KAPI-ELM: Vikas Dwivedi et al. from INSA, CNRS UMR 5220 developed this Kernel-Adaptive Physics-Informed Extreme Learning Machine in “Kernel-Adaptive PI-ELMs for Forward and Inverse Problems in PDEs with Sharp Gradients” to optimize RBF kernel distribution for sharp gradients in PDEs.
- WbAR: A white-box adversarial attack refinement strategy by Shengzhu Shi et al. from Harbin Institute of Technology in “PINNs Failure Region Localization and Refinement through White-box Adversarial Attack” that precisely localizes and refines PINN failure regions. Code: https://github.com/yaoli90/WbAR.
- FBPINNs: Victor Dolean et al. from Eindhoven University of Technology in “Neural network-driven domain decomposition for efficient solutions to the Helmholtz equation” propose this domain-decomposed PINN extension for Helmholtz equation solutions, showing improved accuracy with Perfectly Matched Layers (PML).
- E-PINNs: Bruno Jacob et al. from Pacific Northwest National Laboratory introduced this framework in “E-PINNs: Epistemic Physics-Informed Neural Networks” for efficient epistemic uncertainty quantification in PINNs using a lightweight ‘epinet’ architecture.
- SinoFlow: Presented by Jinyuxuan Guo et al. from University of California San Diego in “Computed Tomography (CT)-derived Cardiovascular Flow Estimation Using Physics-Informed Neural Networks Improves with Sinogram-based Training: A Simulation Study”, this framework estimates cardiovascular flow from CT data by training PINNs directly on sinograms, avoiding image reconstruction errors.
- HEATNETs: Kyriakos Georgiou et al. from University of Naples ‘Federico II’ introduced these explainable random feature neural networks in “HEATNETs: Explainable Random Feature Neural Networks for High-Dimensional Parabolic PDEs” for solving high-dimensional parabolic PDEs up to 2000 dimensions.
- PINGS-X: For efficient super-resolution of 4D flow MRI data, Sun Jo et al. from Hanyang University developed PINGS-X in “PINGS-X: Physics-Informed Normalized Gaussian Splatting with Axes Alignment for Efficient Super-Resolution of 4D Flow MRI”, leveraging normalized Gaussian splatting and axes-aligned representations. Code: https://github.com/SpatialAILab/PINGS-X.
- SSTODE: Zheng Jiang et al. from Beijing University of Posts and Telecommunications introduced SSTODE in “SSTODE: Ocean-Atmosphere Physics-Informed Neural ODEs for Sea Surface Temperature Prediction”, a physics-informed Neural ODE for sea surface temperature prediction, integrating ocean dynamics. Code: https://github.com/nicezheng/SSTODE-code.
- LieSolver: René P. Klausen et al. from Fraunhofer Heinrich Hertz Institute introduced this PDE-constrained solver in “LieSolver: A PDE-constrained solver for IBVPs using Lie symmetries” to solve initial-boundary value problems (IBVPs) by exactly enforcing Lie symmetries, offering superior performance over PINNs. Code: https://github.com/oduwancheekee/liesolver.
- HPKM-PINN: From Qiumei Huang et al. at Beijing University of Technology, this is a hybrid Kolmogorov–Arnold Network and Multilayer Perceptron architecture with domain decomposition for high-frequency and multiscale PDEs, detailed in “The modified Physics-Informed Hybrid Parallel Kolmogorov–Arnold and Multilayer Perceptron Architecture with domain decomposition”.
- PIELM: Pei-Zhi Zhuang et al. from Shandong University introduced this physics-informed extreme learning machine framework in “A Rapid Physics-Informed Machine Learning Framework Based on Extreme Learning Machine for Inverse Stefan Problems” for efficient inverse Stefan problems, showing significant improvements in accuracy and speed over traditional PINNs.
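Several of the tools above reshape the PINN loss itself. As one concrete illustration, the risk-aware idea behind RRaPINNs, penalizing the tail of the residual distribution via Conditional Value-at-Risk, can be sketched as follows. This is a schematic of the CVaR concept, not the authors' implementation.

```python
# Sketch of a CVaR-style penalty on PDE residuals (an illustration of the
# risk-aware idea behind RRaPINNs, not the authors' code): instead of the
# mean squared residual, penalize the mean of the worst (1 - alpha) fraction
# of squared residuals, focusing training on the tail where failures hide.
def cvar_penalty(residuals, alpha=0.9):
    sq = sorted(r * r for r in residuals)
    k = int(alpha * len(sq))          # index of the alpha-quantile
    tail = sq[k:] or [sq[-1]]         # worst (1 - alpha) fraction
    return sum(tail) / len(tail)

loss = cvar_penalty([0.1, 0.2, 0.05, 1.5, 0.3, 0.02, 0.8, 0.15, 0.4, 0.25])
```

With alpha = 0.9 the penalty above reduces to the mean of the worst 10% of squared residuals, so a single large residual (here 1.5) dominates the loss even when the average residual looks small, which is exactly the failure mode mean-squared training can miss.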
Impact & The Road Ahead
The impact of these advancements is profound, promising to reshape scientific discovery and engineering. PINNs are moving beyond academic benchmarks into critical real-world applications. For instance, Yong-Woon Kim et al. from Jeju National University apply PINNs to real-time gas crossover prediction in PEM electrolyzers in “Physics-Informed Neural Networks for Real-Time Gas Crossover Prediction in PEM Electrolyzers: First Application with Multi-Membrane Validation”, reporting high accuracy and real-time performance for safety-critical hydrogen production. In semiconductor manufacturing, Rudag Uerman et al. from NeuroTechNet S.A.S. use physics-constrained adaptive neural networks in “Physics-Constrained Adaptive Neural Networks Enable Real-Time Semiconductor Manufacturing Optimization with Minimal Training Data”, enabling sub-nanometer precision with 90% fewer samples.
In healthcare, the Physics-Informed Neural Operators (PINOs) of Hannah Lydon et al. from King’s College London, presented in “Physics-Informed Neural Operators for Cardiac Electrophysiology”, offer an 800x speed-up over numerical solvers, enabling zero-shot predictions for cardiac dynamics. The medical imaging field also sees innovation with SinoFlow for CT-derived cardiovascular flow estimation, as presented by Jinyuxuan Guo et al. from University of California San Diego in “Computed Tomography (CT)-derived Cardiovascular Flow Estimation Using Physics-Informed Neural Networks Improves with Sinogram-based Training: A Simulation Study”. PINNs are even being applied to model predictive control of articulated soft robots, as discussed in “Generalizable and Fast Surrogates: Model Predictive Control of Articulated Soft Robots using Physics-Informed Neural Networks”, demonstrating their versatility.
The theoretical foundations are also strengthening, with papers like “Regularity and error estimates in physics-informed neural networks for the Kuramoto-Sivashinsky equation” by Mohammad Mahabubur Rahman and Deepanshu Verma, which establishes rigorous error estimates for complex PDEs. The newly introduced Physics-Informed Log Evidence (PILE) score by Mara Daniels et al. from MIT in “Uncertainty-Aware Diagnostics for Physics-Informed Machine Learning” provides a crucial uncertainty-aware metric for model selection and hyperparameter optimization, even in data-free scenarios.
The road ahead involves further enhancing robustness, tackling the curse of dimensionality more comprehensively, and developing standardized metrics and benchmarks. The trend towards hybrid architectures, adaptive strategies, and robust uncertainty quantification will continue to drive PINN capabilities. As highlighted in “Physics-Informed Neural Networks and Neural Operators for Parametric PDEs: A Human-AI Collaborative Analysis” by Zhuo Zhang et al. from National University of Defense Technology, human-AI collaboration will be pivotal in navigating these complex challenges. The continued integration of domain-specific knowledge with cutting-edge machine learning promises a future where complex physical phenomena are not just simulated, but understood and predicted with unprecedented accuracy and efficiency.