Physics-Informed Neural Networks: A New Frontier in Scientific Discovery and Engineering
Latest 50 papers on physics-informed neural networks: Oct. 20, 2025
Physics-Informed Neural Networks (PINNs) are rapidly transforming how we approach complex scientific and engineering problems. By embedding the fundamental laws of physics directly into neural network architectures, PINNs promise to deliver more robust, interpretable, and data-efficient models. This burgeoning field is seeing an explosion of innovation, addressing critical challenges from enhancing PDE solvers and refining scientific discovery to improving real-world applications in medicine, climate science, and advanced engineering.
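The core mechanism is compact enough to sketch: the training loss combines a data-misfit term with the residual of the governing equation evaluated at collocation points. The sketch below is purely illustrative and not drawn from any paper discussed here; it assumes a 1D Poisson problem on a uniform grid, with finite differences standing in for the automatic differentiation a real PINN would use.

```python
import numpy as np

def pde_residual(u, x, f):
    """Residual of the 1D Poisson equation u''(x) = f(x), with u''
    approximated by central finite differences (x must be a uniform grid)."""
    h = x[1] - x[0]
    u_xx = (u(x + h) - 2.0 * u(x) + u(x - h)) / h**2
    return u_xx - f(x)

def pinn_loss(u, x_data, y_data, x_colloc, f, lam=1.0):
    """Composite PINN loss: data misfit plus physics-residual penalty."""
    data_loss = np.mean((u(x_data) - y_data) ** 2)
    physics_loss = np.mean(pde_residual(u, x_colloc, f) ** 2)
    return data_loss + lam * physics_loss

# For u'' = -sin(x), the exact solution is u(x) = sin(x).
f = lambda x: -np.sin(x)
x_c = np.linspace(0.1, np.pi - 0.1, 50)   # collocation points
exact = np.sin
wrong = lambda x: x * (np.pi - x)         # matches the BCs, violates the PDE

x_d = np.array([0.0, np.pi / 2, np.pi])   # sparse "measurements"
y_d = np.sin(x_d)

# The exact solution incurs near-zero loss; the wrong candidate does not,
# even though it fits the boundary data points reasonably well.
print(pinn_loss(exact, x_d, y_d, x_c, f) < pinn_loss(wrong, x_d, y_d, x_c, f))
```

In a real PINN, `u` is a neural network and the derivatives in `pde_residual` come from automatic differentiation, but the structure of the loss is exactly this two-term sum.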
The Big Idea(s) & Core Innovations
The core innovation across recent research lies in making PINNs more accurate, efficient, and applicable to a wider array of real-world scenarios. A significant theme is the hybridization of PINNs with classical numerical methods and advanced network architectures. For instance, Hélène Barucq et al. from Université de Strasbourg, CNRS, and Inria in their paper, “Enriching continuous Lagrange finite element approximation spaces using neural networks”, demonstrate that enriching Finite Element Method (FEM) spaces with PINN predictions leads to faster and more accurate PDE solutions, especially for parametric problems. This synergistic approach allows for coarser meshes, drastically cutting computational time.
Another major thrust is improving PINN training stability and efficiency. The paper “AutoBalance: An Automatic Balancing Framework for Training Physics-Informed Neural Networks” by Kang An et al. from Rice University and The Chinese University of Hong Kong, Shenzhen, introduces a novel ‘post-combine’ approach that uses independent optimizers per loss component, effectively mitigating gradient conflicts and enhancing stability. Similarly, Sifan Wang et al. from Yale University and the University of Pennsylvania in “Gradient Alignment in Physics-informed Neural Networks: A Second-Order Optimization Perspective” show that second-order optimizers like SOAP can significantly reduce directional gradient conflicts, leading to state-of-the-art results on challenging PDE benchmarks, including turbulent flows. Further pushing the boundaries of efficiency, Chinmay Datar et al. from the Technical University of Munich and Delft University of Technology present “Fast training of accurate physics-informed neural networks without gradient descent”, introducing Frozen-PINN, a method that achieves up to 100,000x faster training times without gradient descent or GPUs, by leveraging space-time separation and random features.
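To see why gradient conflicts arise and how a "post-combine" style update can help, consider a toy problem with two loss terms of very different scales, mimicking the data-vs-physics imbalance common in PINN training. The sketch below is a loose illustration of the idea of computing an independent update per loss component before combining them (here via simple per-term gradient normalization); it is not AutoBalance's actual algorithm, and the quadratic losses and step rule are assumptions.

```python
import numpy as np

def grad(loss, theta, eps=1e-6):
    """Central-difference gradient of a scalar loss at theta."""
    g = np.zeros_like(theta)
    for i in range(theta.size):
        e = np.zeros_like(theta)
        e[i] = eps
        g[i] = (loss(theta + e) - loss(theta - e)) / (2 * eps)
    return g

# Two loss components with wildly different scales: summing them first
# would let the large term dominate any single shared learning rate.
loss_data    = lambda th: 1e4  * (th[0] - 1.0) ** 2
loss_physics = lambda th: 1e-2 * (th[1] - 2.0) ** 2

def post_combine_step(theta, losses, lr=0.1):
    """Each loss term gets its own normalized update; the per-term
    updates are combined *after* being computed independently."""
    update = np.zeros_like(theta)
    for L in losses:
        g = grad(L, theta)
        update += lr * g / (np.linalg.norm(g) + 1e-12)
    return theta - update

theta = np.zeros(2)
for _ in range(200):
    theta = post_combine_step(theta, [loss_data, loss_physics])
print(theta)  # both parameters approach their targets (1 and 2)
```

With a single shared optimizer, a learning rate small enough for the 1e4-scale term would leave the 1e-2-scale term nearly frozen; giving each term its own update sidesteps that trade-off, which is the intuition the per-loss-optimizer approaches formalize.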
Generalization and interpretability are also key areas of focus. Matteo Scialpi et al. from Università di Ferrara in “APRIL: Auxiliary Physically-Redundant Information in Loss – A physics-informed framework for parameter estimation with a gravitational-wave case study” enhance parameter estimation by incorporating known physical relations into the loss function, demonstrating significant accuracy improvements for gravitational wave analysis. For scientific discovery, Yunpeng Gong et al. from Xiamen University propose “StruSR: Structure-Aware Symbolic Regression with Physics-Informed Taylor Guidance”, which uses PINN-derived Taylor expansions to guide symbolic regression, yielding interpretable and physically consistent mathematical expressions.
In the realm of real-world applications, PINNs are showing remarkable versatility. Kayode Olumoyin and Katarzyna Rejniak from H. Lee Moffitt Cancer Center, in “Modeling Adoptive Cell Therapy in Bladder Cancer from Sparse Biological Data using PINNs”, use PINNs to capture unmodeled treatment effects from limited biological data. Similarly, P. Rothenbeck et al. from the University of Cologne apply PINNs in “Modeling COVID-19 Dynamics in German States Using Physics-Informed Neural Networks”, estimating epidemiological parameters and revealing the impact of vaccination and regional policies. The emergence of AI-driven automation for PINN design is also noteworthy, with Xin He et al. from A*STAR and Hong Kong Baptist University presenting “Lang-PINN: From Language to Physics-Informed Neural Networks via a Multi-Agent Framework”, an LLM-driven system that automates PINN generation from natural language descriptions.
Under the Hood: Models, Datasets, & Benchmarks
Recent advancements in PINNs are underpinned by innovative models, specialized datasets, and rigorous benchmarking protocols:
- Hybrid Frameworks: The integration of PINNs with other methods, such as Finite Element Methods (FEM) as seen in “Enriching continuous Lagrange finite element approximation spaces using neural networks”, or with DeepONet as in Kamaljyoti Nath et al.’s “A Digital Twin for Diesel Engines: Operator-infused Physics-Informed Neural Networks with Transfer Learning for Engine Health Monitoring”, creates powerful hybrid solvers. The latter PINN-DeepONet framework, coupled with multi-stage and few-shot transfer learning, significantly boosts efficiency for real-time applications.
- Novel Architectures: Papers introduce new architectures tailored for specific challenges. Kürsat Tekbıyık and Anil Gurses from Bilkent University propose PIKAN in “PIKAN: Physics-Inspired Kolmogorov-Arnold Networks for Explainable UAV Channel Modelling”, combining Kolmogorov-Arnold Networks with physical principles for explainable UAV channel modeling. Similarly, Junyi Wu and Guang Lin from Purdue University developed PO-CKAN in “PO-CKAN: Physics Informed Deep Operator Kolmogorov Arnold Networks with Chunk Rational Structure”, which uses rational KAN modules to efficiently solve PDEs, reducing L2 error by ~48% on Burgers’ equation compared to PI-DeepONet.
- Adaptive Strategies: Dynamic adjustment mechanisms are crucial for complex problems. “AB-PINNS: Adaptive-Basis Physics-Informed Neural Networks for Residual-Driven Domain Decomposition” by Jonah Botvinick-Greenhouse et al. from Cornell University and MERL dynamically adapts subdomains based on solution features, improving convergence for multiscale PDEs. PACMANN, by Coen Visser et al. from Delft University of Technology (“PACMANN: Point Adaptive Collocation Method for Artificial Neural Networks”), adaptively moves collocation points along residual gradients, yielding state-of-the-art accuracy/efficiency tradeoffs in high-dimensional settings. For loss weighting, Wenqian Chen et al. from Pacific Northwest National Laboratory introduce “Self-adaptive weights based on balanced residual decay rate for physics-informed neural networks and deep operator networks”, addressing the problem of uneven residual decay across training points.
- Specialized Models: “SeqBattNet: A Discrete-State Physics-Informed Neural Network with Aging Adaptation for Battery Modeling” by Khoa Tran et al. from AIWARE Limited Company utilizes a discrete-state PINN for accurate battery voltage prediction with aging adaptation. For quantum systems, Antonin Sulc from Lawrence Berkeley National Lab in “Quantum Noise Tomography with Physics-Informed Neural Networks” leverages the Lindblad master equation to characterize quantum noise from sparse data.
- Robustness Benchmarking: Several papers focus on rigorous evaluation. Jiakang Chen from University College London (UCL), in “Neural PDE Solvers with Physics Constraints: A Comparative Study of PINNs, DRM, and WANs”, compares PINNs, the Deep Ritz Method (DRM), and Weak Adversarial Networks (WANs) on Poisson and Schrödinger equations. O. Rincón-Cárdeno et al. from Universidad EAFIT conduct a “Comparative Analysis of Wave Scattering Numerical Modeling Using the Boundary Element Method and Physics-Informed Neural Networks” for Helmholtz equations, highlighting the trade-offs between PINNs and BEM.
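The residual-driven sampling idea shared by several of the adaptive strategies above can be isolated in a few lines: collocation points are moved uphill on the squared residual so that they concentrate where the PDE is violated most. The residual field and update rule below are toy assumptions for illustration, not the PACMANN method itself.

```python
import numpy as np

def residual_sq(x):
    """Toy squared PDE residual with a sharp bump at x = 0.7,
    standing in for a trained network's residual field."""
    return np.exp(-200.0 * (x - 0.7) ** 2)

def move_collocation_points(x, step=0.002, eps=1e-4, iters=100):
    """Gradient-ascent update: nudge each point uphill on the squared
    residual (gradient estimated by central differences), clipped to
    stay inside the domain [0, 1]."""
    for _ in range(iters):
        g = (residual_sq(x + eps) - residual_sq(x - eps)) / (2 * eps)
        x = np.clip(x + step * g, 0.0, 1.0)
    return x

x0 = np.linspace(0.0, 1.0, 21)   # uniform initial collocation grid
x1 = move_collocation_points(x0)

# After the updates, the mean residual over the moved points exceeds
# that of the uniform grid: points have clustered in the region where
# the (toy) PDE residual is largest.
print(residual_sq(x1).mean() > residual_sq(x0).mean())
```

In an actual adaptive-collocation method the residual comes from the current network and points are re-sampled or moved periodically during training; this sketch only shows the geometric mechanism that makes such schemes effective on localized features.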
Several code repositories are publicly available, encouraging further exploration: APRIL, Neural-Network-Based-PDE-Solver, Augmented-data-and-neural-networks-for-epidemic-forecasting, mass_conservation_on_rails, Neuroevolution-of-PINNs, GatedX-TFC, NLS-SVMs, differentiable-soundfield-reconstruction, PIELM-for-Option-Pricing-66CB, RF_PDE, PACMANN, LocalCP4PINN, pinn_adaptive_weighting, ET-PINN, deepxde, glucose-monitoring-pinns, and PINNs-Based-MPC-for-SIR-Epidemics.
Impact & The Road Ahead
The impact of these advancements is profound, offering a future where complex scientific problems can be tackled with unprecedented accuracy and efficiency. By bridging the gap between data-driven machine learning and established physical laws, PINNs are enabling interpretable AI for scientific discovery, allowing researchers to not only predict but also understand underlying mechanisms. This is particularly evident in “Unified Spatiotemporal Physics-Informed Learning (USPIL): A Framework for Modeling Complex Predator-Prey Dynamics” by Julian Evan Chrisnanto et al. from Universitas Padjadjaran, which provides a unified solution for ODEs and PDEs in ecological modeling with significant speedups and mechanistic insights.
In practical applications, these innovations are paving the way for advanced digital twins, real-time control systems, and robust medical diagnostics. “Data-Driven Adaptive PID Control Based on Physics-Informed Neural Networks” illustrates improved adaptive PID control for dynamic environments, while Riyaadh Gani from University College London explores “Physics-Informed Neural Networks vs. Physics Models for Non-Invasive Glucose Monitoring”, emphasizing the power of physics-engineered features.
Future directions include further enhancing robustness to noisy data, as highlighted by Aleksandra Jekica et al. in “Examining the robustness of Physics-Informed Neural Networks to noise for Inverse Problems”, and improving uncertainty quantification with frameworks like “A Conformal Prediction Framework for Uncertainty Quantification in Physics-Informed Neural Networks” by Yifan Yu et al. from National University of Singapore. Evolutionary approaches, surveyed in “Evolutionary Optimization of Physics-Informed Neural Networks: Evo-PINN Frontiers and Opportunities”, promise automated architecture and hyperparameter tuning to overcome current training limitations. Furthermore, integrating PINNs with large language models, as demonstrated by Lang-PINN, signifies a major step towards automating scientific modeling workflows. The burgeoning field of “Physics-Informed Machine Learning in Biomedical Science and Engineering”, reviewed by Nazanin Ahmadi et al. from Brown University, points to critical applications in biomechanics, pharmacokinetics, and medical imaging.
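For context on the uncertainty-quantification direction, the split-conformal recipe underlying conformal prediction is simple to sketch generically: score a held-out calibration set by how far predictions miss, then widen every prediction by the empirical quantile of those scores. The model, data, and coverage level below are illustrative assumptions, not the cited framework's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def predict(x):
    """Stand-in point predictor (imagine a trained PINN here)."""
    return np.sin(x)

# Held-out calibration data: noisy observations of the true signal.
x_cal = rng.uniform(0, np.pi, 200)
y_cal = np.sin(x_cal) + rng.normal(0, 0.1, x_cal.size)

# Split-conformal calibration: nonconformity score = absolute error,
# widened to the finite-sample-corrected (1 - alpha) quantile.
scores = np.abs(y_cal - predict(x_cal))
alpha = 0.1
n = scores.size
q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n)

# Resulting intervals carry a ~90% marginal coverage guarantee,
# regardless of how the underlying predictor was trained.
x_test = rng.uniform(0, np.pi, 1000)
y_test = np.sin(x_test) + rng.normal(0, 0.1, x_test.size)
lower, upper = predict(x_test) - q, predict(x_test) + q
coverage = np.mean((y_test >= lower) & (y_test <= upper))
print(coverage)  # empirically close to the 0.9 target
```

The appeal for PINNs is that this wrapper is model-agnostic: it needs only a calibration set and a score function, so it can sit on top of any of the solvers surveyed here.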
These papers collectively paint a picture of a vibrant, rapidly evolving field where physics and AI are converging to unlock solutions to some of humanity’s most challenging problems. The journey towards fully autonomous and universally applicable physics-informed AI is still ongoing, but these recent breakthroughs represent significant strides forward, promising a future where scientific discovery is accelerated and engineered solutions are smarter and more reliable than ever before.