Week 11 · May 2026

Physics-Informed Machine Learning: When Biomedical Models Need Both Data and Physical Laws

May 9, 2026 · by Satish K C · 8 min read
Tags: Deep Learning · Efficiency · NLP · Optimization

The Paper

"Physics-Informed Machine Learning in Biomedical Science and Engineering" was published in May 2026 by Nazanin Ahmadi, Qianying Cao, Jay D. Humphrey, and George Em Karniadakis from Brown University (Center for Biomedical Engineering and Division of Applied Mathematics) and Yale University (Department of Biomedical Engineering), in the Annual Review of Biomedical Engineering, Volume 28, pages 309-336. The review argues that physics-informed machine learning (PIML) is emerging as a transformative paradigm for modeling complex biomedical systems by integrating parameterized physical laws with data-driven methods - specifically targeting scenarios where physical interpretability, data scarcity, or system complexity make conventional black-box learning insufficient.

Read the Paper →

The Problem Before This Paper

Biomedical systems present a unique modeling challenge: they are governed by well-understood physical principles (continuum mechanics, fluid dynamics, reaction-diffusion kinetics) yet produce sparse, noisy, and expensive-to-collect experimental data. Pure data-driven approaches - standard neural networks, random forests, gradient boosting - require large labeled datasets that rarely exist in clinical or experimental settings and produce black-box predictions that clinicians and biologists cannot trust or interpret. Traditional physics-based simulation (finite element methods, computational fluid dynamics) is interpretable but computationally expensive, often requiring hours or days per patient-specific geometry, and struggles to incorporate real-time sensor data or adapt to patient variability. The gap between these paradigms left practitioners choosing between models that are physically faithful but computationally intractable, or models that are fast but data-hungry and opaque - with no principled way to combine the physical constraints they know with the sparse measurements they have.

What They Built

This review synthesizes three core PIML frameworks and maps their biomedical applications. Physics-Informed Neural Networks (PINNs) embed governing PDEs/ODEs directly into the neural network loss function - the network is penalized not only for mismatching data but also for violating physical laws at collocation points sampled throughout the domain. This allows training with extremely sparse observations because the physics constraints provide an implicit regularizer. Neural Ordinary Differential Equations (NODEs) parameterize the right-hand side of an ODE system with a neural network, enabling continuous-time modeling of physiological dynamics, pharmacokinetics, and cell signaling pathways where discrete time-stepping loses critical transient behavior. Neural Operators (NOs) - including DeepONet and Fourier Neural Operators - learn mappings between infinite-dimensional function spaces, enabling real-time solution of parametric PDEs across varying geometries, boundary conditions, and material properties without retraining.

// PINN loss function:
L = L_data + lambda * L_physics
L_data = (1/N) * sum_i ||u_nn(x_i) - u_obs(x_i)||^2      (N observation points)
L_physics = (1/M) * sum_j ||F[u_nn](x_j)||^2             (M collocation points)
where F[u] = 0 is the governing PDE and lambda weights the physics penalty
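To make the composite loss concrete, here is a minimal NumPy sketch (not from the review) for the 1-D decay ODE u'(x) + K*u(x) = 0 with exact solution exp(-K*x). The names `pinn_loss` and `physics_residual` are illustrative, and the finite-difference derivative is a stand-in for the automatic differentiation a real PINN would use:

```python
import numpy as np

# Toy illustration of the composite PINN loss above for the 1-D decay ODE
#   u'(x) + K*u(x) = 0,  u(0) = 1,  exact solution u(x) = exp(-K*x).
# The "network" is any callable u(x); derivatives are approximated with
# central finite differences (a real PINN uses automatic differentiation).
K = 2.0

def physics_residual(u, x, h=1e-4):
    # F[u](x) = u'(x) + K*u(x); ~0 wherever u satisfies the ODE
    du = (u(x + h) - u(x - h)) / (2 * h)
    return du + K * u(x)

def pinn_loss(u, x_data, u_obs, x_colloc, lam=1.0):
    # L = L_data + lambda * L_physics
    l_data = np.mean((u(x_data) - u_obs) ** 2)            # misfit at N observations
    l_phys = np.mean(physics_residual(u, x_colloc) ** 2)  # PDE violation at M points
    return l_data + lam * l_phys

# Only 3 sparse "observations", but 50 collocation points for the physics term
x_data = np.array([0.0, 0.5, 1.0])
u_obs = np.exp(-K * x_data)
x_colloc = np.linspace(0.0, 1.0, 50)

exact = lambda x: np.exp(-K * x)       # satisfies data and physics: loss ~ 0
wrong = lambda x: 1.0 - K * x          # fits u(0) but violates the ODE
print(pinn_loss(exact, x_data, u_obs, x_colloc))
print(pinn_loss(wrong, x_data, u_obs, x_colloc))
```

Note how the collocation points far outnumber the observations - this is the "implicit regularizer" role of the physics term that lets PINNs train on sparse data.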

// Neural ODE (continuous dynamics):
dz/dt = f_theta(z, t)
z(t_1) = z(t_0) + integral_{t_0}^{t_1} f_theta(z(t), t) dt
// Solved via adaptive ODE solvers (Dormand-Prince, etc.)

Framework | Key mechanism | Biomedical applications
PINNs | PDEs embedded in loss function; trains with sparse data | Hemodynamics, tissue mechanics, imaging
Neural ODEs | Continuous-time dynamics; adaptive time-stepping | Pharmacokinetics, cell signaling, cardiac
Neural Operators | Function-to-function maps; instant parametric solutions | Multiscale simulation, patient-specific geometry
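The function-to-function mapping behind DeepONet can also be sketched briefly: a branch net encodes the input function from its values at fixed sensor locations, a trunk net encodes each query coordinate, and their dot product gives the output function. The weights below are untrained random stand-ins, so this shows only the architecture and its near-instant inference, not a trained operator:

```python
import numpy as np

rng = np.random.default_rng(1)
M_SENSORS, P = 20, 16   # sensor points for the input function; latent width

# Random (untrained) single-layer "nets" - a real DeepONet learns these
# from pairs of input functions and output functions.
Wb = rng.normal(scale=0.1, size=(P, M_SENSORS))   # branch net
Wt = rng.normal(scale=0.1, size=(P, 1))           # trunk net

def deeponet(f_samples, y_query):
    # Branch: encode the input function from its values at fixed sensors.
    b = np.tanh(Wb @ f_samples)                    # shape (P,)
    # Trunk: encode each query coordinate of the output domain.
    t = np.tanh(Wt @ y_query[None, :])             # shape (P, n_query)
    # Operator output G[f](y) = sum_p b_p * t_p(y)
    return b @ t                                   # shape (n_query,)

xs = np.linspace(0, 1, M_SENSORS)
f_samples = np.sin(2 * np.pi * xs)     # one input function, sampled at sensors
y = np.linspace(0, 1, 5)               # query locations for the output function
out = deeponet(f_samples, y)
print(out.shape)  # (5,): G[f] at 5 query points, one matrix pass, no retraining
```

Inference is a couple of small matrix products, which is why trained neural operators can deliver per-case predictions in milliseconds.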

Key Findings


The review synthesizes results across 150 cited works spanning five biomedical domains. In biofluid mechanics, PINNs reconstruct 3D velocity fields from sparse 4D-flow MRI measurements with less than 5% relative error while enforcing Navier-Stokes constraints - matching CFD accuracy without patient-specific mesh generation. For cardiovascular modeling, Neural ODEs trained on clinical time-series data predict left ventricular pressure-volume loops with correlation coefficients exceeding 0.95 using only 10-20 heartbeat cycles. In mechanobiology, physics-informed approaches identify nonlinear constitutive parameters of arterial tissue from indentation experiments at accuracy comparable to traditional inverse FEM methods but at 100x lower computational cost. Neural Operators trained on parametric biomechanics problems (varying geometry, loading, material properties) achieve relative L2 errors below 1% while delivering inference in under 10 milliseconds per case - enabling real-time surgical planning applications. In medical imaging, PINNs have been applied to accelerated MRI reconstruction, CT dose reduction, and ultrasound elastography, consistently achieving diagnostic-quality reconstructions from 4-8x undersampled acquisitions.


My Take

Karniadakis is one of the original architects of PINNs, so this review carries both authority and inherent bias toward the framework's strengths. That said, the synthesis is genuinely useful because it maps three distinct PIML paradigms to specific biomedical problem types - PINNs for inverse problems with sparse data, NODEs for temporal dynamics, NOs for parametric real-time inference - rather than treating physics-informed ML as a monolithic approach. The most honest contribution is the explicit acknowledgment that uncertainty quantification remains unsolved: without calibrated confidence intervals, no PIML model belongs in clinical decision loops where incorrect predictions have life-or-death consequences. The LLM integration direction is speculative but interesting - the idea that a language model could parse a radiology report and automatically set boundary conditions for a PINN solver would collapse the workflow from "expert sets up simulation" to "report arrives, prediction follows." The missing piece in this review is computational cost at training time: PINNs are notoriously difficult to train (loss landscape issues, spectral bias), and the review glosses over how many GPU-hours the cited results actually required.

Discussion question: PIML achieves data efficiency by embedding known physics - but biological systems are full of unknown or partially-understood mechanisms. At what point does encoding incomplete physics into the model hurt more than it helps, and how should practitioners decide when to trust the physics prior versus letting the data speak for itself?

