Why I Believe Physics-First AI Is the Future of Atmospheric Prediction
2/20/2026 · 3 min read
There is a deeply held assumption in the world of atmospheric science and aviation that useful turbulence prediction requires enormous computational power — supercomputer clusters, terabytes of data, multi-hour processing cycles. I think that assumption is wrong, and I have spent the past stretch of my research career proving it.
Recently, we built PSTNet — a physics-structured neural network with 552 trainable parameters that can take a single NASA weather observation and turn it into a full multi-altitude turbulence intensity map in under seven seconds on a regular laptop. Our work was recently featured in The Hudson Weekly, and the response since then has reinforced my belief that the industry is ready for a different way of thinking about this problem.
The Obsession with Scale Is Holding Us Back
Everywhere I look in AI research, the conversation is dominated by scale. More parameters. More data. More GPUs. And yes, for certain problems — language modeling, image generation — that brute-force approach has delivered extraordinary results. But atmospheric modelling is not one of those problems, at least not in the way most people think, and my own work across multiple projects keeps confirming it.
We already understand turbulence. The Richardson number, mountain wave dynamics, vertical wind shear — these are not mysteries. Decades of atmospheric science have given us robust theoretical frameworks. The question I kept asking myself was: why are we ignoring all of that knowledge and asking a neural network to rediscover it from scratch with millions of parameters?
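To make the point concrete, here is a minimal sketch of the gradient Richardson number, the classic diagnostic mentioned above: the ratio of stratification (which suppresses turbulence) to vertical wind shear squared (which drives it). This is standard textbook atmospheric physics, not PSTNet's actual code, and the function name and array layout are my own illustrative choices.

```python
import numpy as np

def richardson_number(theta, u, v, z, g=9.81):
    """Gradient Richardson number Ri = N^2 / S^2 on layer midpoints.

    theta: potential temperature (K); u, v: wind components (m/s);
    z: heights (m). All are 1-D arrays on the same vertical levels.
    """
    dz = np.diff(z)
    dtheta_dz = np.diff(theta) / dz
    du_dz = np.diff(u) / dz
    dv_dz = np.diff(v) / dz
    theta_mid = 0.5 * (theta[:-1] + theta[1:])
    n_squared = (g / theta_mid) * dtheta_dz          # static stability (Brunt-Vaisala freq. squared)
    shear_squared = du_dz**2 + dv_dz**2              # vertical wind shear squared
    return n_squared / np.maximum(shear_squared, 1e-12)  # guard against zero shear
```

Values of Ri below roughly 0.25 signal that shear can overcome stratification and breed turbulence — exactly the kind of prior knowledge a physics-first architecture can bake in rather than relearn.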
That question is what led to PSTNet's core design philosophy. Instead of building a large model and hoping it learns the physics, we structured the architecture around what we already know about the atmosphere. The network is deliberately tiny — 552 parameters — because its job is not to learn atmospheric physics from zero. Its job is to learn the small residual corrections that pure theory cannot capture. The constraints are the intelligence. The network just polishes the edges.
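The "physics prior plus tiny learned residual" idea can be sketched in a few lines. To be clear, this is not PSTNet itself — the logistic prior, the 25-parameter residual network, and all names here are illustrative stand-ins — but it shows the structure: theory produces the base prediction, and a deliberately small network only nudges it.

```python
import numpy as np

rng = np.random.default_rng(0)

def physics_prior(ri):
    # Theory-driven base prediction: turbulence likelihood rises sharply
    # as the Richardson number drops below the ~0.25 critical threshold
    # (a smooth logistic stand-in for a real stability criterion).
    return 1.0 / (1.0 + np.exp((ri - 0.25) / 0.1))

# Tiny learnable residual: one hidden layer, ~25 weights in total —
# the same spirit as PSTNet's 552 parameters, at toy scale.
W1 = rng.normal(scale=0.1, size=(1, 8))
b1 = np.zeros(8)
W2 = rng.normal(scale=0.1, size=(8, 1))
b2 = np.zeros(1)

def predict(ri):
    base = physics_prior(ri)                  # physics does the heavy lifting
    h = np.tanh(ri[:, None] @ W1 + b1)        # small learned feature map
    residual = (h @ W2 + b2).ravel()          # correction theory cannot capture
    return np.clip(base + residual, 0.0, 1.0) # keep outputs physically valid
```

Training would fit only the residual weights against observations; the physics prior stays fixed, which is what keeps the learnable component small and the outputs interpretable.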
Clear-Air Turbulence Is Getting Worse — and Current Systems Cannot Keep Up
This work is not academic for me. Clear-air turbulence is invisible to radar. Pilots cannot see it. Passengers cannot brace for it. And it is getting worse — climate research has linked increasing turbulence over major flight corridors to strengthening wind shear in the jet stream. Injuries are rising. Airlines are feeling it.
Current operational turbulence forecasting systems rely on numerical weather prediction models that take hours to update and run on infrastructure that costs millions. That is fine for scheduled forecast cycles at major meteorological centers. But it does nothing for a drone operator in a remote area, a dispatcher who needs an answer now, or a military unit planning a trajectory from a forward position with no connectivity.
We wanted to build something that works in those situations. PSTNet needs one weather API call. After that, it runs fully offline. On a laptop. On an edge device. On a ship. That was a non-negotiable design requirement from day one.
What I Think Matters Most
When we demonstrated PSTNet live over the Central Asia–Himalaya corridor, it correctly picked up severe cells at tropopause altitudes, terrain effects, and the turbulence minimum in the lower stratosphere. That meant more to me than any benchmark number, because it showed that the physics-first approach is sound.
Yet I want to be honest about where we are. PSTNet is still a research demonstration, not a certified operational product. The next step — and the hardest one — is systematic validation against pilot reports and onboard eddy dissipation rate measurements from commercial aircraft. That is the standard the aviation industry rightly demands, and we intend to meet it. We are also working toward continuous nowcasting, because a single snapshot is useful but a constantly updating turbulence picture would be transformational.
A Broader Point About AI
I think PSTNet says something important beyond aviation. The AI field is in a phase where bigger is almost always assumed to be better. I understand why — large models have delivered stunning results in many domains. But I believe there is an enormous class of scientific and engineering problems where the smarter path is to encode domain knowledge into the architecture and keep the learnable component as small as possible.
When you do that, you get models that are interpretable, fast, cheap to run, and — critically — physically constrained so they cannot hallucinate impossible outputs. In atmospheric science, a model that predicts turbulence where turbulence cannot physically exist is worse than useless. A physics-structured architecture makes that kind of failure nearly impossible by design.
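One way to make "cannot hallucinate impossible outputs" concrete is a hard physical gate applied after the network: where the atmosphere is so strongly stratified that shear instability cannot develop, the prediction is forced to zero no matter what the learned component says. This is an illustrative sketch, not PSTNet's mechanism, and the `ri_max` threshold here is a hypothetical placeholder.

```python
import numpy as np

def constrain(raw_pred, ri, ri_max=10.0):
    # Hard physical gate: in very strongly stratified air (large Ri),
    # shear-driven turbulence cannot develop, so zero out the prediction
    # there regardless of the raw network output, then clamp to [0, 1].
    gated = np.where(ri > ri_max, 0.0, raw_pred)
    return np.clip(gated, 0.0, 1.0)
```

Because the gate is part of the architecture rather than the training objective, the impossible prediction is excluded by construction, not merely penalized.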
I am not saying large models have no place. I am saying that for problems where we already have strong theoretical foundations, the future belongs to hybrid approaches that respect the science and use machine learning only where it is genuinely needed.
That is what PSTNet represents to me — not just a turbulence predictor, but a proof of concept for a different philosophy of AI. And I believe that philosophy will outlast the current era of scale-obsessed thinking.