The modeling system in the linked article is a high-fidelity numerical simulation of the coupled Earth system. It's a giant PDE solver for Navier-Stokes applied to the Earth's atmosphere and oceans, coupled together with a great deal of additional physics simulation. The intent is to reproduce, in simulation, the Earth's atmosphere and oceans with the highest possible fidelity. This set of simulations is the culmination of nearly 70 years of investment, going back to the very first applications of digital computers to solving complex math equations (one of the first simulations run on ENIAC was a crude quasi-geostrophic atmospheric model / weather forecast).
NVIDIA's FourCastNet, while very cool, is essentially a facsimile of this type of system. It's really not even in the same ballpark.
It’s an example of a surrogate model: an ML model trained on the output of large numerical simulations like the one in the OP, rather than doing the simulation itself.
Surrogate models are nice because they can emulate the output of the full-fidelity calculation in a fraction of the runtime, but they are typically trained within a range of validity, outside of which they cannot reliably extrapolate.
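A toy sketch of the idea, in case it helps. Here the "expensive simulation" is just a cheap analytic function standing in for a real PDE solver (purely illustrative; FourCastNet itself is a deep network trained on ERA5 reanalysis data, not a polynomial fit). The surrogate emulates the simulation well inside its training range, then falls apart when asked to extrapolate:

```python
import numpy as np

# Stand-in for an expensive numerical simulation (illustrative only).
def expensive_simulation(x):
    return np.sin(x) + 0.1 * x**2

# "Train" a surrogate: fit a polynomial to samples drawn from [0, 2*pi].
rng = np.random.default_rng(0)
x_train = rng.uniform(0.0, 2 * np.pi, 200)
coeffs = np.polyfit(x_train, expensive_simulation(x_train), deg=8)
surrogate = np.poly1d(coeffs)

# Inside the training range, the surrogate tracks the simulation closely...
x_in = np.linspace(0.5, 6.0, 50)
err_in = np.max(np.abs(surrogate(x_in) - expensive_simulation(x_in)))

# ...but outside the range of validity it extrapolates unreliably.
x_out = 4 * np.pi
err_out = abs(surrogate(x_out) - expensive_simulation(x_out))

print(f"max error inside training range:  {err_in:.4f}")
print(f"error far outside training range: {err_out:.1f}")
```

The same qualitative failure mode applies to learned surrogates: they interpolate within the distribution of simulation output they were trained on, which is exactly why the full numerical model remains the ground truth.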
https://resources.nvidia.com/en-us-fleet-command/watch-27?xs...
How do these two compare?