Global Ocean Modeling With GPU Acceleration in Python

I think we are all in agreement now: Fortran has cross-platform syntax (in the sense that nothing intrinsic in the language prevents it from running on a GPU), but it does not have cross-platform tooling to take advantage of this. (Yes, for NVIDIA GPUs you can use CUDA Fortran and NVIDIA's compilers, which can sometimes also parallelize `do concurrent` and other intrinsic Fortran features, but that is not a cross-platform solution, at least not yet.)
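To make the "intrinsic in the language" point concrete: Fortran 2008's `do concurrent` declares a loop free of iteration-order dependencies, so a compiler is allowed to parallelize it. A minimal sketch (the `saxpy` routine here is just an illustration, not from the thread):

```fortran
! Standard Fortran: the loop body is asserted to have no
! cross-iteration dependencies, so a compiler MAY run the
! iterations in parallel -- on CPU threads or on a GPU.
subroutine saxpy(n, a, x, y)
  integer, intent(in) :: n
  real, intent(in)    :: a, x(n)
  real, intent(inout) :: y(n)
  integer :: i
  do concurrent (i = 1:n)
    y(i) = a * x(i) + y(i)
  end do
end subroutine saxpy
```

With NVIDIA's `nvfortran` this same source can (as I understand it) be offloaded to a GPU via a flag such as `-stdpar=gpu`, while other compilers accept the identical code but typically execute it serially or on CPU threads. That gap between portable syntax and non-portable offload tooling is exactly the problem described above.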
