An evaluation of risks associated with relying on Fortran for mission-critical codes for the next 15 years

Very well said. I hold exactly the same opinion, which has been criticized quite a bit here. See the discussions under the following old threads:

I am repeatedly told that I am using Fortran and its compilers in the wrong way, that it is unreasonable to expect intrinsic procedures to be performant out of the box, and that it is fine to have segfaults as the default behavior.

Nevertheless, I still believe that the built-in support for automatic arrays and array manipulation (addition, multiplication, slicing, broadcasting, …) is one of the core strengths of Fortran. Array manipulation carries great weight in scientific computing. If this strength is well developed and well exploited, Fortran can become the lingua franca of scientific computing (again), and an ideal language for templating numerical algorithms.
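As a minimal, self-contained sketch (my own illustration, not code from any project mentioned here), the array facilities in question look like this in standard Fortran: whole-array arithmetic, slicing, intrinsic reductions, and an automatic array whose size is fixed only at run time.

```fortran
! Sketch of the array facilities mentioned above: whole-array
! arithmetic, slicing, intrinsic reductions, and an automatic
! array sized at run time from its argument.
program array_demo
    implicit none
    real :: a(5), b(5), c(5)
    integer :: i

    a = [(real(i), i = 1, 5)]        ! array constructor with implied do
    b = 2.0 * a                      ! scalar broadcast over the whole array
    c = a + b                        ! whole-array addition, no loop
    print '(i0)', nint(c(3))         ! indexing: c(3) = 9
    print '(i0)', nint(sum(a * b))   ! intrinsic reduction: dot product = 110
    call center(a)
    print '(i0)', nint(sum(a))       ! after centering, the sum is 0
contains
    subroutine center(x)
        real, intent(inout) :: x(:)
        real :: work(size(x))        ! automatic array, sized from the argument
        work = x - sum(x) / size(x)
        x = work
    end subroutine center
end program array_demo
```

None of the computational lines above needs an explicit loop; the compiler is free to vectorize or parallelize the array expressions.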

Still, I am determined, in my PRIMA project (a package for solving general nonlinear optimization problems without using derivatives), to use only automatic arrays and “matrix-vector procedures” instead of loops whenever possible, because, according to my very humble and limited understanding of pure and computational mathematics, this is the correct way of presenting / coding numerical algorithms, and this is the future. (Thinking about ChatGPT, you will realize that the boundary between presenting and coding is blurred.)
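To make the contrast concrete, here is a hypothetical sketch (not taken from PRIMA): the same damped gradient step for the quadratic model q(x) = 0.5 x'Ax, whose gradient is g = Ax, written once with explicit loops and once in “matrix-vector” style.

```fortran
! Hypothetical sketch (not from PRIMA): one damped gradient step
! x <- x - t*(A*x) written in loop style and in matrix-vector style.
program mv_style
    implicit none
    real :: A(2, 2), x_loop(2), x_mv(2), g(2)
    integer :: i, j

    A = reshape([2.0, 0.0, 0.0, 2.0], [2, 2])   ! column-major fill
    x_loop = [1.0, 1.0]
    x_mv = x_loop

    ! Loop style: explicit matrix-vector product, then the update.
    g = 0.0
    do j = 1, 2
        do i = 1, 2
            g(i) = g(i) + A(i, j) * x_loop(j)
        end do
    end do
    do i = 1, 2
        x_loop(i) = x_loop(i) - 0.25 * g(i)
    end do

    ! Matrix-vector style: the same computation in one line,
    ! which reads like the formula in a paper.
    x_mv = x_mv - 0.25 * matmul(A, x_mv)

    print '(i0)', nint(10.0 * x_mv(1))              ! x_mv(1) = 0.5, prints 5
    print '(i0)', merge(1, 0, all(x_mv == x_loop))  ! both styles agree: 1
end program mv_style
```

The one-line version is essentially the mathematical statement of the algorithm, which is exactly the sense in which presenting and coding blur together.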