What are the advantages of Julia over Fortran?

Thank you @oscardssmith! I appreciate your elaboration.
The automatic differentiation you mentioned in Julia looks particularly interesting.
I guess if I have the money, I will invest in exploring auto diff in Fortran :slight_smile:
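
Just to illustrate what I mean (this is only a toy sketch I put together, not any particular library): forward-mode auto diff can already be written in plain Fortran with a dual-number derived type and operator overloading, something like

```fortran
module dual_mod
  ! Toy forward-mode AD: val carries the function value,
  ! der carries the derivative, propagated by the chain rule.
  implicit none
  type :: dual
     real :: val
     real :: der
  end type dual

  interface operator(+)
     module procedure add_dd
  end interface
  interface operator(*)
     module procedure mul_dd
  end interface
contains
  elemental function add_dd(a, b) result(c)
    type(dual), intent(in) :: a, b
    type(dual) :: c
    c = dual(a%val + b%val, a%der + b%der)
  end function add_dd

  elemental function mul_dd(a, b) result(c)
    type(dual), intent(in) :: a, b
    type(dual) :: c
    ! product rule: (uv)' = u'v + uv'
    c = dual(a%val*b%val, a%der*b%val + a%val*b%der)
  end function mul_dd
end module dual_mod

program ad_demo
  use dual_mod
  implicit none
  type(dual) :: x, f
  x = dual(3.0, 1.0)   ! seed the derivative dx/dx = 1
  f = x*x + x          ! f(x) = x**2 + x
  print *, 'f(3) =', f%val, 'df/dx(3) =', f%der   ! expect 12.0 and 7.0
end program ad_demo
```

A real library would of course overload all the intrinsics generically; the point is only that the chain rule gets propagated automatically.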

If I remember correctly, some of the Fortran code written by @jacobwilliams can indeed call some Python stuff from Fortran in some way.
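
I do not have his code at hand, but one general way to do it (not necessarily what his code does) is to bind directly to the CPython C API through `iso_c_binding`, roughly like

```fortran
program call_python
  ! Sketch: initialize the embedded Python interpreter and run one statement.
  use, intrinsic :: iso_c_binding, only: c_char, c_int, c_null_char
  implicit none

  interface
     subroutine Py_Initialize() bind(C, name="Py_Initialize")
     end subroutine Py_Initialize
     function PyRun_SimpleString(command) bind(C, name="PyRun_SimpleString") result(ierr)
       import :: c_char, c_int
       character(kind=c_char), intent(in) :: command(*)
       integer(c_int) :: ierr
     end function PyRun_SimpleString
     subroutine Py_Finalize() bind(C, name="Py_Finalize")
     end subroutine Py_Finalize
  end interface

  integer(c_int) :: ierr

  call Py_Initialize()
  ierr = PyRun_SimpleString('print("hello from Python, called by Fortran")' // c_null_char)
  call Py_Finalize()
end program call_python
```

Linking needs the Python runtime, something like `gfortran call_python.f90 $(python3-config --embed --ldflags)` (`--embed` is for Python 3.8+), if I remember the `python3-config` options correctly.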

I think similar techniques can be used to call R and Julia from Fortran as well.
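
For Julia at least, libjulia has a documented C embedding API (`jl_init`, `jl_eval_string`, `jl_atexit_hook`), so the same `iso_c_binding` trick should work in principle. A rough, untested sketch (I believe `jl_init` was a macro in some older Julia versions, so the exact symbol name may vary):

```fortran
program call_julia
  ! Sketch: start the embedded Julia runtime and evaluate one expression.
  use, intrinsic :: iso_c_binding, only: c_char, c_int, c_ptr, c_null_char
  implicit none

  interface
     subroutine jl_init() bind(C, name="jl_init")
     end subroutine jl_init
     function jl_eval_string(str) bind(C, name="jl_eval_string") result(ret)
       import :: c_char, c_ptr
       character(kind=c_char), intent(in) :: str(*)
       type(c_ptr) :: ret
     end function jl_eval_string
     subroutine jl_atexit_hook(status) bind(C, name="jl_atexit_hook")
       import :: c_int
       integer(c_int), value :: status
     end subroutine jl_atexit_hook
  end interface

  type(c_ptr) :: ret

  call jl_init()
  ret = jl_eval_string('println("hello from Julia, called by Fortran")' // c_null_char)
  call jl_atexit_hook(0_c_int)
end program call_julia
```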

Besides Intel and Nvidia, I think AMD has its own C/C++/Fortran compilers too (though perhaps not for GPU).

I think Intel's strong push into the GPU market, plus the name and purpose of the free Intel oneAPI toolkits and their new IFX compiler, may already have made running Fortran on GPUs available.
There is also a thread here about Fortran + GPU programming.
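
For example, if I understand correctly, standard `do concurrent` loops can already be offloaded to Nvidia GPUs by nvfortran with `-stdpar=gpu` (and I believe ifx has similar offload options for Intel GPUs), without any directives in the source. A small sketch:

```fortran
program saxpy_gpu
  implicit none
  integer, parameter :: n = 1000000
  real, allocatable :: x(:), y(:)
  real :: a
  integer :: i

  allocate(x(n), y(n))
  a = 2.0
  x = 1.0
  y = 3.0

  ! Standard Fortran, no directives: compilers such as nvfortran with
  ! -stdpar=gpu can offload this do concurrent loop to the GPU.
  do concurrent (i = 1:n)
     y(i) = a*x(i) + y(i)
  end do

  print *, 'y(1) =', y(1)   ! expect 5.0
end program saxpy_gpu
```

Compiled with `nvfortran -stdpar=gpu saxpy_gpu.f90` this should run on the GPU, while the same source still builds as plain Fortran with any other compiler.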

I could be wrong, but it seems inevitable that each of the whales like Intel, AMD, and Nvidia will keep developing their own Fortran compiler distributions, optimized particularly for their own hardware. Anyway, for some reason they just do not give up on Fortran :slight_smile: I guess they have their reasons; perhaps Fortran is relatively easy to optimize for specific hardware at the compiler level.

Sure, I agree; someone could please split this into a new thread.