The counter-intuitive rise of Python in scientific computing

Please note these workarounds are very difficult, and often impossible, to implement, deploy, and adopt: they would have to work across constructs in the language itself such as interface .. end interface, and across workflows spanning everything from simple editors such as Notepad to Visual Studio Code to the Visual Studio IDE, and many others.

The standards committee has made some formerly valid code obsolescent. I like to use FORALL one-liners, but running gfortran -std=f2018 xforall.f90 on

integer, parameter :: n = 3
integer            :: i,ivec(n)
j = 2
forall (i=1:n) ivec(i) = i*j

gives
4 | forall (i=1:n) ivec(i) = i*j
  |                            1
Warning: Fortran 2018 obsolescent feature: FORALL construct at (1)

and adding the option -Werror turns that warning into an error. If Fortran 2025 (or whatever) made implicit typing obsolescent, it could be arranged that only people who compiled with -std=f2025 or the equivalent in other compilers saw the warning, and only those who compiled with -std=f2025 -Werror got an error.
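For what it's worth, the replacement the standard recommends for such FORALL one-liners is DO CONCURRENT. A minimal sketch of the same computation that compiles cleanly under -std=f2018:

```fortran
program xdoconc
   implicit none
   integer, parameter :: n = 3
   integer :: i, j, ivec(n)
   j = 2
   do concurrent (i = 1:n)   ! F2008 replacement for FORALL
      ivec(i) = i*j
   end do
   print *, ivec             ! 2 4 6
end program xdoconc
```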


There is a thread open at the j3-fortran proposals page on the topic of removing implicit typing:

I can merely suggest to continue the discussion on implicit typing there.


I, of course, agree that implicit none has to go, and it is another one of those things holding back Fortran. It is literally the first thing I have to tell people when I try to introduce them to Fortran: “OK, you have to put this magic line in every file to disable 50 year old nonsense behavior”. It’s not a good look. We need to hide the nonsense, not keep it front and center for all to see (the same for implicit save). I have never seen anyone use these “features” except by accident. Every beginner has been bitten by these ridiculous features. I see it over and over and over again, year after year. If you want to use 50 year old nonsense, you need to use a compiler flag. That is not too much to ask.
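To illustrate the kind of accident beginners hit (a hypothetical minimal example): without implicit none, a misspelled variable is silently declared as a brand-new one instead of being flagged:

```fortran
program typo_bug
   ! no "implicit none": the typo below compiles without complaint
   real :: total
   total = 1.5
   totall = total + 1.0   ! typo: "totall" is implicitly declared as a new real
   print *, total         ! still 1.5; the intended update was silently lost
end program typo_bug
```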


My point remains about implicit none becoming the default in Fortran. Note this is entirely separate from removing implicit typing; there is a subtle but crucial difference here.

But my bigger point is that there are critical gaps and deficiencies in the Fortran language, because of which many other programming languages and paradigms have risen in scientific computing since the very beginning, a trend that began to accelerate in the 1970s. Python is merely the latest to claim the top spot for scientific computing. One may be inclined to brush off the top spots as occupied by fads; nonetheless a larger concern remains, which is the slide of Fortran.

To stop the bleeding, one has to address all the wounds. Many of the issues have to do with features, or the lack thereof, in the language standard itself.

I think the treatment of Fortran in this blog post vs. Python is rather unfair. If Bob is not a good researcher, that is not a weakness of Fortran. I highly doubt that many people (perhaps including the authors of this blog post) know much about modern Fortran syntax and capabilities. Even the savviest self-proclaimed Fortran programmers that I have seen in academia do not really know modern Fortran features beyond F90 or F2003. They are simply not aware of them.
And then there is the issue of bad Fortran education and courses in academia. I have not seen any Fortran course based on Fortran 2008 or 2018, or even 2003. The most progressive educators that I have seen teach Fortran 90, and that explains why many people do not know about the modern Fortran language.


I think Fortran may have stabilized. It is ranked 32nd with 0.45% share in the December 2020 TIOBE index, compared to 30th and 0.456% share in January 2011.

Damian Rouson and I teach Fortran 2018 in our courses, and we are always looking for people who would like to learn it. If you know of anybody interested in a training session, please send them our way.


As far as argument matching goes, Fortran did not treat a scalar as a 1x1 array, which led to very heavy use of passing scalars as arrays; I never saw a compiler have trouble with that until very recently, so enforcing the rule breaks a lot of code. If Fortran were done from scratch, everything could be an array whose undeclared size defaulted to 1x1, and a lot of that code would keep working.
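A sketch of the legacy idiom in question (hypothetical names); recent gfortran releases (10 and later) reject this mismatch by default unless -fallow-argument-mismatch is given:

```fortran
subroutine fill(a, n)
   integer :: n
   real :: a(n)          ! dummy argument is an array
   a(1) = 42.0
end subroutine fill

program legacy_call
   real :: x
   external fill
   call fill(x, 1)       ! scalar actual passed to an array dummy: the old idiom
   print *, x
end program legacy_call
```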

The first three compilers I checked all have a switch to imply “implicit none”. Several compilers, like the Intel compiler, allow installation of a default centralized configuration file where you can change all the compiler defaults. So this is something compiler developers have supplied an option for, indicating there is indeed a need or demand for it. Perhaps vendors could address this issue by supplying links to their compilers that imply strict modern programming, like implying implicit none and allowing no deprecated features. Something like fpm would still have to have a way to call compilers like gfortran with default behavior, but it might encourage “best practices”
by having an option or default to call a compiler with as many of these switches on as the compiler supports. For example, for gfortran

 gfortran -ffree-form -ffree-line-length-none -fcoarray=single -fimplicit-none -std=f2018 -fmodule-private -Wargument-mismatch ...

F was an attempt to force “good” practice and (though I applaud it) did not take the world by storm, but perhaps the vendors are another avenue to help deprecate old practices.

Remember, one of the advantages of Fortran is all that old code out there that lies behind a lot of “modern” utilities. I remember someone looking totally perplexed when they wanted to build some Python packages from scratch and discovered they needed a Fortran compiler.

So, for gfortran there could be a “gf” link for good fortran, and a “goof” link for “good old obsolescent fortran” that just use different switches. Make a couple of little bash(1) files and get everyone you can to use “gf” and see how it goes. Make sure the scripts record usage so you can see who "goof"s and help them port their code :slight_smile:
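For instance, the “gf” wrapper could be as small as this (a sketch; the exact flag selection is illustrative, adjust to taste):

```bash
#!/bin/bash
# "gf": call gfortran with strict, modern defaults
exec gfortran -std=f2018 -fimplicit-none -ffree-form -Werror "$@"
```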


I think Fortran gets it right. A scalar is not a 1x1 array, or a 1-element vector. A scalar has rank 0, but it can be broadcast so that arithmetic operations of scalars with arrays work.
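A minimal sketch of that rank-0 broadcasting:

```fortran
program broadcast_demo
   implicit none
   real :: a(3) = [1.0, 2.0, 3.0]
   a = 2.0*a + 1.0        ! rank-0 scalars broadcast over the rank-1 array
   print *, a             ! 3.0 5.0 7.0
end program broadcast_demo
```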


I am wondering whether the c community has the same self-doubt as the fortran community. Was there ever a thread in the c community about how c can be made as easy to use as python? This observation alone puts this whole discussion in an interesting light.

Back to the thread: the rise of python is anything but counter-intuitive. I see this in my own field of engineering. The outgoing generation used fortran, and ONLY fortran. This is because, way back then, the implicit typing of fortran allowed you to write super-short code to get some stuff done. People wrote Mickey Mouse programs for calculating a mean or whatever … just 5-liners. Doing this in c would have required twenty lines of headers etc. However, this stuff you now do in r or python with next to no knowledge required about how to run a compiler or manage memory … so that field went to the other languages (naturally).

What remained was algorithm development. While every algorithm can be coded in python or r, feasibility in terms of processing time often required c or fortran, where you could control when memory is allocated and how it is used … just for the sake of speed. We had netlib blas, but some routines were just not there. Even in 2017 I wrote a sparse rank-update routine which outperformed mkl at that time. However, many of those algorithms have moved to highly specialized libraries often written in c or assembly (mkl, openblas, intel oneapi, etc.) … no need any longer to bother optimizing your home brew in fortran. With the advent of the mkl sparse inspector-executor, probably 20% of my fortran code went out the window. Now all these highly optimized routines are callable from python … so the speed difference between a fortran program calling mkl and a python program calling mkl becomes marginal.

On top of that, fortran misses all the stuff python (and c++) come along with: hash tables, linked lists, etc. etc., housekeeping stuff you don’t wanna worry about … but you have to if you are stuck with fortran.

There is actually only one pro left, which is like a knock-out criterion against the other compiled languages: the natural language support for arrays with all its features. If c++ offered something similar (not eigen or boost or other stuff that people forget to maintain), I would have left the fortran world a long time ago … after having written about 500,000 lines of code.


Hello guys.

I updated the post a bit, adding a little context and making a distinction between good-old-fortran and modern fortran, hoping this will reduce the confusion between the two.

I cannot debate a lot right now, but I will definitely read all of your comments.

A Dauptain


Not C, but it seems that the D community has been frequently discussing its current status, as compared to other languages (C++, Rust, Nim, Python, etc), particularly “what the current strength (and weakness) of D”, “where to go in future”, “need for better management”, etc, etc…

D programming language popularity

Future of D

The difficult part of D seems that people from very different fields are using it for different purposes (e.g., systems programming, web and game development, and numerical computing) and so their needs are often very different from others…

Though the situation is not the same as Fortran (because it focuses on numerical computing), I feel a similar situation exists also in Fortran, i.e. some people (including me) want more support for front-end programming/facilities, while some people think they are “kitchen sink(?)” or “fads” and Fortran should focus only on back-end capabilities.


Sure, but the fact that it is simpler to rerun code is a huge benefit for end-users too. The agility of Python, to me, boils down to the core developers’ dedication to serving end-user requests. This has been Guido’s explanation of its success (I can’t seem to find his blog post on this, sorry for the claim backed only by memory).
This is going to be very difficult for Fortran, due to the standards committee and the careful process that changes need to go through.
However, it may be done, see e.g. C++ which has a sharp release cycle every 3 years. While this would be very welcome in the fortran world, we need compilers to keep up :wink:

LFortran is definitely a push in the right direction.

There are pitfalls in every language. What makes Python popular is also its modularity, classes, and decorators. It really allows extremely versatile programs and re-use of code that is hard to get without complicated template mechanisms in fortran (to my knowledge at least). As for 2 vs. 3, that barrier goes away a couple of years down the road. :wink:

This I think is a minor issue: if in doubt, do the full path loading; if you can’t remember it, you have built a package that is too complicated :wink:

The static typing, I agree, is somewhat misplaced, only helping static checkers. As they wrote when they released it, “it is never intended to be used as static typing”, meaning they will never use it for actual performance increases… :frowning:
I agree that there are weird things with mutable objects in arguments and the numpy example, but consider this ridiculously often encountered bug in a huge amount of fortran code:

complex(8) :: z
real(8) :: a
z = (1._8/3., 2./3.)   ! the constants 3. and 2./3. are default (single) precision, losing digits
a = real(z)

I see this over and over, and for newcomers to fortran, this is just nightmarishly stupid :wink:
I really have to agree!

We could really go far with the fpm package manager and a good documentation tool. FORD is still in its infancy, not really scalable for large projects, and Doxygen needs new Fortran parsers.
Such tools that enable, more or less, standardization of documentation styles for codes would also help immensely!

I think the fortran community should aim for where it can have a large impact: high-performance tools with swifter development than C/C++ and less error-prone code for scientists. There are many pitfalls in C/C++ in terms of HPC data-structure usage. And debugging fortran code is just much easier :wink:


Great discussion, keep it going.

I have a question:

Is the issue that it is assigning a single precision value to a double precision variable? So the correct way to write it would be:

complex(dp) :: z
real(dp) :: a
z = (1._dp/3, 2._dp/3)
a = real(z, dp)

Or is there another problem? If it’s just the _dp, then that can be fixed by the compiler giving an appropriate warning, which is what we plan for LFortran.


I wouldn’t agree that the d situation is similar to that of fortran.

… admittedly all my d language knowledge is from reading wikipedia, but it appears that d is “yet another language” (YAL). And with a yal it is the same as with all stuff in life: some stuff gets traction and other stuff doesn’t, and the random part in that process is BIG.

however, fortran was once a giant, and there is usually no randomness involved when a giant shrinks to a dwarf … there are clear reasons.

imho fortran was the R of the ’70s to ’90s … with next to no knowledge about hardware and compilers you had a reasonably fast home brew to get some stuff done (e.g. calculate a mean). And it was difficult to f** it up, thanks to bounds checking etc. Trying to achieve the same in c would have required you to undergo an ordeal. fortran offered (and still offers) the option to make stuff superfast, but for that the learning curve is almost as steep as that of c … and you still don’t have the power of c (e.g. the infamous “realloc” which backs the whole “push_back” functionality).

Now home brew has gone out of the window (by and large) thanks to mkl and co, and the calculation of means has moved to R etc. For fortran to become THE hpc language, its access to the os and its memory management are too restricted. And “there is a lot of old code written in fortran” is imho not a value proposition for beginners to learn fortran.

In essence, in the context of the above developments, I cannot see the question “who are we” answered well enough to eventually survive and gain traction.

And making fortran look like python … fython will just be a new kid on the block with, similar to d, all the randomness involved in trying to gain traction.

This boils down to the fact that when phd students and postdocs ask me for advice on what to learn, I usually respond with “Python and C++”.

just my 5 cents.


@pmk, good point. I would like compilers to warn and recommend using cmplx and z%re.
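A sketch of that style (assuming a dp kind parameter is in scope):

```fortran
program complex_parts
   implicit none
   integer, parameter :: dp = kind(1.0d0)
   complex(dp) :: z
   real(dp) :: a
   z = cmplx(1.0_dp/3.0_dp, 2.0_dp/3.0_dp, kind=dp)  ! all parts double precision
   a = z%re                                          ! F2008 complex-part designator
   print *, a
end program complex_parts
```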

In the TIOBE Programming Community index, C is the most popular language, Python the third, and Fortran the 32nd.
C was invented to write operating systems (UNIX), and with embedded devices and the Internet of Things, it probably has a bright future.

That’s exactly the point:

c/c++ has its place in OS programming AND hpc for professionals (the interfaces to newer mkl facilities show that those too are written in c or assembly).

fortran HAD its place in hpc for beginners and professionals … but the beginners didn’t regrow because python came along, and the professionals have grown old.


IMO this is not constructive. Fortran is still one of the major HPC languages, and I am pleasantly surprised at how quickly Fortran 2018 features have been added to compilers such as Intel and gfortran. It’s my impression that many contributors here are below age 40 (I am 50.)