The counter-intuitive rise of Python in scientific computing

On the topic of Fortran competitors and alternatives, this article may be of interest: A new release for GNU Octave

Understanding Octave’s subsequent development requires some awareness of another tool, MATLAB, which was also created as a Fortran alternative. MATLAB has an interesting origin story: it was created to let students use Fortran’s linear algebra routines without having to write Fortran programs. Its authors later turned it into a commercial product, which became quite successful. This powerful (and expensive) system is now used by millions of engineers and scientists all over the world. That created a population of people doing numerical computing who had grown used to MATLAB’s syntax but lost access to it after leaving their employers or institutions; beyond that, some people were simply interested in free-software alternatives.

I have read here about work on a Fortran REPL, which would be similar to Octave. I think it should be as easy to call LAPACK and other linear algebra libraries from Fortran as from other languages. The Fortran standard library project currently has linear algebra procedures to create identity matrices and to get the diagonal and trace of a matrix. Should solving linear equations and eigenvalue problems be added, or is that outside its scope?

I don’t think the C community has self-doubt issues. C was designed to be a higher-level assembly language for writing system software and is well suited for that purpose. Unlike Fortran, which keeps the same language name for every revision, major revisions to C result in a “new” language with a new name. So people dissatisfied with C’s feature set will just move to C++, Objective-C, or C#. There is no reason to layer higher-level abstractions onto C when you already have C++.


Linear equation and eigenvalue solvers (calling LAPACK) are indeed desired. In fact @certik has already written many of these linear algebra convenience wrappers (see the linalg module in fortran-utils). The biggest hurdle IMO is finding the volunteer hours to adapt the code, write the unit tests and documentation, and reach a community compromise on the desired interfaces of the routines.
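For readers wondering what such convenience wrappers feel like in practice, NumPy’s linalg module wraps the same LAPACK routines (the `*gesv` and `*syevd`/`*geev` families) behind one-call interfaces; a minimal sketch, using NumPy as a stand-in for the kind of API under discussion:

```python
import numpy as np

# Solve the linear system A x = b
# (NumPy dispatches to LAPACK's *gesv under the hood)
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])
x = np.linalg.solve(A, b)      # -> [2.0, 3.0]

# Eigenvalues of the symmetric matrix A
# (LAPACK's *syevd/*heevd family)
w = np.linalg.eigvalsh(A)      # -> (5 - sqrt(5))/2, (5 + sqrt(5))/2
```

A Fortran stdlib equivalent would presumably look much the same: one generic `solve(A, b)` and one `eigvals(A)` entry point, with the interface debate being mostly about error handling and which matrix structures get specialized paths.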

Just for perspective: it took SciPy 16 years to reach version 1.0.0, containing roughly 600,000 lines of code. The project has 600 contributors, with ~100 unique contributors in each 6-month development cycle. The API consists of 1500 functions and classes covered by 13,000 unit tests. The estimated development costs are “in excess of 10 million dollars”. Yet 25% of the LOC in SciPy are “just” old Fortran codes from Netlib. (Source:

It would be interesting to compare this with the cost/price of the NAG and IMSL Fortran libraries.


The most recent major revision of C is the C11 standard, and the most recent minor revision is C17. The next version is currently known as C2x. C is a standardized language, like Fortran:

C++, C#, and Objective-C are “children” of C: derived from it, or at least inspired by it.

Though it may be “YAL (Yet Another Language)”, I think D is interesting because it has some features in common with Fortran (particularly whole-array operations on built-in static and dynamic arrays). I don’t know why Walter Bright (the original inventor of D) became interested in that, but it is very convenient for numerical computing (IMO).

For comparison, std::vector in C++ does not have that facility, though third-party array libraries support it (with somewhat clumsy syntax…). I imagined that C- (or CS-) oriented people were essentially not interested in such things, so I was surprised that D supports it built in (plus the library called “Mir” provides NumPy-like multi-dimensional extensions). Also interesting is that their standard library is based on “ranges” (an extension of iterators), which work directly on arrays and are fun to play with (just for fun :slight_smile: ). (C++17 or 20 seems to have introduced a similar range facility, but I’m not sure how nice it is…)

RE array slices, C# also seems to have introduced the “..” range operator recently,

so I feel the impact of NumPy (or ML) is really large here… (though I guess the performance is not very high yet).

Recently Julia, Nim (via the Arraymancer library), and other languages have also gained nice and efficient support for multi-dimensional arrays (with whole-array operations), so I think it is no longer easy to claim that arrays are a “killer feature” of Fortran. (Nevertheless, the array syntax of some languages and libraries is still clumsy (IMO), so I feel Fortran’s arrays are very intuitive to use.)
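The whole-array style being discussed, familiar from Fortran expressions like `a = b + 2*c` and slices like `a(1:n:2)`, is exactly what NumPy popularized; a small illustration of both:

```python
import numpy as np

b = np.array([1.0, 2.0, 3.0])
c = np.array([10.0, 20.0, 30.0])

# Whole-array expression, no explicit loop
# (analogous to Fortran's  a = b + 2*c)
a = b + 2 * c                  # -> [21.0, 42.0, 63.0]

# Strided slice, analogous to Fortran's  a(1:3:2)
first_and_last = a[::2]        # -> [21.0, 63.0]
```

D’s built-in arrays, Julia’s broadcasting, and C#’s new range syntax all give some subset of this; the differences are mostly in how terse the slicing notation is and whether the loops are fused efficiently.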


The issue is not which language is best, but rather which language the colleges and universities teach. CS departments generally don’t teach Fortran at all (most of the faculty do not know the language).

Out of curiosity, do you have any ideas why Fortran is not taught in CS courses? (FYI, I also use Python in a university course, for many reasons, even though I use Fortran myself for my own programs…)


I guess it would also be useful to check the evolution of long-term popularity (further down the linked page), which is very similar to that of Ada and Lisp. (There also seem to be rather large statistical fluctuations; e.g., Fortran was once close to dropping out of the ranking there, IIRC…) But I guess FortranCon, Discourse, and other efforts may have played a considerable role in raising its ranking back to the current state (just my guess…). I’m not saying this kind of ranking is the most important thing; it’s just how I read that index page.


What do you mean by “constructive”? The title of the thread is “the counter-intuitive rise of python…”. In my posts I am trying to lay out why it is actually the exact opposite: very intuitive. In that regard my posts are very constructive to the thread topic. But trying to show that the rise of python is very natural may lead to some unpleasant insights into why fortran has shrunk so much. However, it points to the, IMHO, still unanswered question for a beginner: what is the value proposition that sets fortran apart?

With regard to the 2008–2018 standards: I have desperately awaited every little piece of them, and very often it was “too little, too late”, especially in the OOP arena. The projects I am running at the moment are only compilable with Intel because gfortran hasn’t caught up, PGI is nowhere near gfortran, and for NAG I don’t have the money. But Intel has taken weeks out of my lifetime just for tracking down ICEs (internal compiler errors). Just recall the disaster around the v18 release.

BTW, I am under 50 as well… but I think fortran and this community still have to reach critical mass.


Fair point, which further erodes the case you have to make when explaining to novices why they should learn fortran.

Considering a 50-hour course for beginners, teaching Python allows you to address more complex problems, because the syntax is simple and you can easily access powerful libraries.

But what is interesting when you teach Fortran or C is that the students must understand more deeply how a computer works. For example, in Python you can compute with arbitrarily large integers. In Fortran or C, you need to understand that there are 8-, 16-, 32-, and 64-bit integers… with a limited range.
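The contrast is easy to demonstrate. Python integers grow without bound, while a fixed-width type like Fortran’s `integer(int64)` or C’s `int64_t` wraps around; a small sketch that simulates 64-bit wraparound in pure Python (the helper name `wrap_int64` is my own, just for illustration):

```python
# Python integers are arbitrary precision: this does not overflow
big = 2**64 + 1

def wrap_int64(n):
    """Simulate a two's-complement 64-bit signed integer."""
    n &= (1 << 64) - 1                          # keep the low 64 bits
    return n - (1 << 64) if n >= (1 << 63) else n

largest = wrap_int64(2**63 - 1)   # largest int64:  9223372036854775807
wrapped = wrap_int64(2**63)       # one more wraps to -9223372036854775808
```

In a Fortran or C course, that wraparound is exactly the kind of behavior students have to confront; in a Python course it simply never comes up.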

So teaching a “low-level” or a “high-level” language is different. In the first case you must go deeper; in the second case you can “fly” higher…

I think it’s a good thing to learn at least one high level and one low level language. I don’t know if there is a best order…
All I know is that teaching any language has always been a pleasure for me…


Some (myself included) would argue that it is good for a programmer to understand how a computer works.


The CS faculty decide what they teach. If none of them knows Fortran it is unlikely they would choose to teach it. (And if FORTRAN 66 is all they know, it is probably better that they don’t teach that.) Typically, at a university Fortran classes will be taught (if at all) in the Engineering College or Meteorology department.


Yes, C, C++, and Fortran all have ISO standards. And C continues to evolve, mainly tracking changes in computer hardware. But adding a major new programming model, like OOP, to C would seem unnecessary, since C++ already covers that.


I see the “implicit none” debate continues above. It is a source of continual annoyance that I have to pepper my code with implicit none and, worse now, implicit none (type, external). All that “old code” that would break is already peppered with lots of other stuff requiring non-standard language features and specific compiler options to make it work anyway, so having those folks add one more option is surely not a big deal. I am teaching someone Fortran at the moment, and I found explaining implicit typing quite awkward and embarrassing…


Why embarrassing? There is no point in feeling ashamed of some historical language developments. Personally, I would much rather adopt the stance from Mark J. Lorenzo’s book Abstracting Away the Machine: The History of the FORTRAN Programming Language:

But all HLLs owe a huge debt of gratitude to FORTRAN, …, which was the first mature high-level programming language to achieve widespread adoption. Many programming practices we take for granted now came about as a result of FORTRAN.

The birth of FORTRAN planted a seed that led to the full flowering of high-level languages, since FORTRAN overcame initial skepticism by demonstrating to the world that a well-made HLL really could abstract away the machine.

You might also take note of a recent tweet by Guido van Rossum:


I discovered this only recently while playing code golf (I recommend starting with FizzBuzz in Fortran). While I recognize that implicit typing is generally a bad idea, I was totally amazed by how short programs can get.

Well, embarrassing because it makes the language look a bit naff and stuck in a time warp. A bit like when I was a kid: my dad’s TV remote control was telling me to go to the TV and press one of the four buttons that selected the channel. I feel we should have moved on in Fortran as we have in all other aspects of life. I agree the history of Fortran is amazing, and I remember back in the 80s when my boss insisted on compiler options that enforced explicit typing. I thought it was a real pain to declare all those variables, but I quickly learned how much better the code became for actually doing it.


My dirty little secret: For quick one-offs where I just want to see how something works or what it does, and I know I won’t share the code with anybody, I use implicit typing.

On a related topic, I sometimes wish Fortran had inferred typing. It would make simple programs even simpler. But of course this wouldn’t be backward compatible with implicit typing.


No! Never ever ever … :slight_smile:


I think I have never used that Fortran feature in 25 years… :face_with_monocle:

Although I have been familiar with not declaring the type of variables in Python for ten years… :snake:

:space_invader: And I was also familiar with that “technique” when I learned programming at the beginning of the 80s (BASIC), although string variables were suffixed with a $ sign: A$="Hello world". Perhaps one reason was that pocket computers had only around 1 kB of RAM: declaring Integer :: i would have eaten 12 bytes, around 1% of the available memory! Although that’s perhaps not exact, because I think instructions were probably stored in RAM not as ASCII but as single bytes (there were only a few tens of instructions).

And in later BASICs, like GFA Basic, integers were suffixed with a % sign. Then in Visual Basic, variables were declared with this syntax: Dim Count As Integer
