Fortran Misconceptions

Found this today. I agree with some of this and disagree with other parts. For example, the idea that GPU offloading will be a weakness for Fortran is a misperception.

The real problem is the schools that do not teach Fortran because they are so biased toward Microsoft and C# business applications, where the default employment go-to for “IT” folks involves C#. It’s hard to fight the Microsoft gorilla.


C# (or Objective-C, or Swift… all platform-dependent environments) are not direct competitors of Fortran. Its competitors are rather C++, Julia, Python…


Not direct competitors, true, but I know of one large legacy Fortran application being rewritten in C# because of the abundance of C# programmers. While Fortran’s main application domain may be the most important one to maintain a foothold in, adjacent domains are also important.

They may be competitors… at school, since the number of languages students can learn during their studies is limited.
In a perfect market, the wages of Fortran developers should increase as they become rarer, and that should attract more people. But many other factors come into play…

I agree with this sentiment. Additionally, one may consider GPU offload a weakness of Fortran when compared to something like Python, where commonly used libraries such as TensorFlow, PyTorch, and NumPy will automatically parallelize many operations (on CPU and/or GPU).

Additionally, my understanding is that the original appeal of FORTRAN was: “it’s easier than assembly, but can be just as (or even more) performant!” That was true in the 1950s, but it is not true, or at least not sufficient, today. The above scripting languages (and even things like R or MATLAB) offer comparable, or in some cases strictly superior, performance to Fortran intrinsic procedures for common matrix-vector operations. Sure, someone can write Fortran that will compile to something as performant as what MATLAB or Python with NumPy gives you out of the box, but that is NOT as easy to write as the alternatives available today. If the last bastion of Fortran is maintaining legacy codes and the fact that we can compile do loops to efficient assembly, then the language’s outlook is pretty bleak.
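To make the “out of the box” point above concrete, here is a minimal sketch (names and sizes are my own illustration, not from any post in this thread): a hand-written matrix-vector product in plain Python loops versus NumPy’s `@` operator, which dispatches to an optimized BLAS routine without any effort from the user. The two give the same answer; the one-liner is what Fortran is being compared against on ease of use.

```python
import numpy as np

def matvec_loops(a, x):
    """Naive hand-written matrix-vector product, loop by loop."""
    n, m = a.shape
    y = np.zeros(n)
    for i in range(n):
        s = 0.0
        for j in range(m):
            s += a[i, j] * x[j]
        y[i] = s
    return y

rng = np.random.default_rng(0)
a = rng.standard_normal((200, 200))
x = rng.standard_normal(200)

y_loops = matvec_loops(a, x)
y_blas = a @ x  # one line; NumPy dispatches this to optimized BLAS

# Both styles compute the same result; the difference is effort and speed.
assert np.allclose(y_loops, y_blas)
```

Actual timings are machine- and BLAS-dependent, so this only contrasts the two styles, not specific numbers.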


The voting members and other influential members of J3 and WG5 mostly appear to disagree with this. They see everything as cost only and, barring a few trivial enhancements, vehemently dismiss most proposals that could otherwise make Fortran better for adjacent domains, or pursue them so slowly that the result is a half-baked product, too little too late.

Hope things change with Fortran for the better soon.

What do you think is the best way to remedy this situation? Personally, I want Fortran to live on and grow as a language. The attitude you describe as governing the standard is really disastrous to that end. A future where Fortran is relegated to maintaining HPC codes from the previous century would be very sad.

For Fortran to not only survive but grow as a language, people should be excited and motivated to start new projects in it. But the barriers there are high. Limited ability to interact with the system, and absolutely no ability to display anything beyond the console, are massive blockers to new, useful projects using Fortran as the primary language.

As “the first high-level language,” Fortran has almost none of the high-level language features a new user expects. If the desire is to stay like that, then we should instead aim to add lower-level features, giving the user more control and the ability to easily and fully utilize modern hardware without relying on compiler optimizations that may never come. Fortran has excellent array-handling capabilities, but it is nonexistent in the graphics, game-engine, machine-learning, and data-analysis domains, all of which place heavy emphasis on performance.

New code seeking performance is predominantly C++ from what I see.

Here we go again…

For several months, I’ve been developing a closed-source (sorry :crazy_face:) artificial intelligence project written from scratch in pure Fortran, which I want to use for business purposes (hopefully my business plan will work, but this whole project is a crazy experiment I’ve dedicated months of my life to, with no guarantee of success yet :grin:).

In essence, it’s a neural network with some recurrent elements analyzing large sets of numbers. Months ago I started the development in Rust, the programming language I feel most comfortable with :sparkling_heart:. But early on I ran into performance issues running the neural network on my laptop. As an experiment, I rewrote the neural network in Fortran (ChatGPT sped up the translation of some code pieces, despite making lots of mistakes), and I found the Fortran implementation to be several times more performant than the Rust code (especially when compiled with the -Ofast -march=native flags).

Possibly there exists some way to optimize the Rust code or compilation process that I have not fully explored. But there’s still another reason why I’ll stick with Fortran. When writing a “classical” if-then-else algorithm, I’d rather employ a language like Rust, which is very well structured. I feel that the development of my neural network was in many regards different from writing a classical program, because rather than planning, it was based on experimentation: testing hundreds of (almost random) neural network architectures and settings, looking for the one that in some miraculous way (which I don’t fully understand) gives the right output. I’d describe it as natural evolution through thousands of experiments failing 99.9% of the time, rather than the intelligent design of a classical algorithm.

So, rather than nicely structured code (like that in Rust), what I needed was easily edited code (like that in Fortran). Folks, if you want to do artificial intelligence, think of (1) the performance and (2) the editability of your code. Fortran offers a solution that is both highly performant and easily editable. It is perfect for AI. :hugs:
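The experimentation loop described above (trying hundreds of almost-random configurations and keeping the best) is essentially random search over hyperparameters. Here is a minimal sketch in Python; the `score` function and the parameter names (`hidden`, `lr`) are hypothetical stand-ins, since the actual project and its objective are closed-source.

```python
import random

def score(config):
    # Hypothetical objective standing in for "how well does this
    # network configuration perform on the data". Here we simply
    # pretend the best setting is hidden=64, lr=0.01.
    return -abs(config["hidden"] - 64) - abs(config["lr"] - 0.01) * 1000

random.seed(0)
best_config, best_score = None, float("-inf")
for _ in range(200):  # hundreds of (almost random) trials
    config = {
        "hidden": random.randrange(8, 256),          # layer width
        "lr": 10 ** random.uniform(-4, -1),          # learning rate, log scale
    }
    s = score(config)
    if s > best_score:
        best_config, best_score = config, s

print(best_config)  # the configuration that scored best across all trials
```

The point is that this style of development rewards code that is quick to mutate and rerun, which is the “editability” the post is arguing for.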


This is inspirational. I am also planning to write a neural network program from scratch in Fortran. It seems that you are doing hyperparameter tuning in Fortran.
