# What does Fortran need in 2020 to survive?
I asked myself this question late one Sunday evening, during multiple half-successful attempts to convert my collection of useful Fortran code snippets and install them into the Visual Studio Code editor. I had written a small converter for this purpose in the D language. Usually I take Python for such problems, but this time I took D just to test the usability of the VS Code and JetBrains plugins for D. Both work, though still with many small quirks. I decided to learn one programming language per year, and last year the choice fell on D, so this was my motivation to prove that I still had the required knowledge. D is a good example of another sad story in the programming-language world: despite multiple attempts to gain recognition, and a rather good ecosystem, the popularity of the language is stuck in prolonged stagnation. The most important reason is probably the strong competition from C++, which became especially clear after the introduction of the C++20 standard.

But I want to talk about Fortran today. To begin with: unfortunately, I have not been a Fortran developer for the last 5 years. But I was an active Fortran developer from 2001 until 2015. Starting with FORTRAN 77, I used Fortran 90, Fortran 2003, and later Fortran 2008, with a long list of compilers (g77, g95, gfortran, ifort, pgfortran, nagfor, etc.).
My tasks for Fortran were always plain and specific: large-scale computations for theoretical atomic physics. These are the fields where Fortran is still in good health: matrix operations, solving ODEs/PDEs, and parallelized math operations with MPI, OpenMP, or the PGI (now NVIDIA) compiler for GPU-accelerated CUDA computing.
I produced my Fortran 90–2008 codes routinely, without much effort, until 2015, when I failed to produce a usable program. Bad signs were already visible early on, during the prototyping phase, but I was a real stubborn donkey and didn't want to accept that moment of truth. I won't go into detail, but the task was to develop a complex multiscale physical model. For that kind of problem I needed advanced data structures (trees, hash maps, linked lists) and containers of combined complex data structures. A few design patterns (prototype, factory, state, strategy) were needed too. None of that existed in Fortran. There is still no Fortran book on OOP, or one with examples of design patterns in Fortran. I spent almost 2 weeks developing and testing a simple hash map in Fortran, and another 2 or 3 on a basic red-black tree. Then I implemented 3 of the 5 needed design patterns and started prototyping the program. In total I spent more than 3 months just reinventing wheels in Fortran. In the end, a Ph.D. student finished a prototype with the same functionality in a week, using D. That was the hardest fall of my vision of and trust in Fortran, and it brought complete enlightenment: I recognized that the language is very limited and, for this use case, has fallen like Carthage. No future for Fortran here. Period. The 2018 standard only confirmed my belief. No change came to this swamp, just stagnation...
So here is my short and incomplete list of Fortran problems:
##### Problems of the Fortran language itself
* Missing standard library. There is no choice: only very basic libraries of questionable quality on GitHub, almost all without any internal unit testing. Continuous reinvention of wheels is needed for advanced data structures.
* Missing built-in unit testing. There are some external modules for this purpose, but if we still position Fortran as a formula translator, we need unit testing built directly into the language.
* Missing exception handling. A Fortran program cannot survive any kind of serious exception. For critical software, this is unacceptable.
* Missing unsigned integers / bit strings for cryptography and networking. This significantly complicates the writing of interfaces (another sad story) to existing C libraries.