With the advent of frameworks like Electron, Wirth’s law is turning out to be very accurate! I feel I spend a significant amount of my daily working hours getting annoyed at how slow MS Teams is…
software is getting slower more rapidly than hardware is becoming faster.
Indeed, in software development we are often encouraged not to worry about this. Oft-quoted sayings include “developer time is more expensive than computer time”, “just wait five years and computers will be fast enough”, and “make it work, make it right, make it fast”, after which developers run out of time for that last step.
Having to wait a long time between program runs reduces developer productivity. My Python programs were taking a few seconds to run but now take a few minutes as functionality has been added. I will try replacing some pandas code with NumPy, but if that does not work I may rewrite the time-consuming part in Fortran, perhaps aided by Pyccel.
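A minimal sketch of the kind of replacement I have in mind (the DataFrame and the computation here are hypothetical stand-ins, not my actual code): a row-wise pandas `apply` pushed down to a vectorized NumPy expression over the underlying array.

```python
import numpy as np
import pandas as pd

# Hypothetical data standing in for the real workload
df = pd.DataFrame({"x": np.arange(100_000, dtype=np.float64)})

# Slow path: .apply calls back into the Python interpreter for every row
slow = df["x"].apply(lambda v: v * v + 1.0)

# Fast path: the same arithmetic on the underlying NumPy array,
# executed in compiled loops instead of per-element Python calls
fast = df["x"].to_numpy() ** 2 + 1.0
```

Both produce the same values; only the slow path pays the per-element interpreter overhead.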
A few comments on this discussion. There seems to be some conflation between the brevity afforded by a library and that afforded by the language itself. It is always possible to write methods that do several things at once for the user and hide the complexity under the hood. Python benefits from a large developer base that builds a lot of these interfaces.
A few comments on why Python has succeeded now:
computers are faster, so byte-compiled languages can be productive even if relatively slow
leveraging existing code: nearly the entirety of numerical methods in Python is an interface overlaid on Fortran (and increasingly C) libraries. The Matplotlib and NumPy designs owe a huge debt to MATLAB.
interactivity: it is always easier to build code with the faster dev cycle of non-compiled languages. See BASIC, IDL, MATLAB, R, etc. As computers get faster, the reach of these languages increases.
Re: plotting. For compiled languages my preference is generally to separate model and view, but that’s partly because my codes tend to run for hours on 10000s of cores in batch systems. But there are many plotting libraries for Fortran and other languages, such as PLplot, Giza (a PGPLOT-like interface), etc.
Arrays are part of the Fortran language. The standards committee is very reluctant to make breaking changes, and in the rare cases they do, compilers continue to compile the non-standard code with options such as gfortran’s -std=legacy. The base of scientific computing in Python is the NumPy array library, which is not part of the Python language, and which is still changing in incompatible ways. Even if the changes don’t break the code you write, they may break some libraries you use.
This backwards-incompatible change of the fundamental package for scientific computing with Python will certainly be annoying for many developers and users, but I’m pretty sure it will not stop the rise of Python in scientific computing. If this prophecy is correct, NumPy could serve as a role model for Fortran, and backwards-incompatible changes will be seen as a chance to correct mistakes rather than as the fastest way to kill Fortran.
One aspect that has not been discussed so far is the support for array syntax: I find it super annoying that Fortran uses the same round brackets for function calls and access to array elements. This is IMHO a big design glitch and it’s one aspect where Python+NumPy is much better.
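For comparison, a trivial Python snippet showing what I mean (the names `f` and `a` are made up for illustration): indexing and calling are syntactically distinct, so the reader can tell them apart at a glance, whereas in Fortran both would be written `f(1)` and `a(1)`.

```python
import numpy as np

def f(i):
    # a function: called with parentheses
    return i + 1

a = np.array([10, 20, 30])

value = a[1]   # array access: square brackets -> 20
result = f(1)  # function call: parentheses -> 2
```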
I’m not surprised that the reason for this is historic, but I’m also not surprised that people prefer to start projects with languages that don’t have limitations caused by long-forgotten computer architectures.
If python users enjoy spending time updating their codebases, good for them. Personally I enjoy not having to do so with my Fortran codebases.
Most compilers support flags like -std=f2018, so there is no need to update a codebase if new features are not needed. But the expectation that code from the 80s compiles with -std=f2023 without changes is IMHO irrational and one of the reasons why parts of Fortran are so archaic. Rust has editions to establish “Stability without stagnation” (What are editions? - The Rust Edition Guide) and I don’t see why Fortran developers should not be able to work with well-documented and communicated breaking changes.
Yes, but as long as the old features remain supported, and that’s the whole point. And since you took Python as a reference, the point is that the Python codebase has to be updated.
The existing codebase should compile safely, with a well defined behavior. And this requires the old features to be accurately described in a reference document, say the standard.
Again, you were initially talking about Python, which has nothing like “editions” and has introduced backward incompatible changes in the worst possible way (silent changes of behavior).
Now, if you are talking about Rust it’s different. I agree that the approach used in Rust looks interesting (on the other hand, Rust does not have a 60+ year history), and that one should try finding a way in Fortran to retain the old features (up to a certain point) without hindering evolution.
Other than a default implicit none, what incompatible changes would you suggest? Restrictions on storage order or aliasing? Do any obsolete features of Fortran interfere with desirable improvements?
The standards committee has made some breaking changes, but I think they know that compiler writers will not remove those features from the compiler. If you want to use maintained versions of Python and especially its libraries, you may periodically need to change your code to keep it working.
Not necessarily. For NumPy, one can require a version with numpy<2.0 in the build system. The same holds for the Python version: most libraries have a requirement like python_requires = >=3.8 because they want to use newer features. There is also the option to specify an upper bound, for example because Python regularly deprecates functionality from the standard library.
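As a concrete sketch, a pyproject.toml fragment expressing both constraints (the package name here is hypothetical):

```toml
[project]
name = "my-science-pkg"    # hypothetical package name
requires-python = ">=3.8"  # minimum interpreter version
dependencies = [
    "numpy<2.0",           # stay on the NumPy 1.x series
]
```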
I agree that practically, Python code is regularly updated. But with enough time for the updates, clear plans, and modern tooling (testing, CI) this seems to be fine for most Python developers. Probably the benefits of new features outweigh the obvious disadvantages of breaking changes.
yes, but I don’t see a reason why a compiler vendor should not simply say: -std=f95 expects code that conforms to the standard published in 1995, and -std=f2023 does the same for the current version.
implicit none is certainly the most prominent example. Besides that, I would say that
using () for array access is very uncommon. Unfortunately, [] are already used for co-arrays, but considering that almost every Fortran code uses arrays while co-arrays are hard to spot in the wild, a migration to [] for arrays and {} for co-arrays would be a big improvement for most users.
having enum and enumeration type is confusing.
! for comments makes it impossible to have the de-facto standard != for inequality, which in turn means that in-place division /= cannot be implemented.
I don’t think that the kind selection for integers/reals is very intuitive; relying on IEEE 754 would make sense.
The definition of pure functions is less strict than the common usage (Pure function - Wikipedia). Adjusting the requirements instead of introducing simple procedures would have been the better option.
Removing obsolescent features instead of keeping an ever growing list. These features can of course still be used if an older version of the standard is requested.
Having case-sensitive code (or even unicode support, see Julia) would make it easier to find good variable names.
Besides the concrete examples, I think the main takeaway message from Python (and other software ecosystems) is that people can deal with breaking changes, especially if they are clearly communicated. Therefore, I absolutely disagree with the “breaking changes will be the death of Fortran” attitude. Breaking changes come with a cost, but this cost is not infinite.
Do you have an example for that? My impression was that breaking changes in Python are clearly communicated.
But if even breaking changes introduced in the worst possible way do not stop people from using Python, why the fear that well-communicated and carefully selected breaking changes would be a problem for Fortran?
This only works as long as:
numpy 1 is maintained (that is, at best a couple of years)
you don’t need libraries that have themselves switched to numpy 2
Two problems:
The standard doesn’t address the compilation options at all.
Formally, the Fortran 95 standard doesn’t exist anymore: it has been superseded by the Fortran 2003 standard, then 2008, and so on.
This is by far the craziest change suggestion I have read. It doesn’t look like a good idea to me, because it would make it impossible to use {} to delimit blocks of code.
People can deal with a lot of things when they don’t have a choice. I can see an analogy with consumer tech devices: vendors often make devices artificially obsolescent by stopping support earlier than technically needed. As a consequence, people tend to renew their equipment faster than really needed: yes, they deal with it, but it still doesn’t make a lot of sense.
Flu is harmless to young and healthy people. But it can kill someone who is already weak because of another pathology.
There are two types of backward incompatible change:
the old code won’t compile or will issue an error/crash at execution
the old code will compile and run apparently fine, but will produce wrong results (silent change of behavior).
True. This statement is a good opportunity to come back to the original topic: Python (and C++ and Julia) successfully challenges Fortran in its core domain. So apparently people are not happy with Fortran. My personal opinion is that backward compatibility at all costs is something that weakens Fortran, while you and others believe that it strengthens Fortran. There will never be a definitive answer to this dispute because the topic is too complex, but the example of Python shows that breaking changes alone are not the reason for a programming language to become unattractive.
This is not a law of nature, it’s a man-made decision that can be revised.
Is it that simple? A recent discussion here pointed to a blog article showing that the majority of CPU time on supercomputers goes to codes written in Fortran.
There’s a fact, however: there exists a huge legacy codebase in Fortran, this legacy codebase is still heavily used, and this is an important reason why Fortran is still alive as of today. So, one has to be extremely careful with any decision that could make these codes harder to compile and use.
Plus, the feedback from committee members is usually that the main bottleneck preventing the standard from evolving faster in terms of new features is the lack of manpower to do the actual work in the committee, and also that the compilers are generally lagging behind anyway (and the ones that are not are not really proactive about proposing extensions that could prove to be good candidates for standardization, as happened in a more distant past).
It’s not to say that the backward compatibility requirement has no impact, but I definitely think it is often overrated compared to other reasons.
I agree, and something similar to the Rust editions would be desirable.