Is the Fortran Standard crucial for the future of the Fortran language?

There are many programming languages that are national or international standards, and there are many that aren’t. There are successful and thriving languages on both sides. Just curious what you think: is the Fortran standard crucial for the future of the Fortran language?

  • Yes
  • No

0 voters

1 Like

IMO, there’s pretty clear evidence that it already isn’t: if you look at what current compilers implement, none of them actually gives you what the standard says they should.

2 Likes

No C++ compiler implements the C++ 2017 standard either. But they, like Fortran compilers, are moving closer to compliance (at varying speeds). At least with a standard we know what is a compiler bug and what isn’t.

What would the Fortran language look like without a standard? A different language for each compiler? I suspect language evolution would be even slower because implementors would be reluctant to implement features not found in other compilers, because their users would be reluctant to use them, because they wouldn’t be portable.

2 Likes

This isn’t too far from what fortran programming was like up through f90. There were few compilers that were themselves portable in any real sense. Maybe f2c would count as a portable compiler. But most other fortran compilers were supported by hardware vendors, and they ran only on that one brand of machine. The standard itself was very limited, so vendors routinely added features as language extensions that their customers wanted. These included various ways to do things like direct access i/o, or even to associate unit numbers with external files. The fortran language itself did not include things like random number generators or date and time queries, and the mathematical functions were limited to only the most common ones like trig functions, log, exp, and sqrt.

Vendors then used these compiler-specific extensions to try to lock in their customers. If a customer (which might be an individual, a university computer center, a large corporation, or some government agency) had a large code, say 10^5 to 10^6 lines, filled throughout with vendor-specific extensions, and they contemplated changing hardware vendors, they had to cope with changing all of that code. The more the customers used those extensions, the more they were locked in, and the more the vendors liked it.

From the programmer’s point of view, it was actually difficult to write portable code that accomplished nontrivial tasks, for those reasons. One approach that was tried was developing fortran-like languages such as ratfor and sftran (there were others; those are the ones I used) that would compile to more or less standard fortran. The first LINPACK codes (and MATLAB, I think) were written in something called TAMPR. In addition to higher-level constructs (if-then-else and select case, for example), these pseudolanguages allowed easy porting between single and double precision, something that was relatively difficult to do in standard fortran before f90 (with its KIND facility). However, the use of these kinds of tools complicated the whole code compilation process, and they made it difficult for programmers to share their codes with others because these tools were not universally popular.
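
Just as an aside for younger readers, here is a minimal sketch of the KIND approach that f90 finally gave us; the module and parameter names are merely illustrative. Switching an entire program between single and double precision becomes a one-line change, which is exactly what those old preprocessing tools had to fake with text substitution.

    ! minimal sketch: one kind parameter controls the working precision
    module precision_mod
      implicit none
      ! change this single line to switch precision (e.g. 6, 37 for single)
      integer, parameter :: wp = selected_real_kind(15, 307)
    end module precision_mod

    program demo
      use precision_mod, only: wp
      implicit none
      real(wp) :: x
      x = sqrt(2.0_wp)
      print *, x
    end program demo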

There were programmers who programmed on machines from one vendor for their whole career. Someone might have worked for 20+ years only on IBM mainframes, or only on CDC machines, and so on. Unless they made an effort, they were almost helpless when confronted with fortran from another vendor. I probably still have 25 fortran manuals from various vendors on my bookshelf from that time. I did try to write portable code, and I did try to keep up with all the language extensions, but it was difficult to be really fluent with more than a handful of compilers at a time.

Then somehow things changed, and compilers like g77, NAG, Absoft, and others became available that each ran on various hardware. In the late 1980s, even hardware vendors began to support g77, for example. I credit g77 and later gfortran for keeping the language afloat. I’m pretty sure that without those free compilers, the language would not have survived to the present time. Of course, there are still vendor-specific compilers too; the Intel compilers are examples, but they now support their company’s architecture on a wider variety of machines than before, so even the vendor model is different now. IBM did that for a while too, supporting PowerPC machines from various vendors, but now I think they have reverted toward their 1960s model.

So I do think that a fortran standard is an important part of the language’s viability now. Programmers are more concerned with portability now, partly because computing hardware is more of a commodity endeavor and companies spring up and die off all the time, so they naturally avoid vendor lock-in. However, as was pointed out in the original post, the standard definition might not need to be an ANSI or ISO standard. It might also be an industry standard, like IEEE, or it might be just a broadly supported community standard, like CMake, to give one example. I think the fpm project follows the CMake model. In any case, I do support people working together, hashing out ideas, and hopefully moving forward. If the language effort becomes fragmented, it will die.

6 Likes

Those who can read know that these options only prevent you from using features that are not in the standard; they give no guarantee that you can use all the features that are in the standard. I guess the latter is what @oscardssmith had in mind.
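
To make that concrete, here is a minimal illustration, assuming gfortran (other compilers have analogous options). The conformance flag rejects an old vendor-style extension, but accepting the flag says nothing about whether every feature of that standard revision is actually implemented:

    ! demo.f90 -- compile with:  gfortran -std=f2018 demo.f90
    ! -std=f2018 rejects the nonstandard REAL*8 declaration below,
    ! yet passing the flag does not mean the compiler supports all
    ! of Fortran 2018.
    program demo
      real*8 :: x      ! vendor-style extension, flagged as an error
      x = 1.0d0
      print *, x
    end program demo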

1 Like

Just to clarify: I voted “no”, it is not crucial. However, I believe that it’s very important.

4 Likes

I am old enough to remember those days (The Classic Mechanic: MM, AF, BSW, BSF Spanner Conversion Chart). Standards are pretty vital.

1 Like

I voted yes, because I feel standards add portability to code used across different compilers. However, I also feel standards should not be stagnant; they should be a little dynamic and keep up with the times.

5 Likes

This comes down to, “For whom Fortran, for what?” If one limits the vision of Fortran to individual or small-team pursuits of programming toward one’s hobby, individual/small business, consulting, study, or research, it can lead to one view, which appears the most common among the readers who post here and who constitute but a minuscule fraction of those doing programming. But what about others?

Related to the question in the original post, perhaps at a more fundamental level are the questions

  1. Is an ISO/IEC standard crucial for scientific and technical computing?
  2. Is Fortran crucial for scientific and technical computing?

My own experience, limited of course to certain domains in industry and their interface with academia and independent/governmental research, indicates no, absolutely not, the standard is not crucial.

However, there are certain considerations that come into play in practice; e.g., the areas of industry where I have experience are driven (unsurprisingly to most) by considerations such as business profitability and agility, the latter relating to aspects such as business growth, adapting to changing circumstances, various competing forces, and new directions and markets.

Development of scientific and technical solutions to an ISO/IEC standard is generally seen to simplify programming work practices and the workflow toward support, maintenance, and continuous improvement of the computing solutions that help the business enterprise. The standard then becomes a good productivity aid in several respects: consistency, portability, etc. are provided by the standard. As such, programming practices strive to adhere to the standard as much as is reasonable, and any nonconformance becomes part of an exception process undergoing separate review and approval authorization, though that can often be informal.

An ISO/IEC standard that supports and evolves with the needs, e.g., one that minimizes the items that end up on the above exception list, retains its usefulness and relevance. Should too many items end up being exceptions, the standard will obviously start to get overlooked and slide into irrelevance, or get replaced by another standard.

Back to Fortran: the problem with its standard has been “too little, too slow”. As things stand, there are hardly any scientific and technical solutions today for which Fortran is anywhere close to being a crucial tool. This is the complete opposite of the scenario from the 1960s through the 1980s, when many viewed Fortran as the only option for them.

Any solution, on any platform or domain of interest where Fortran could be the programming language, can now in principle be approached with another language or paradigm that offers significant productivity and agility gains in some respects, though it may come with other costs (often due to complexity). This is especially true of modern C++, which also has an ISO/IEC standard. The present benefit-cost dynamics have Fortran losing out considerably.

A Fortran standard that advances too slowly and insufficiently further erodes the few traditional advantages of using Fortran, such as program performance and a simpler, consistent, and portable codebase. The outlook on agility as well as productivity from sticking with Fortran is getting poorer, even as modern Fortran retains so much promise and elegance.

The slowness and the limited nature of the feature sets in newer Fortran standard revisions curtail the ability to adopt consistent and portable programming practices while pursuing higher and higher levels of abstraction in computing solutions. The latter is being driven by more complex demands, e.g., solving bigger and more complicated problem sets than before and developing simulators and applications with more multiphysics considerations.

The Fortran standard, and the language as a result, risk becoming irrelevant.

What irks those in positions of influence and power around me is the sheer intransigence and reluctance around Fortran, and the inconsistency of the standard when it comes to grabbing “low-hanging” fruit: achieve better type safety in the practice of Fortran, and either include more intrinsic facilities in the language or make it much easier to develop more performant and full-featured library solutions using the language.

There is no coherent vision behind the standard revisions; it all comes across as ad hoc. There are then circular, catch-22-type considerations in certain domains when it comes to the question in the original post: “is the Fortran standard crucial for the future of the Fortran language?”

3 Likes

If a hypothetical “New Fortran” were to suddenly start up, and it had the same openness and backing for compiler and tooling development as we see in, for instance, Rust and Go, then I would have no problem jumping ship from ISO Fortran. Hence I voted No, but this scenario seems highly unlikely.

5 Likes

Yes, but in my opinion, changes need to happen at a faster pace. A 5-10 year gap between standards is too much for today. Others have frequently suggested in this forum, and I concur, that a 3-year schedule would be ideal.

6 Likes

I agree that some common spec of the language (~ “standard”) will be very important when multiple compilers are developed by different groups, but I am afraid the “standardization-first” approach, in which the “final” spec (= ISO standard) is written directly, is very time-consuming and probably error-prone as well (because the spec is determined by “thought experiment” only). I guess this is partly why some functionality takes so much time to be implemented. A more serious thing is that even if the final, published spec based on “thought experiment” turns out to have design issues later (via implementation or user experience), there is no way to modify it or easily deprecate it (because of “backward compatibility”). Old examples may include “implicit save”, and a more recent example might be enum. So, although I think some common standard will be useful and important once the specs are well established, the “standardization-first” approach without real implementations, user experience, and feedback seems to have a lot of pitfalls or potential downsides… (So I voted “No” in the poll.)
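
For readers who haven’t been bitten by it, here is a minimal sketch of the “implicit save” pitfall mentioned above: initializing a local variable in its declaration gives it the SAVE attribute, so it is not reset on each call, and that behavior can no longer be changed without breaking old codes.

    program implicit_save_demo
      implicit none
      print *, counter()   ! prints 1
      print *, counter()   ! prints 2, not 1 -- n was implicitly SAVEd
    contains
      integer function counter()
        integer :: n = 0   ! initialized only once, at program start
        n = n + 1
        counter = n
      end function counter
    end program implicit_save_demo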

This situation may be a bit related to the latest issue on the Fortran stdlib?
Should we require a separate fpm package first before standardizing? · Issue #658 · fortran-lang/stdlib · GitHub

Also, depending on the language, experimental or preview implementations seem to precede standardization. I am not familiar with C#, but is it this pattern (according to the version table below)?

In other languages like D and Nim, new functionality is often provided via compiler options, and the developers can gather user experience and feedback for a pretty long time. If it turns out to be stable and useful, the functionality is made “default” in the compiler. (FWIW, D has three compilers (dmd, gdc for GCC, ldc for LLVM), and dmd seems to provide the reference implementation.)

1 Like

The Intel Fortran compiler: here it clearly states that the classic compiler supports the 2018 standard, while the newest one doesn’t yet fully support 2018, coarrays included (i.e., it’s a work in progress).

I happened to be researching computer algebra algorithms and implementations this weekend, and some of it has relevance to the original question.

Early work in symbolic computation was done in “high-level” languages such as Lisp, or via the invention of a DSL, a domain-specific language (e.g., REDUCE, Axiom, etc.).

At the edges of mathematical and scientific research, the computational requirements for truly productive use of CAS algorithms have been constrained by limitations of the DSLs, which lead to programming errors that are hard to track down.

This paper (PDF) describes some of the problems with computer algebra systems experienced by research physicists at Johannes-Gutenberg University, leading them to take the opposite approach to symbolic computing by extending an already standardized computer language (C++), and creating a library (GiNaC) that evaluates symbolic expressions. Their concept blurs the distinction between “symbolic” and “numerical” computation.

The GiNaC library is now the backbone for SageMath. My thought is: why can’t Modern Fortran be a contender in this area? The very first Prolog implementation (for exploring symbolic computation) was completed in Fortran.

More info on the use of C++ for symbolic computation can be found at
https://www.ginac.de/.

Some thoughts on the problems of new languages without a spec can be found in this post that discusses Rust: Rust: Move Fast and Break Things

1 Like

I think a standard is crucial. But the specific ISO one…maybe not? It seems that the whole process, with two committees, international balloting, designing features before any compiler has implemented them or anyone has tried them, volunteers generating non-free documents, version control by uploading new ASCII text documents to an FTP server, etc., leaves much to be desired. Would Fortran be better served by a better process? Or some other way to generate the standard?

9 Likes

I don’t really find that reference relevant for the discussion on the need for a formal (ISO type) language specification. Mostly he seems to be angry because his hardware architecture isn’t supported by Rust yet. He goes on to ramble a bit about breaking changes in Rust libraries, but gives no examples of which libraries cause problems. It’s easy to critique breaking changes in other language ecosystems when you write C/C++/Fortran because you hardly have any ecosystem to pick from. He posts no claims about breaking changes in the Rust language due to the lack of a complete specification document.

There is a case study of a cryptography library (initially done in C) being re-implemented in Rust, with a lengthy description of the problems that occurred. Gavin Howard: Rust, Zig, and the Futility of Replacing C.

The initial article discussed compilation toolchain complexity, which is a valid point IMO. This used to be a problem with C++, but after 30+ years things have settled down a bit.

The GiNaC developers complained about DSL changes in various versions of Maple leading to many wasted hours tracking down errors due to dynamic typing. Their requirements called for a standardized language to extend with algebraic capabilities, if you read the link to the paper (published in the Journal of Symbolic Computation).

I too have asked for some symbolic computation functionality in the past, but now that I think about it, I feel like maybe Fortran should do what it does best, that is, numerical computation. What we do need, however, is some sort of intrinsic equation parser to smooth the transition from a CAS like SymPy or Maxima to Fortran.

That being said, clicking on the link you provided for the GiNaC library, the website title is very clearly “GiNaC is Not a CAS”…

The name is analogous to GNU (“GNU’s Not Unix”). The paper at the bottom of the page (linked above) gives a rationale for the library approach to symbolic computation.

Quoting the post above:

    I too have asked for some symbolic computation functionality in the past, but now that I think about it, I feel like maybe Fortran should do what it does best, that is, numerical computation.

In practice, the distinction between “symbolic” and “numerical” computation is not clear-cut. The GiNaC library was used to embed symbolic capabilities in a “numerical” language like C++, to evaluate integrals symbolically before implementing a numerical approach.

I would like Fortran to be an entry point into combining mathematics with computing, science, and engineering. C and C++ are also important languages from a practical standpoint and will need to be learned, but Fortran is a much simpler language to work in, and more efficient (and stable) than Python.

The critical data structure for symbolic computation is the abstract syntax tree. There is really nothing other than tradition (and Metcalfe’s law) preventing the use of Fortran to implement libraries like GiNaC.
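
To back that up with something concrete, here is a minimal sketch (my own illustration, not taken from GiNaC) of an expression tree in modern Fortran; the module and type names are just placeholders. Derived types with pointer components and recursive procedures are all the language needs for the basic data structure:

    module expr_tree
      implicit none
      integer, parameter :: OP_LEAF = 0, OP_ADD = 1, OP_MUL = 2

      ! one node of an abstract syntax tree for arithmetic expressions
      type :: node
        integer :: op = OP_LEAF
        real :: value = 0.0                ! used only when op == OP_LEAF
        type(node), pointer :: left => null(), right => null()
      end type node
    contains
      ! evaluate the tree numerically by recursive descent
      recursive function eval(t) result(r)
        type(node), intent(in) :: t
        real :: r
        select case (t%op)
        case (OP_ADD)
          r = eval(t%left) + eval(t%right)
        case (OP_MUL)
          r = eval(t%left) * eval(t%right)
        case default
          r = t%value
        end select
      end function eval
    end module expr_tree

A real symbolic library would of course add simplification, differentiation, and much more on top of this, but the data structure itself is unremarkable in modern Fortran.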

The role of Fortran in the development of Prolog is interesting from the perspective of unifying symbolic with numerical programming.

1 Like