Is the Fortran Standard crucial for the future of the Fortran language?

Hmm, maybe if we improved Fortran’s C/C++ binding we wouldn’t have to reinvent the wheel. We could just use the C/C++ libraries we want and boost Fortran’s capabilities that way?
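For what it’s worth, standard Fortran already has decent C interop through the iso_c_binding module (C++ would need an extern "C" shim). A minimal sketch, calling C’s cbrt from libm (you may need to link libm explicitly on some systems):

```fortran
module c_math
   use iso_c_binding, only: c_double
   implicit none
   interface
      ! Binds to the C library function: double cbrt(double)
      function cbrt(x) bind(c, name="cbrt") result(r)
         import :: c_double
         real(c_double), value :: x
         real(c_double) :: r
      end function cbrt
   end interface
end module c_math

program use_c_library
   use c_math, only: cbrt
   use iso_c_binding, only: c_double
   implicit none
   print *, cbrt(27.0_c_double)   ! calls C's cbrt from libm; prints 3.0
end program use_c_library
```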

1 Like

That is a short-term solution for working scientists and engineers. But the educational benefit I’m after treats Fortran as the implementation language for increasingly sophisticated DSLs for elementary and undergrad mathematics.

2 Likes

baby steps first 🙂

I recommend using SymEngine (https://github.com/symengine/symengine), a fast symbolic manipulation library written in C++, which has Fortran wrappers at https://github.com/symengine/symengine.f90. It should be faster than GiNaC and other such codes like Mathematica, at least according to our benchmarks a few years ago.

(Disclaimer: I am the original author.)

I spent a lot of time designing the data structures to ensure that is the case. Then I reused and tailored them for a compiler, and that is why LFortran is very fast.

4 Likes

I should get to trying this out 👍

One alternative would be for the community to create an open-source NeoFortran that evolves more freely, unconstrained by the standard and by concerns like backward compatibility, and let the users decide…

I think there are some programming contexts for which strict standards are important, but for most scientific computing they are not: scientists are trying to solve problems and will pick the most convenient tool.

6 Likes

isn’t LFortran gonna be something like this? @certik

1 Like

Indeed, I am hoping LFortran will become exactly such a community compiler.

We are between alpha and beta. If anyone in this thread is interested in helping out, we’ll get to beta and production faster.

3 Likes

Standards are absolutely critical for the future of any language.

How else do we know why code cannot be compiled?

If the code is not valid, it can be fixed. If the code is valid but not supported by that compiler, we can seek an alternative. In this respect, the ACM SIGPLAN Notices were very useful, since they indicated the levels to which each compiler was standard conforming.

Compiler extensions are useful, and contribute to the progress of the language, but we do need a standard.

$0.02.

3 Likes

Strict standards promote program portability between implementations. In languages where there is effectively a single de facto implementation, you don’t need a standard.

I heard that at one time there were over 40 implementations of Fortran. Running the same programs on different implementations was important.

Especially as those 40 dwindled down to a much smaller number.

2 Likes

Strict standards are only valuable if they are good standards. If you have a standard that sucks, people will yolo their own non-standard-compliant versions.

3 Likes

Well, the data on hand today: with several US government agencies, supposedly the only portable option is “F77”, which was neither strict nor all that useful on its own!

Yet they appear to allow Python, which has no standard!

Well, I too wish the Fortran standard was easier to understand and absorb.

And with all the cool things going on with static semantic modeling, it seems like the standard could use some formal rigor in places.

Standards are like money: They don’t bring happiness, but it is nice if you have some.

No, a Fortran standard is not crucial for the future of the language. Back when Fortran 77 was your only option, virtually all compilers provided extensions; some were so common that you could consider them a given, while others were unique to a specific compiler. The language survived that “wild-west” era of F77 extensions. I’m sure, without having personal experience, that Fortran IV compilers also had their own extensions (and some of them made it into F77).

Fortran 90 provided many features that were sorely needed before. One could argue that F90 provided a new, rich “standard” by itself, and that’s the reason Fortran survived. I personally think it would have survived anyway, but F90 definitely helped. Extensions do exist today, and I doubt they will ever cease to exist. You cannot expect a strict, universally adopted standard, ever (or at least anytime soon).

To be clear: I do want a Fortran standard, I just don’t think it is crucial. If there is something crucial for the future of Fortran, it is letting the hordes of misinformed people know that today’s Fortran has nothing to do with what they think Fortran is (other than backwards compatibility). Just yesterday I met yet another scientist who had no idea. The crucial part for the future of the language is to stop the trend of promoting scientific computing in programming languages blatantly inadequate for the task.

3 Likes

Just curious: what features, to you, make a PL inadequate for scientific computing?

I could mention lots of them, although I am not sure this is the right thread to do so. It’s hard to keep it short unless I totally skip the reasoning, but at least I’ll try. This is what I consider essential for scientific computing:

  • Compiled language, with a syntax that helps the compiler optimize the code. Interpreted languages (like Computer Algebra Systems or multi-paradigm environments) are good enough to quickly try a new idea, but once you realize the idea actually works, you usually need to do the rest of the job in a compiled language, because it is typically orders of magnitude faster.
  • A well-designed language that restricts users from doing whatever they want with pointers and other such structures. Pointers are not evil, and they do have their uses, but using pointers for everything (and thereby giving the user the power to do whatever they want with them) is evil. It is a constant memory-leak/segmentation-fault threat, not to mention it makes it harder for the compiler to do its job optimizing the code. I do not want to run valgrind each time the code is modified, simply because the language makes it so easy to create a hole somewhere (especially important when you work with others on the same code).
  • Decent array support. Scientific computing is literally full of matrices everywhere, and even non-scientific applications are full of them (like games, or graphics in general). I don’t want a language that basically has no arrays at all, just a special use of pointers as “arrays”. Nor do I want a language that has arrays, but only half-baked ones, so that you have to reinvent the wheel with templates and operator overloading in order to “teach” said language basic matrix arithmetic. Nor do I want to rely on external libraries that do this simply because the language lacks those features. I would rather do the job in a language that has such essential features built in (see the sketch after this list).
  • Proper decision-making features. For example, a case statement where you have to add break again and again for each case (otherwise execution flow happily falls through to the next case instead of exiting) is, simply put, a broken feature. In 99.9% of cases you want the break to be there and, whenever you don’t, the language should provide other decision-making statements for the task. Scientific computing is software, and software has bugs; the last thing I need is a broken feature that adds another potential, completely unnecessary source of bugs.
  • Any language where the standard way to declare a variable is the new command is out of the question. I won’t even bother explaining why.
  • Object-oriented programming doesn’t simplify many scientific applications but, whenever it does, I want a language that doesn’t let you do whatever you want with class inheritance. That usually creates more problems than it solves and just stands in my way with unnecessary extra code maintenance.
  • In the vast majority of my work I have to deal with complex arithmetic. I don’t want to do that in a language that simply doesn’t know what a complex number is. And, again, I don’t want to patch the problem with an external library that adds these features and may or may not work tomorrow, or be compatible with my compiler version.
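
To make the array, case-statement, and complex-number points concrete, here is a minimal Fortran sketch (the program and names are my own, purely for illustration) showing whole-array arithmetic, a select case with no fall-through, and intrinsic complex support:

```fortran
program builtins_demo
   implicit none
   real, allocatable :: a(:,:), b(:,:)   ! allocatables are freed automatically; no manual memory management
   complex :: z
   integer :: n

   n = 3
   allocate(a(n,n), b(n,n))
   a = 1.0
   b = 2.0*a + transpose(a)        ! whole-array arithmetic; no loops or operator overloading needed

   z = (1.0, -2.0)                 ! complex numbers are intrinsic
   print *, 'z*conjg(z) =', z*conjg(z)

   select case (n)                 ! no break needed; cases never fall through
   case (1)
      print *, 'scalar'
   case (2:)
      print *, 'matrix of order', n
   case default
      print *, 'empty'
   end select
end program builtins_demo
```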

I could mention other reasons as well, but this is getting long already. Also, I deliberately avoided naming any specific language (although some of the reasons above unavoidably point at specific languages). Disagreeing with the above is perfectly fine, but arguing just to defend a language is pointless.

7 Likes

Is the Standard essential?

Probably not. If it were, then Fortran would have died in the interregnum between FORTRAN 77 and Fortran 90.

But it is very useful in that it unambiguously defines how a subset of Fortran constructs should work.

Where the Standard falls short is that it is silent about many commonly used or important legacy extensions.

I am also concerned that ideas may be added to the Standard without rigorous testing. When we were developing a Fortran-like simulation language in the 1980s, our policy for new commands was:

i. Add it to the compiler

ii. Test all of our library of existing programs to see whether it broke anything.

iii. Write test cases to prove that it worked.

To what extent does the Fortran Standard work like this?

John

5 Likes

I think these are all good points.

I wanted to add a couple of comments about arrays (matrices actually) and complex variables.

Some languages do not support these directly, but they allow structures and operator overloading so that the programmer can write code that looks more or less like native language constructs. Here is the problem with that approach, and why it is important to have these common features supported directly in the language. Suppose a programmer defines such a structure and associated operators, and then goes on to write a library of standard mathematical routines based on those structures and operators. Then someone else writes a similar library with a different set of structures and operators. Suppose these libraries do some things similarly, but that they each also have their own subset of functionality, or different algorithm choices that scale differently for different problems.

Now another programmer wants to use those libraries for his problem. He must work with BOTH sets of structures and libraries. Even if he takes the time to determine which routine from which library is best for his application, he must constantly convert his data from one set of defined structures to the other in order to work with the appropriate library at the right time. If there are two libraries, he probably needs to write conversion routines back and forth between those two conventions. If there are 3 libraries, there will be 3 sets of such interconversions. If there are N libraries, there will be (N*(N-1))/2 such sets of interconversions. If he doesn’t want to write conversion routines, then he must write mixed-structure operators, again with (N*(N-1))/2 possible sets required.

In contrast, if the language supports those data types directly, then all of that effort is unnecessary. They can all use the same arrays (vectors, matrices, tensors) and the same complex numbers.
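
A quick Fortran sketch of that contrast (the two “libraries” and their routine names are hypothetical): because both modules below operate on plain intrinsic arrays, their routines compose directly, with no conversion layer between them:

```fortran
module lib_a            ! hypothetical library A, using plain Fortran arrays
   implicit none
contains
   function scale2(m) result(r)
      real, intent(in) :: m(:,:)
      real :: r(size(m,1), size(m,2))
      r = 2.0*m
   end function scale2
end module lib_a

module lib_b            ! hypothetical library B, written independently of A
   implicit none
contains
   function add_transpose(m) result(r)
      real, intent(in) :: m(:,:)
      real :: r(size(m,1), size(m,2))
      r = m + transpose(m)
   end function add_transpose
end module lib_b

program interop
   use lib_a
   use lib_b
   implicit none
   real :: x(2,2)
   x = reshape([1.0, 2.0, 3.0, 4.0], [2,2])
   ! Both libraries share the intrinsic array type, so results chain
   ! directly: none of the (N*(N-1))/2 conversion routines are needed.
   print *, add_transpose(scale2(x))
end program interop
```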

2 Likes

This is exactly why I think languages without flexible generics systems aren’t suitable for scientific computing. Many modern optimization and solving methods want accurate derivatives, so being able to easily pass dual numbers to your functions is very useful. Similarly, being able to carry out algorithms in double-double arithmetic and in split-complex numbers is very useful for extra precision and for interval arithmetic, respectively.
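
For instance, here is a minimal forward-mode dual-number sketch in Fortran (my own illustration, not from any particular library): a derived type plus a couple of overloaded operators, after which evaluating f on a dual argument yields f(x) and f'(x) together:

```fortran
module duals
   implicit none
   type :: dual
      real :: val   ! function value
      real :: der   ! derivative carried alongside
   end type dual
   interface operator(+)
      module procedure add_dd
   end interface
   interface operator(*)
      module procedure mul_dd
   end interface
contains
   elemental function add_dd(a, b) result(c)
      type(dual), intent(in) :: a, b
      type(dual) :: c
      c = dual(a%val + b%val, a%der + b%der)
   end function add_dd
   elemental function mul_dd(a, b) result(c)
      type(dual), intent(in) :: a, b
      type(dual) :: c
      c = dual(a%val*b%val, a%val*b%der + a%der*b%val)  ! product rule
   end function mul_dd
end module duals

program forward_ad
   use duals
   implicit none
   type(dual) :: x, y
   x = dual(3.0, 1.0)   ! seed: d(x)/dx = 1
   y = x*x + x          ! f(x) = x**2 + x
   print *, 'f(3) =', y%val, "f'(3) =", y%der   ! 12.0 and 7.0
end program forward_ad
```

A function written generically over a numeric type could then be evaluated on dual inputs unchanged, which is the point about generics above.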

1 Like

This is what I called “re-inventing the wheel”. You described the issues in much more detail. Even if someone doesn’t care about the trouble and wants to go that way no matter what, it will never be the same as a language with those features built in.

And yet people do want to go that way anyway. I see more and more work written in languages that are far from a good choice for scientific computing.

I have seen a series of books (which I won’t name) with program code for Numerical Analysis going that way. I never liked those books, but many students considered them “The Bible”. At least the code was written in Fortran 77 in the older editions, then updated to Fortran 90 (among other choices). The latest edition was available only in a… very popular language nowadays, one clearly not designed for Numerical Analysis, no matter how you stretch the term. When I saw that I couldn’t believe it was happening… what would a student conclude after that? And the books were written by respected scientists… I am guessing they just did what sells more.

And it gets even worse. I have seen people wanting to implement all the Numerical Analysis code for their PhD in a Computer Algebra System. The initial code I saw worked, but it was slow. When I humbly commented that Fortran (or any compiled language, really) would do the job much better, and without being a “black box” of numerical codes, the response was one I have heard many times since: “today’s computers are fast enough, I don’t need to learn a compiled language”.

So, again, the crucial part for the future of Fortran is to let all those people know there is a better option. I think most of them are doing it wrong because they are following a trend, and they never bothered to question that trend.

3 Likes