Eliminate implicit mapping

I don’t have much to add beyond what I have said earlier on this implicit typing issue. In short, I place more importance on backward compatibility with legacy code than others in this discussion do.

As for the burden of including implicit none in subroutines and interface blocks now, that argument doesn’t really make sense. If implicit none were the standard, then all variables would need to be explicitly typed. But as I have pointed out before, if all variables are explicitly typed, then it doesn’t matter whether implicit none is there or what the default implicit mapping is; the codes will compile the same either way. Implicit none is more for code development than for the final product.
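As a minimal sketch of that point (the routine itself is just an illustration): with every entity explicitly declared, the implicit none line can be present or absent and the compiled result is the same.

subroutine axpy(n, a, x, y)
   implicit none                 ! removing this line changes nothing here, since nothing is typed implicitly
   integer, intent(in) :: n
   real, intent(in)    :: a, x(n)
   real, intent(inout) :: y(n)
   integer :: i
   do i = 1, n
      y(i) = a*x(i) + y(i)
   end do
end subroutine axpy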

Regarding the f77 subset comments above, I did use several fortran compilers in the early 1980s that did not have double precision. The FPS and Cray fortran compilers were two of them. The FPS compiler was otherwise a pretty good f77 compiler. Instead of a full compiler manual, they gave their customers a copy of the ANSI standard document along with a smaller manual of limitations and extensions. My memory might be fuzzy, but I think it did support complex, so it was perhaps best described as somewhere between the subset and the full f77 language. The Cray compiler was not f77 in the 1970s and early 1980s. It supported dozens of extensions, mostly high-level versions of the Cray vector instructions to make efficient compilation easier. There was no character type and no OPEN statement, for example. Sometime later, about 1989 or so, Cray did eventually have an f77 compiler on their Cray CPUs. I remember it being a little slower than their mature compiler at first, but I used it anyway because it compiled my codes with fewer modifications (which were done with a preprocessor).

2 Likes

First, I don’t know what Cray compilers from the 80s you were using that didn’t have double precision, because I remember that all of the ones I used did, at least on the XMP systems. Second, with implicit typing in place, no amount of explicit typing of variables will catch the r0 (zero) versus ro (oh) kinds of typos. For me this is the major reason I oppose implicit typing. Without something like implicit none you can explicitly type every variable under the sun and still not catch those errors.
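To make that concrete, here is a minimal made-up sketch (the names are only for illustration): every variable the author intended to use is explicitly typed, yet the typo slips through unless implicit none is in effect.

subroutine kinetic_energy(mass, v, energy)
   !implicit none               ! uncommenting this turns the typo below into a compile-time error
   real, intent(in)  :: mass, v
   real, intent(out) :: energy
   real :: r0                   ! "r zero", explicitly typed
   r0 = 0.5*mass*v*v
   energy = ro                  ! "r oh" typo: implicitly typed REAL, never defined, compiles silently
end subroutine kinetic_energy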

1 Like

Yes, I used Cray XMP and YMP machines in the late 80s, and by then the compilers did support 128-bit double precision arithmetic. It was about 20x slower than single precision because it was software emulated and did not use vector instructions, but as you say, it was there. That was about the same time that the first CFT77 compilers were becoming available. Maybe someone else can fill in the details about the existence of double precision (128-bit) on the earlier machines. I remember, for example, single-precision MXV(), MXM(), MXVA(), and MXMA() routines on the Cray, but I do not remember double precision versions. I wrote fortran equivalents of those library routines in the mid 1980s to facilitate porting codes to and from the Cray.

I also remember that you could use compiler flags to get REAL, REAL*4, REAL*8, and DOUBLE PRECISION to all map to 64-bit single precision, or some to single and some to double precision. Of course, those options were there for porting codes from other machines to the Cray. I also remember a time when Cray programmers wrote code almost entirely in lower case. At that time, most fortran code was written in upper case, perhaps with comments in lower case or a few other exceptions, so whenever you saw a bunch of lower-case fortran code, you could be pretty sure it had either been written on a Cray or had been recently ported to a Cray.

The other interesting thing about the Cray computers of that era is that the programmer had to work with three different operating systems, COS, CTSS, and UNICOS. That was remarkable because there were only a few dozen of these machines in existence. The fortran compilers were all similar, and probably shared a lot of basic code, but the underlying operating system resulted in machine dependent code nonetheless. For example, before open statements were available in CFT77, the association of a fortran unit number with an external file was different for all three of those cases. In my codes, I handled that machine dependence with a preprocessor.

This is an almost nonsensical argument. The idea is that people so resistant to change that they are still writing and maintaining ancient FORTRAN 77 spaghetti code with implicit typing are going to totally abandon Fortran and rewrite everything because they might have to add a single command-line argument to their build process? Seriously? Does that make any sense? It does not to me.

I could say more here, but it’s all already been said. Very disappointing, but not surprising. It doesn’t seem like we can look to the Fortran Committee for any bold vision to rejuvenate Fortran. We can do a lot (and have been doing a lot) in building a community, ecosystem, and tools, but without the same vision for updating the language itself, it’s going to be very difficult.

4 Likes

I don’t personally care one way or another about default implicit mapping, but the tone of the response sucks. Sorry on your behalf.

None of my compilers have a “-u” option. I wonder which one I should buy to become one of the referenced “customers”?

3 Likes

It has been mentioned that the fpm package manager can enforce IMPLICIT NONE as the default if the contributors want to do that. This would “lead the way”, and if it works well, then there would be evidence that the language standard could be changed.
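For reference, and going from memory of the fpm manifest options (so treat the exact keys as an assumption to check against the fpm documentation), the control looks something like the following fragment of fpm.toml; as far as I recall, implicit-typing = false is already fpm’s default.

[fortran]
implicit-typing = false      # have fpm pass the compiler's "implicit none everywhere" flag
implicit-external = false    # likewise disallow implicitly interfaced external procedures
source-form = "free"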

That is the way many changes have been made in the past in fortran (and other languages). Someone implements something, it gets used by programmers, and they can then voice their opinion.

So here is a suggestion. Why don’t you make fpm work that way for some period of time, say six months or a year? Then change the default to be consistent with standard fortran for a similar amount of time. Then we can see who complains the most: those who use legacy codes and must spend extra effort to build them, or those writing new codes who want portability within the language standard and must spend extra effort to make their codes work.

So there is the challenge.

Do it, gather statistics, and see what happens. See which group cries the loudest. And then step back and look to see how much all of that turmoil has helped or hurt the language. I think I know the answer to that, but why not look at actual data instead of just opinions?

2 Likes

🙂 It took me a while to figure out what that meant. Anyway, here is the link where “-u” is applicable, if you want to procure a license.

While the need for implicit none and the implicit save on initialization are a bit quirky, they do not really stop people from getting things done once they know about these quirks.
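For anyone who has not run into the second quirk, a minimal sketch: initializing a local variable in its declaration implies the SAVE attribute, so the value persists across calls rather than being reset each time.

subroutine count_calls()
   implicit none
   integer :: ncalls = 0              ! initialization implies SAVE: not reset on each call
   ncalls = ncalls + 1
   print *, 'call number', ncalls     ! prints 1, 2, 3, ... on successive calls
end subroutine count_calls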

So I think deprecating them, and maybe issuing a warning by default, may already be sufficient (disclaimer: I have neither been involved in the design of compilers nor talked with a large number of stakeholders, so I am aware that my competence is below average in this forum), and the discussions should not become too heated.

I think things like the lack of real generic programming, a standard library, or namespace-like structures may be more important in deciding for or against a language for a given solution.

I am aware of and thankful for the many people in this forum who address these issues, and I think that from a user’s perspective the choice of good (i.e., fast, stable, and largely standard-conforming) compilers (often free) as well as libraries has never been better.

This is not accurate. The quirks do indeed delay or prevent the evolution of the Fortran language, and diminish its adoption and that of its ecosystem, in quite a few domains.

First, consider this comment on another, unrelated thread, where you will notice that a Fortran binding to another library has been nearly 3 years in the waiting and is still not completed.

The need for implicit none in interface bodies is one among quite a few factors that delay or prevent the development of C/Fortran bindings for quite a few useful libraries, but it is a crucial one nonetheless. It adds to the pain, and the need for implicit none everywhere tends to become the “final straw”: bindings that would make it possible to consume other solutions in Fortran, or to use Fortran solutions in other applications, get put off indefinitely.
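A hypothetical binding sketch (the library and names are made up) of why interface bodies are where this bites: each interface body is its own scoping unit, so a module-level implicit none does not reach into it and has to be repeated, otherwise any mistyped dummy name is silently given an implicit type.

module c_binding_sketch
   use, intrinsic :: iso_c_binding, only: c_int, c_double
   implicit none                   ! does NOT carry into the interface body below
   interface
      function c_norm(n, x) bind(c, name="c_norm") result(r)
         import :: c_int, c_double
         implicit none             ! must be repeated in every interface body
         integer(c_int), value :: n
         real(c_double), intent(in) :: x(n)
         real(c_double) :: r
      end function c_norm
   end interface
end module c_binding_sketch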

Facilities or features delayed are facilities or features denied, period.

A few developers do end up authoring interfaces without implicit none and get away with the lurking bugs and errors for a while, but circumstances change, and with no standard way to address the semantics, the issues surface at the most inopportune time. kernel32.f90 by Intel, shipped as part of Intel oneAPI for Windows OS, is but one example of this: users of Windows OS who have Intel Fortran can look in the <C:\Program Files (x86)\IntelSWTools\compilers_and_libraries\windows\compiler\include> folder of their installation for it. This module has no IMPLICIT statement; over the years we have had issues with it, though most were unrelated to implicit mapping. But several teams started rolling their own, without IMPLICIT NONE, and did run into errors. Over time, a myriad of problems led those teams to migrate away from Fortran, and the “bad experience” with implicit mapping stayed with them. Bindings to the Windows OS APIs are just one case; there are many others that are dormant or have been discontinued. It all makes the application of Fortran rather unproductive.

The fact is “implicit mapping” in Fortran is

  1. inadequate as a feature,
  2. highly bug-prone as a practice, and
  3. too harmful to the language’s image, standing, and reputation.

At this stage, it adds nothing useful to the language, it only hurts.

Moreover, the proposal to eliminate implicit mapping as shown in the original post does not require much time or effort to implement, and it does not get in the way of other advances in the language such as Generics; if anything, it will only benefit future extensions of the standard language generally.

Additionally, the proposal allows at least 10 years for legacy codebases to adapt, on top of the nearly 50 years during which it has been known that implicit mapping is not the way to develop codebases.

To eliminate implicit mapping is a “low-hanging fruit”.

It is only the counterarguments in the name of “backwards compatibility”, and the facetious and fictitious problems of breaking legacy code (almost all of which is likely already “broken” with respect to every standard revision starting with ANSI FORTRAN 66), that get “heated”.

2 Likes

Thanks for your detailed explanation. Although I used Fortran for several years, it has only been in a three/four-person team, so I do not really have experience with projects involving large teams and lots of external dependencies.
So my statements come from my experience in one medium-sized, more or less self-contained project.

If we can’t get rid of implicit typing/mapping, can we at least have the ability to put implicit none before any USE statements (as in directly after a subroutine, function, or module statement)? For Linux/Unix users, writing a bash script that applies sed to every file in a directory and inserts implicit none (or your favorite variant of an implicit statement) after every subroutine or function statement it finds is fairly trivial.
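For readers unfamiliar with the ordering rules, here is a small sketch of what the standard currently requires versus what is being asked for here (the routine is made up):

subroutine demo(n)
   ! Today, USE (and, in interface bodies, IMPORT) statements must come first;
   ! the request above is to also allow IMPLICIT NONE to appear immediately
   ! after the SUBROUTINE statement, before any USE.
   use, intrinsic :: iso_fortran_env, only: real64
   implicit none
   integer, intent(in) :: n
   real(real64) :: work(n)
   work = 0.0_real64
end subroutine demo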

2 Likes

Certain Fortran compilers support automatic generation of C prototypes (see my open issue on compilers supporting generation of C function prototypes, which unfortunately received no replies) for Fortran procedures (and interfaces) marked with bind(c). Presumably, this shouldn’t be too hard to implement in a compiler. I guess only a few Fortran teams have reached the level of build automation (and influence) needed to lobby for this.
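To illustrate the kind of code such a feature targets, here is a minimal made-up bind(c) module procedure; a compiler with this capability (gfortran’s -fc-prototypes, for instance) can emit the matching C prototype, something like double dot_c(int n, double *x, double *y);

module dot_mod
   use, intrinsic :: iso_c_binding, only: c_int, c_double
   implicit none
contains
   function dot_c(n, x, y) bind(c, name="dot_c") result(d)
      integer(c_int), value :: n
      real(c_double), intent(in) :: x(n), y(n)
      real(c_double) :: d
      d = sum(x*y)            ! dot product of x and y
   end function dot_c
end module dot_mod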

The other way round, from C functions to Fortran interfaces, is trickier. Personally I’d like to see it solved by means of C++11 and, soon, C23 generalized attribute sequences. Using attribute sequences, the compiler could emit the correct Fortran interfaces (including implicit none) when passed the right compiler flag (e.g., -fc-prototypes).

The Rcpp package for seamless integration of C++ and R already uses attribute sequences to export C++ functions to R. Here’s one example:

#include <Rcpp.h>
using namespace Rcpp;

// [[Rcpp::export]]
double meanC(NumericVector x) {
  int n = x.size();
  double total = 0;

  for(int i = 0; i < n; ++i) {
    total += x[i];
  }
  return total / n;
}

Having this built into the compiler seems like a more robust approach than what third-party tools such as SWIG-Fortran and Shroud offer. I will note that these tools go a step further than just an interoperable C interface: they are also able to wrap C++ classes and other non-interoperable constructs.

Unfortunately, I’m under the impression that the Fortran community is too detached from the C and C++ communities to make this a reality. Perhaps my impression is wrong and there are people actively working on this, which would be of great comfort to me. In a few discussions with fellow C++ colleagues, they were pessimistic about this idea receiving any official C++ support (either symbolic from the committee or practical from vendors). Many programming language communities (this Fortran Discourse included) tend to adopt language-centric and exceptionalist views. There have been dark periods of human history when speakers of particular languages were discriminated against, banned, and even eliminated. Perhaps the title of this thread would be better served by the word deprecate than eliminate. Interestingly, the use of deprecate to describe obsolescent technology is quite recent.

2 Likes

Is this document covering all issues considered for F202Y or just a subset? I read it and am a bit surprised that the committee has not found any single topic to be “highly desirable”. So if these are really all the topics being considered, four years after the last revision was released, the reasons for the committee’s very existence seem weak.

BTW, of moderate and medium desirability, which one is higher?

I believe it is also true that it would get new customers for some other compilers. So as a member of the committee I can say that this statement does not represent me.

3 Likes

I am afraid the answer to this too shall be a thundering “no” from the influential.

And it’s not just USE; there is also IMPORT, which has to precede any IMPLICIT statements. It’s not easy to script a solution, and it gets extremely bug-prone with real codes.

In the past, several managers/leads have asked me to point to codes or documents or papers, etc., from anyone influential with the committee that advise “go ahead, use implicit mapping”, that it is seriously a good way to develop, maintain, and extend Fortran codes. I failed to find any. Can anyone here?

For decades now, one sees the recommendation to use “implicit none” everywhere; the last set of books one team had all advised the same. Naturally, when many other developers complain whenever they need to do anything with Fortran code, management asks: why doesn’t the standard make this recommendation the default?

I suppose another way to look at that paper might be in terms of a tacitly conveyed message that, just as with COBOL, there are codebases out there with only a few people who have the knowledge to maintain them and who can at any time join The Great Resignation.

And that for these valuable few, FORTRAN = “implicit mapping”.

And that a change to the standard to be rid of this is too much of a change: a major “culture shock” on top of everything going on everywhere, with too many rapid shifts and revolutions.

Thus, even if the change to implicit mapping makes no difference to them in actual practice (all their processors will keep supporting their codes by default), the very notion of a change on paper will lead to a crucial exodus.

Funnily enough, anyone and everyone who actually foots the bill for these codebases in one form or another (usually the taxpayer, otherwise the customer) wouldn’t mind at all if those legacy codebases moved past “the 60s”, which arguably represent the era of intense use of implicit mapping in FORTRAN; by the next decade the need for IMPLICIT NONE was recognized, and the MIL standard for FORTRAN had even adopted it by 1978.

Not only does FORTRAN = implicit typing for many, it also equals fixed-format source files. I find it interesting that those who oppose making implicit typing at least obsolescent in the standard choose to ignore the fact that fixed-format source was made obsolescent in the Fortran 95 standard (some 25 years ago now, if I remember correctly). I don’t remember our sun going supernova, Ragnarok did not descend upon Asgard, and the Four Horsemen of the Apocalypse did not go riding through the Fortran community. Unlike Mr. Spock in the Star Trek movie where he saves Kirk’s life but is exposed to lethal radiation, many on the standards committee seem to think that “the needs of the few, or of the one, outweigh the needs of the many”.

1 Like

Great questions and comments.

No, the paper I linked upthread addresses about a third of the 202Y items.

One other review for an additional third of the items is this:

The rest are pending in terms of a documented take on them.

@sblionel, who is also the WG5 convenor, grouped together the items for the subteams to review based on a variety of feedback including those provided by the Community here as collected by @certik in this thread. The currently tabled list is https://j3-fortran.org/doc/year/22/22-176r5.pdf

I simply can’t thank @certik enough for the tremendous Community engagement here and elsewhere, and @sblionel for facilitating, opening up, maintaining, and driving the various communication and feedback channels with the WG5/J3 working groups. These are all invaluable contributions to Fortran.

As to Fortran 202Y, noticeably Generics heads the highly “desired” feature list. Everything else requires considerable convincing and very strong “use cases” to move up the “desirability” index!

I’m curious whether your managers apply the same level of scrutiny to other programming languages too.

If I take Google’s C++ style guide as an example, they mention quite explicitly:

  • Avoid surprising or dangerous constructs. C++ has features that are more surprising or dangerous than one might think at a glance. Some style guide restrictions are in place to prevent falling into these pitfalls. There is a high bar for style guide waivers on such restrictions, because waiving such rules often directly risks compromising program correctness.
  • Avoid constructs that our average C++ programmer would find tricky or hard to maintain. C++ has features that may not be generally appropriate because of the complexity they introduce to the code. In widely used code, it may be more acceptable to use trickier language constructs, because any benefits of more complex implementation are multiplied widely by usage, and the cost in understanding the complexity does not need to be paid again when working with new portions of the codebase. When in doubt, waivers to rules of this type can be sought by asking your project leads. This is specifically important for our codebase because code ownership and team membership changes over time: even if everyone that works with some piece of code currently understands it, such understanding is not guaranteed to hold a few years from now.

I’ve read dozens of articles on the internet about avoiding exceptions, avoiding templates, even about avoiding the standard template library… And yet, countless programming teams continue to use C++. Rightfully so. It’s a great language when used correctly.

That doesn’t mean it’s immune from “legacy” problems too. There was a recent tweet showing a slide from Martin Berzin’s talk at PASC22, which contained the following statement:

Legacy or “clever” C++ is as much of a performance problem as FORTRAN ever was…

You could say we are seeing a revolt against C++ with the appearance of all the new languages aiming to replace it, including D, Rust, and most recently Carbon. I think the true revolt against Fortran occurred back in the era when MATLAB, C, and other high-level languages appeared. In essence, Fortran was a victim of its own success, paving the way for other compiled and dynamic languages.

1 Like

The managers themselves usually don’t have to do much of it, because the workflows themselves support the scrutiny, review, maintenance, upgrades, and revamps in so many ways, even in scientific and technical software. C++ and other C-family codes constantly go through the churn and get refactored and/or retired; things are intense in that way.

Ostensibly there is none of the “over my dead body” holding onto some C++98 feature when C++20 offers better ways to do the same thing or achieve enhanced overall functionality.

Consider the blog by @jacobwilliams on BISLOC in NASTRAN a while ago:
https://degenerateconic.com/binary-search.html

With the other, more popular programming languages among the respondents to the IEEE Spectrum survey, somehow the equivalents of old codes like BISLOC get revised more frequently, and they don’t fall so far behind that someone who looks at them later has the same reaction as @jacobwilliams in the above blog!

One vendor I know had to refactor a lot of their extensive GUI code for engineering calculations when the C++ standard removed auto_ptr a while ago and end-user custom input/output forms became too difficult to support with newer versions of Microsoft Visual Studio. They just used the occasion to make a lot of other changes as well and moved on.

1 Like