@ivanpribec also sent me that paper:
Marcin Paprzycki and Janusz Zalewski. 1997. Parallel computing in Ada: an overview and critique. Ada Lett. XVII, 2 (March/April 1997), 55–62. doi:10.1145/249100.249112
The Ada 83 standard already offered parallel features. But it seems each language follows its own destiny, which stems from its original goals. In the case of Ada, those are real-time systems, reliability, and safety.
In a paper I wrote in 1985 (https://cds.cern.ch/record/1049891/files/dd-84-7.pdf) I saw Ada as a serious contender to displace Fortran.
Mike
From “We got around three” by Raymond Chen (Microsoft):
That reminded me of a story back from the days when Microsoft sold Fortran PowerStation, the development suite for the scientific programming language. A developer from the Windows 95 team decided to try a change of pace and switched into sales and marketing, and he wound up working on Fortran PowerStation.
One of his attempts to drum up interest was to include a reply card in a periodic mailing to subscribers of some general computer programming magazine. This was back in the days when people subscribed to computer programming magazines, like, in print. Also back in the days when the publishers would send out little playing-card-deck-sized blocks of business reply cards for various products.
As he explained it to us, “We sent out around ten thousand cards. They say that a typical response rate for a direct mailing is four to five percent. We got three.”
Okay, three percent is a bit low, but given that Fortran is not exactly an up-and-coming product, it’s still quite respectable.
“No, not three percent. Three responses.”
I found this tweet, with a video of Bjarne Stroustrup, creator of C++, talking about Fortran, worth sharing here. Thanks to Discourse being able to embed other pages, you just need to press the play button.
The full interview with Bjarne and Lex Fridman can be found here: https://www.youtube.com/watch?v=uTxRF5ag27A
A minor point in Fortran’s favor is that since it is a distinctive name and not a regular English word, searching “Fortran” is much less likely to give extraneous results than searching, say, “C”. (I fear that although the next Fortran standard will allow functions to be declared `simple` (a subset of `pure`), searching for info about “Fortran simple function” will not give relevant information.)
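For the curious, here is a minimal sketch of what such a `simple` function looks like, assuming a compiler with Fortran 2023 support (still rare at the time of writing); the module and names are my own invention:

```fortran
! Minimal sketch of a Fortran 2023 "simple" function. A simple procedure
! is pure and, in addition, may not reference data by host or use
! association: it may touch only its arguments and local variables.
module demo_simple
contains
   simple function square(x) result(y)
      real, intent(in) :: x
      real :: y
      y = x*x
   end function square
end module demo_simple

program test_simple
   use demo_simple
   implicit none
   print *, square(3.0)   ! prints 9.000000
end program test_simple
```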
Putting together the [Top Programming Languages list of IEEE] has also made one other aspect of programming languages clear to us: Computer languages have terrible names.
Things started out so well with Fortran and Cobol—brief yet euphonious names rooted in descriptors of the language’s purpose: formula translator, business language. Sadly, by the late 1960s, the rot had set in. BCPL arrived, its name a brute acronym for Basic Combined Programming Language, four words that conspire to give no information about the nature of the language or its purpose. BCPL begat B. And B begat C. C itself is a staggering accomplishment, a milestone on every timeline of computing. But its name must be considered a stain on its incredible legacy.
For C begat the even greater nominative monstrosity of C++. This made it acceptable to incorporate symbols, a tradition continued with names like C# and F#. But perhaps even worse is the alternate fashion of just using common nouns as names, for example, Rust, Ruby, and Scheme. Some forgiveness can be given for a borrowed name that’s unlikely to cause a semantic collision in normal use, such as Python or Lisp. But there can be none for such abominations as Processing or Go. These are words so often used in computing contexts that not even a regex match pattern written by God could disambiguate all the indexing and search collisions.
Consequently, some of the metrics that compose the TPL require many hours of handwork to clean up the data (hence our strong feelings). Some languages have their signal so swamped by semantic collisions that their popularity is likely being underestimated.
Fortran: Still Compiling After All These Years
By Doug Eadline
HPCWire
September 20, 2023
A recent article appearing in EDN (Electrical Design News) points out that on this day, September 20, 1954, the first Fortran program ran on a mainframe computer. Originally developed by IBM, Fortran (or FORmula TRANslation System) was designed for scientific and engineering applications and subsequently came to dominate scientific computing. It has been used for nearly seven decades in computationally intensive areas such as numerical weather prediction, finite element analysis, computational fluid dynamics, geophysics, computational physics, crystallography, and computational chemistry.
…
David Chase is a Go compiler developer and he gives his “unpopular opinion” here:
You can read the full transcript here: A deep dive into Go's stack with Yarden Laifenfeld & David Chase (Go Time #288) |> Changelog
I did not exactly understand the problem he addresses in Go, but it deals with slices.
By the way, he said he interned for John Backus and had Dr. Fortran as his advisor.
You can also read transcripts of other podcasts where Fortran is cited: Search results for fortran |> Changelog
He was referring to the default aliasing rules. By default Fortran assumes that dummy arguments to subprograms don’t alias. An example of this in action can be seen here: C Is Not a Low-level Language - #25 by ivanpribec
One may run into aliasing inhibiting optimization when using pointers in Fortran. See this topic for some guidelines on what can be done in that case: Fortran pointers, and aliasing
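To make the rule concrete, here is a minimal sketch (my own example, not taken from the linked topics). The compiler is entitled to optimize `add` as if `a` and `b` never overlap, which is why the second call below is non-conforming:

```fortran
module alias_demo
contains
   subroutine add(a, b)
      real, intent(inout) :: a(:)
      real, intent(in)    :: b(:)
      ! Fortran lets the compiler vectorize or reorder this statement
      ! on the assumption that a and b are distinct objects.
      a = a + b
   end subroutine add
end module alias_demo

program main
   use alias_demo
   implicit none
   real :: x(4) = [1.0, 2.0, 3.0, 4.0]
   real :: y(4) = [1.0, 1.0, 1.0, 1.0]
   call add(x, y)   ! fine: the dummies reference distinct arrays
   call add(x, x)   ! non-conforming: both dummies are associated with x,
                    ! and one of them is modified, so anything may happen
   print *, x
end program main
```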
Clearly, he did not mean me!
William H. Press, co-author of the Numerical Recipes series of books, originally in Fortran 77 and Pascal, has written an autobiography, More Than Curious: A Science Memoir, which can be downloaded. One chapter discusses the writing of Numerical Recipes.
Numerical Recipes in Fortran-90 was published with much fanfare in 1996. And it flopped. Over five years, it sold only ten thousand copies, a success by academic standards, but far from our dream of bringing parallel thinking to the masses. With a nod to Phar Lap, we had bet on the wrong horse. Danny Hillis’s certainty that programmers could not master the complexity of MIMD—lots of processors doing independent unsynchronized operations—was not quite right. MIMD was beyond most programmers, but the few who could master it could and did write libraries of reusable code that hid most of the details. These Message Passing Interface libraries were ugly. You didn’t really have to “think parallel” in any elegant way. You just had to chop up your problem into a bunch of little “threads” and send them to the MPI.
Data parallelism never completely died. Many aspects live on inside today’s GPUs—graphics processing units. SIMD might even someday return in processor chips with very long register lengths. But MPI became the dominant paradigm for the next twenty years. Thinking Machines Corporation went bankrupt. Sometimes the prettiest horse doesn’t win the race. Numerical Recipes in Fortran-90 has some of the best code I ever wrote. C.U.P. keeps it in print, but barely: In 2020, it sold exactly twenty copies. Jill Mesirov moved to MIT’s Whitehead Institute, later the Broad Institute. She was later my colleague on the IDA board.
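As a minimal illustration of the style Press describes (chop the problem into independent pieces and let the library hide the messaging), here is a sketch using the `mpi_f08` module; the program and its names are mine, not from the memoir:

```fortran
! Each rank sums its own strided slice of 1..1000 independently, then a
! single MPI_Reduce call combines the pieces on rank 0.
program mpi_sum
   use mpi_f08
   implicit none
   integer :: rank, nprocs, i
   real :: partial, total

   call MPI_Init()
   call MPI_Comm_rank(MPI_COMM_WORLD, rank)
   call MPI_Comm_size(MPI_COMM_WORLD, nprocs)

   partial = 0.0
   do i = rank + 1, 1000, nprocs
      partial = partial + real(i)
   end do

   call MPI_Reduce(partial, total, 1, MPI_REAL, MPI_SUM, 0, MPI_COMM_WORLD)
   if (rank == 0) print *, 'sum =', total   ! prints 500500.0

   call MPI_Finalize()
end program mpi_sum
```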
The Numerical Recipes (NR) website doesn’t even offer the Fortran versions for sale anymore, so that will definitely hurt sales.
I wonder how much the success of NR was limited by the licenses placed on the code. On the Numerical Recipes website, they even suggest alternatives for finding code that can be distributed as source since, as they say, NR cannot:
When you need mathematical software that can be freely redistributed as source code (which Numerical Recipes can’t), a good place to look is Netlib. Within Netlib, the SLATEC package offers a comprehensive collection of (alas, Fortran only) source code.
At this point, why use the NR algorithms if you run the risk of a license violation by using any of their code (which is likely, since you are learning from their code)?
I really am just wondering aloud on this, and I have no insight into the real reasons why MIMD didn’t take off.
We used Numerical Recipes (NR) as an auxiliary textbook in an excellent course by Prof. Simon Širca (this was before his own book on Computational Methods in Physics was published). I vaguely recall him characterizing NR as “great book, terrible code”. The `amoeba` routine is my favorite one. I tend to revisit the book here and there.
I would have three comments about this. One is that the book Numerical Recipes in Fortran-90 was not a complete book; it only contained the f90 code. One still needed the previous version of the book to understand the algorithms. From my perspective, the most important parts of the books were the algorithm discussions, so publishing a book with just code was the wrong approach. I recommended the f77 version of the book to many people, too many to even guess. I never recommended the f90 version of the book to anyone.
My second comment is that the newer versions of the books, such as Numerical Recipes in C++, had new material, new algorithms, new discussions, etc. There was never a comparable Fortran version published. I felt betrayed by that.
This brings me to the third comment. F90 was miles beyond f77 in its capability, but it still lacked many necessary and useful features. These were mostly filled in by subsequent releases, particularly f2003. If they had continued to publish updated versions of the Fortran book, adopting the useful new features of the language and keeping pace with the new algorithms published in the C++ book, then I expect they would have sold more than 20 copies.
I’m really sorry to say this, but truth is often ugly, and in this case it certainly is. Numerical Recipes (whatever the language version) is not a book I would recommend to anyone. I’m not going to question the fact that the authors are great scientists, of course. But a good scientist is not necessarily a good book author. With all due respect, especially for S. L. Shapiro and S. A. Teukolsky (who wrote papers I used a lot in the past), their book suffers from an identity crisis. It tries to aim for everything and, as is always the case when someone does that, it fails at everything.
It tries to be an educational book, but explains close to nothing about the numerical methods used. It tries to be a recipe book, but only offers rather elementary routines, definitely not suitable for “real-world” problems. This would be great if their purpose were to introduce the reader to Numerical Analysis, but then said methods should be explained in a reasonable amount of detail, which they aren’t.
The end result was a book that was used by students for one purpose only: to copy subroutines verbatim and add them to their code, knowing nothing about the methods used. The fact that they ditched Fortran in the last edition (in favor of… C++, for devil’s sake) should be the last nail in the coffin for NR. Unsurprisingly, I’ve heard several people call that one “the bible”. Give me a break…
Edit: Nothing is all white, nothing is all black, and NR is no exception. The book does include plenty of good advice, often given with a sense of humor. I wish I could say more positive things about NR than just that. But I can think of many Numerical Analysis books much better than this one.
I actually used the NR book (the F77 version) quite often back in the day, but almost never copied a routine (maybe 2 or 3). I saw it more as a convenient source of information about algorithms in various domains (I would rather name it “Algorithmic Recipes”, by the way). I admit that most of the time the explanations are kind of superficial and definitely not at expert level, but taken as it is, I found it useful. I also remember a controversy in which some numerical analysis experts pointed out issues in the book.
I also had the f90 version, but let’s say I was not at all convinced by their approach to f90.
I always wondered why the ZX Spectrum, very popular in Europe back in the ’80s, didn’t have a Fortran 77 compiler, given it was Zilog Z80-based (the CPU on which CP/M was native). It turns out there was a Fortran compiler; it’s just that nobody knew about it. That’s probably because the computer was quickly pigeonholed as a gaming machine, despite the fact it had weak graphics and even worse sound capabilities. “Uncle” Clive Sinclair himself hated the fact that the ZX Spectrum was used mainly for games.
Definitely a rare find, worth preserving. And that is said by a C64 user who never really liked the “Speccy”. Thank you @vmagnin for sharing this.
The compiler is here:
In the `Documentation` directory you will find the user manual (only 15 pages!). The first sentence says it is based on FORTRAN 77S, a subset of FORTRAN 77. You will also find nice pictures of the cassette.
My first home computer was a Dragon 32, still in my parents’ attic. I don’t know if it still runs; one of these days, I will try… I don’t know if there was a FORTRAN compiler for it. I was not aware of FORTRAN in those days: BASIC was King (and assembler was Prince…). The 6809 CPU was managing the video output, but with a POKE you could deactivate the video, and the CPU then computed twice as fast! (But you saw only snow on the TV screen.) In the typical video mode, you had four colors (my first colormap :-)).
And more information on MIRA FORTRAN:
Integer arithmetic proved to be fast — between 30 and 100 times quicker than BASIC and similar in speed to Mcoder and Softek’s IS integer BASIC compilers.
Floating-point number-crunching was not very fast. One demonstration routine searched for prime numbers, digging them out at a rate of about two a second. The method proved to be very inefficient, but the compiled code was only about twice as fast as equivalent ZX BASIC, or a fifth quicker than BASIC compiled with Mcoder 3.
(I don’t know why they talk about prime numbers when discussing floating-point performance…)
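A purely speculative guess: the demo may have carried out the whole search in REAL arithmetic, testing divisibility with floating-point division, which would indeed exercise the floating-point routines. A hypothetical sketch of that kind of test, in modern Fortran rather than FORTRAN 77S (my reconstruction, not the actual MIRA demo):

```fortran
! Hypothetical reconstruction of a prime search done entirely in REAL
! arithmetic, the kind of code that would stress an 8-bit machine's
! floating-point library.
program primes_real
   implicit none
   real :: n, d
   logical :: is_prime
   integer :: nprimes

   nprimes = 0
   n = 2.0
   do while (n <= 100.0)
      is_prime = .true.
      d = 2.0
      do while (d*d <= n)
         ! Divisibility test via floating-point division:
         ! n is divisible by d when n/d has no fractional part.
         if (n/d == aint(n/d)) then
            is_prime = .false.
            exit
         end if
         d = d + 1.0
      end do
      if (is_prime) nprimes = nprimes + 1
      n = n + 1.0
   end do
   print *, nprimes, 'primes up to 100'   ! prints 25
end program primes_real
```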