Anecdotal Fortran... :-)

Yes, it seems that here “FORTRAN” just means “coding”! Which means the author perceives it as the father of modern coding. The word “Fortran” appears only once in the document, in a paragraph about LISP: “only FORTRAN is older.”
Thanks for finding this document. Apart from some personal thoughts about the Logos, it had never occurred to me to try queries like “Fortran theology”, “Fortran religion”, or “Fortran divinity” (or with any other computing language) in search engines… That could turn up surprising and/or interesting things…

The word “subroutine” may not be used in many other programming languages, but it is definitely part of popular knowledge about computer programming in general, as witnessed by its use in TV series like Star Trek.
(There is another, flimsy, connection between Fortran and theology: I know of a Dutch theologian called Arjan Markus; indeed the difference in nomenclature is small, very small.)

2 Likes

Since single-precision reals are used, the last two digits on every line are suspect. Even on the line with “true”, the last two digits are wrong (75 instead of 83).
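
For anyone puzzled by that, here is a minimal sketch of the effect (my own, not the program under discussion): a default REAL carries only about 7 significant decimal digits, so anything printed beyond that is noise.

program single_precision_digits
    implicit none
    real :: s
    double precision :: d
    s = 4.0 * atan(1.0)        ! pi computed in single precision
    d = 4.0d0 * atan(1.0d0)    ! pi computed in double precision
    print '(a, f18.15)', 'single: ', s   ! digits beyond the ~7th
    print '(a, f18.15)', 'double: ', d   ! no longer match pi
end program single_precision_digits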

The 1980 album Sandinista by The Clash has a song called Silicone on Sapphire that mentions the word “subroutine”. The lyrics otherwise seem to be systems programming technobabble, but I may be missing a deeper meaning if there is one.

Lyrics

Have you ever asked yourself
Who holds the key that winds up Big Ben?
Silicone on Sapphire
Connection
My prerogative is zero
When is your start
What is your data
Databus
Databus
I’m pushing your breakpoints
Anytime Mike
Know my subroutine
Motorola exorsizer
Modem connecting
In sync
Buffer
Handshaking
Throughput
Mnemonic code
I have your sentences right
Go ahead
Macro command
Yes
This is my micro instruction
Improper request
Output failed
Request debug
Improper request
Request debug
System debug freeze
Your memory is volatile
Freeze
Log ? add this is my address bus
Log add
Kill
Kill
?
Rub out
You’re on system interconnect
You are typing into my memory
Shift, shift, shift
That’s better
Now my decoder
I request your zero variable storage
I am a Texas Instrument
Clear, overrun
My zero positive
Truth table
Connection
Give me your input
Vector interrupt
Erase function
Vector interrupt
Go to RAM, Go to RAM
Go yourself
Go to RAM
I take it back?
Your memory is volatile
Your inputs, are deprived
Save, save
Erase bridge?
Go to outputs
Large scale integration
No source statements
Give me, give me flowchart
All died on call? databus
Hardware, firmware
Inhibit, inhibit, overflow
Yes. Hardwired logic. Machine language
Connection deprived by request, request
Parallel operation
Give me push count stack
I must have your address first
Take your datalog recharge
Hello, hello
System debug freeze
Clear restore and exit
Exit all done

3 Likes

Charlotte Froese Fischer (1929-) states on GitHub that her

PhD supervisor was Douglas R. Hartree who assigned to me the problem of solving the “equations with exchange” on the EDSAC computer at Cambridge.

She earned her PhD in 1957. One of her projects, updated 2 years ago, is Object Oriented GRASP, a

general relativistic atomic structure software organized as the processing of objects that is both more readable and manages memory more efficiently. Written in Fortran 2008 or later, it relies extensively on parameterized user-defined data types for defining objects and linked lists for managing memory of lists whose size cannot be determined in advance.
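
For illustration, here is a minimal sketch (my own, not code from GRASP) of the two Fortran 2008 features mentioned there; note that compiler support for parameterized derived types still varies.

module pdt_list_demo
    implicit none

    ! Parameterized derived type: the length parameter npts is fixed
    ! when an object is declared or allocated, not at compile time.
    type :: wavefun(npts)
        integer, len :: npts
        real :: values(npts)
    end type wavefun

    ! Linked-list node: the deferred parameter (:) lets each node
    ! hold a wavefun of a different, run-time-determined size.
    type :: node
        type(wavefun(:)), allocatable :: w
        type(node), pointer :: next => null()
    end type node
end module pdt_list_demo

program demo
    use pdt_list_demo
    implicit none
    type(node), pointer :: head
    allocate (head)
    allocate (wavefun(npts=100) :: head%w)   ! size chosen at run time
    head%w%values = 0.0
    print *, 'stored a grid of size', head%w%npts
end program demo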

I admire someone who stays at the cutting edge of scientific computing for six decades.

7 Likes

Douglas R. Hartree

That’s the Hartree after whom the atomic unit of energy (~27.2 eV) is named, and half the namesake of the Hartree–Fock method, which remains in popular use in quantum chemistry to this day.

Yes, that Hartree. She wrote a biography, Douglas Rayner Hartree: His Life in Science and Computing, and her Reminiscences at the Turn of the Century (a PDF file) mentions him. Below is an excerpt.

Then in 1960, I was offered a summer position at the Boeing Airplane Company, Renton, Washington, where I was allowed to pursue my own research and use their advanced computing facilities. These included not only the latest IBM computer but also a FORTRAN compiler. For the first time, I saw there was a chance for portability of computer programs, and I began to develop numerical methods for atomic structure calculations. The following year, I was offered a summer position by David Layzer at the Harvard College Observatory, where the Smithsonian Institute also had excellent computing facilities that soon included an IBM 7090.

My research has revolved around the rapidly evolving computer technology. Unfortunately, this also meant that research results could soon become obsolete. I remember well that Douglas Hartree, in his last few years, had developed an elegant scheme for getting good initial estimates of wavefunctions so that the self-consistent field method used for solving the Hartree-Fock equations would converge in several fewer iterations. My thesis was related to this problem. But I never implemented any of these ideas into a computer program: by that time the computers were fast enough that a simple, but general, scheme was more convenient, even if the calculation required a few more iterations. From this experience, I learned that it was important to have a long-term view, that the human factor of “ease of use” was extremely important. Early programs often were written to implement a single method, and all too often would terminate with “method failed” when solving nonlinear problems. It was far more productive to have the program try a series of approaches before giving up. In other words, the program should encapsulate my knowledge about the solution of the problem and try a number of alternative methods if the first one failed. This was not always implemented in practice, but was my goal.

3 Likes

And Fortran, Fortran So Far Away by Remy Porter in Feature Articles on 2021-09-29

A surprising amount of the world runs on FORTRAN. That’s not to say that huge quantities of new FORTRAN are getting written, though it’s far from a dead language, but that there are vital libraries written fifty years ago that are still used to this day.

But the world in which that FORTRAN was written and the world in which we live today are wildly different. Which brings us to the story of George and Ike.

In the late 1960s, the company that Ike worked for got a brand-spanking-new CDC 6600 mainframe. At the time, it was the fastest computer you could purchase, with a blistering 3 MFLOPS of performance (3 million floating-point operations per second). The company wanted to hand this off to their developers to do all sorts of fancy numerical simulations in FORTRAN, but there was just one problem: they wanted to write a lot of new programs, and the vendor-supplied compiler took a sadly long time to do its work. As they were internally billing CPU time at $0.10/second, teams were finding it quite expensive to do their work.

(Photo of a CDC 6600, by Jitze Couperus)

Enter Ike. Ike was a genius. Ike saw this problem, and then saw a solution. That solution was 700,000 lines of CDC 6600 assembly language: his own, custom FORTRAN compiler. It used half as much memory, ran many times faster, and could generate amazingly user-friendly error messages. Ike’s compiler became their internal standard.

1 Like

Bloomberg:

The Nobel Physics Prize went to three scientists working on complex systems like climate change. Syukuro Manabe, a meteorologist at Princeton, and Klaus Hasselmann from the Max Planck Institute for Meteorology won half the prize for the physical modelling of Earth’s climate, quantifying variability and predicting global warming. Giorgio Parisi from Rome’s Sapienza University got the other half for discovering the interplay of disorder and fluctuations in physical systems from atomic to planetary scales.

A 1967 paper by Manabe and Wetherald, Thermal Equilibrium of the Atmosphere with a Given Distribution of Relative Humidity, used a model written in Fortran, estimating that “a doubling of the CO2 content in the atmosphere has the effect of raising the temperature of the atmosphere (whose relative humidity is fixed) by about 2°C.” According to this source, current estimates are that doubling CO2 would raise temperatures by 1-2.5°C short term and 1.5-4.5°C long term.

Hasselmann also programmed in Fortran. Parisi has co-authored papers involving GPU computation, but I don’t know which programming language was used.

4 Likes

In 1896, Arrhenius had calculated that a 50% CO2 rise would give a 3.4°C rise:
Svante August Arrhenius, “On the Influence of Carbonic Acid in the Air upon the Temperature of the Ground”, London, Edinburgh, and Dublin Philosophical Magazine and Journal of Science (fifth series), April 1896, vol. 41, pp. 237-275.

Same with COBOL:

A wonderful cartoon:

1 Like

fortran-xkcd

from @interkosmos

An fpm example project written in Fortran 2008 that displays the latest xkcd comic inside an X window. As a limitation, only images in PNG format are supported (no JPEG). The alt text will be printed to the console.

The program depends on X11, libcairo, and libcurl, as well as a handful of Fortran interface libraries.

You can pass the xkcd number to view a specific comic strip:

$ ./xkcd 292


10 Likes

While doing some searching about Fortran today, I found a company named “Fortran Communications”. Does it have any relation to the Fortran we know?

Video of 5th grade children (age 11-12) in Philadelphia learning Fortran (how to compute the volume of a sphere) in 1967. The children seem engaged, one boy asking whether parentheses are needed in

v = 4*pi*r**3/3
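
For the record, the answer is no: ** binds tighter than * and /, which associate left to right, so the expression is already grouped as intended. A minimal sketch, with names assumed rather than taken from the video:

program sphere_volume
    implicit none
    real :: pi, r, v1, v2
    pi = 4.0 * atan(1.0)
    r = 2.0
    v1 = 4*pi*r**3/3              ! as written on the board
    v2 = ((4.0*pi)*(r**3))/3.0    ! fully parenthesized equivalent
    print *, v1, v2               ! both print the same value
    ! Caution: reordering to 4/3*pi*r**3 would be wrong, since 4/3
    ! is integer division and evaluates to 1.
end program sphere_volume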

More than 50 years later, my children are only exposed to Scratch in 8th grade :frowning:

3 Likes

The Simpsons S07E23 (Much Apu About Nothing):

(video clip)

1 Like

That’s a nice find! If only he had used columns 73-80 to sort the punch cards :joy:

An Interview with the Old Man of Floating-Point

Reminiscences elicited from William Kahan by Charles Severance

20 Feb. 1998

This interview underlies an abbreviated version to appear in the March 1998 issue of IEEE Computer.

Introduction

If you were a programmer of floating-point computations on different computers in the 1960s and 1970s, you had to cope with a wide variety of floating-point hardware. Each line of computers supported its own range and precision for its floating-point numbers, and rounded off arithmetic operations in its own peculiar way. While these differences posed annoying problems, a more challenging problem arose from perplexities that a particular arithmetic could throw up. Some of one fast computer’s numbers behaved as non-zeros during comparison and addition but as zeros during multiplication and division; before a variable could be used safely as a divisor it had to be multiplied by 1.0 and then compared with zero. But another fast computer would trap on overflow if certain of its numbers were multiplied by 1.0, although they were not yet so big that they could not grow bigger by addition. (This computer also had nonzero numbers so tiny that dividing them by themselves would overflow.) On another computer, multiplying a number by 1.0 could lop off its last four bits. Most computers could get zero from X - Y although X and Y were different; a few computers could get zero even though X and Y were huge and different.

Arithmetic aberrations like these were not derided as bugs; they were “features” of computers too important commercially for programmers to ignore. Programmers coped by inventing bizarre tricks like inserting an assignment “X = (X + X) - X” into critical spots in a program that would otherwise have delivered grossly inaccurate results on a few aberrant computers. And then there were aberrant compilers…
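
To make those idioms concrete, here is a sketch (my reconstruction, in modern Fortran syntax; the originals would have been FORTRAN IV or assembly). On IEEE 754 hardware both statements are no-ops, which is precisely what the later standardization bought us.

program defensive_idioms
    implicit none
    real :: x, y, q
    x = 1.0e-20
    y = 1.0 * x                 ! "launder" x: on some machines a
    if (y /= 0.0) then          ! nonzero could still behave as zero
        q = 1.0 / y             ! in division unless normalized first
    end if
    x = (x + x) - x             ! scrubbed troublesome low-order bits
                                ! on a few aberrant computers
    print *, x, y, q
end program defensive_idioms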

“Reliable portable numerical software was becoming more expensive to develop than anyone but AT&T and the Pentagon could afford. As microprocessors began to proliferate, so did fears that some day soon nobody would be able to afford it.”

3 Likes

John von Neumann, when he first heard about FORTRAN in 1954, was unimpressed and asked “why would you want more than machine language?”

Sources:
[1] Computing at Columbia Timeline
[2] Why would you want more than machine language? | Deta Blog
[3] References for "The Future of Programming"

4 Likes

John von Neumann didn’t need more than the machine language (“when he was six years old, he could divide two eight-digit numbers in his head”). But the rest of us do.

2 Likes

Perhaps, on reflection, von Neumann had identified an algorithm more efficient for him (e.g., in number of steps, or in amount of working memory) than the approach taken by many others, though his name does not appear (yet) in Wikipedia’s (possibly incomplete) compilation. Maybe there are parallels to the story “What is the sum of all integers in the range 1…100?”, which (so the saying goes) C. F. Gauss solved not by many additions but by a single multiplication: 1 + 2 + … + 100 = 100·101/2 = 5050 (reference).
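
A trivial sketch contrasting the two routes:

program gauss_sum
    implicit none
    integer :: i, s
    s = 0
    do i = 1, 100             ! the many-additions route
        s = s + i
    end do
    print *, s, (100*101)/2   ! both print 5050
end program gauss_sum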

1 Like