Hi, I’m just interested in what most people are using FORTRAN for and I thought it might be fun to talk about.
I use it professionally, mostly for writing CNC machine post-processors, but also for other numerical utilities related to manufacturing. These are basically inverse-kinematics engines that take point vectors and convert them to machine kinematics for anywhere from 2 to 7 axes. Naturally, FORTRAN is the best choice for its portability, easy-to-read/debug code, and very strong legacy in this field.
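To give a flavour of the kind of transform involved, here is a toy sketch (my own, not the actual post-processor code): converting a unit tool-axis vector (i, j, k) into A/C rotary angles for a table-table 5-axis machine. Real posts also handle axis limits, singularities, and angle winding.

```fortran
! Hypothetical sketch: tool-axis vector -> A/C rotary angles.
! Conventions (which axis is A, sign of C) vary by machine.
program tool_vector_to_ac
  implicit none
  integer, parameter :: dp = kind(1.0d0)
  real(dp), parameter :: rad2deg = 57.29577951308232_dp
  real(dp) :: i, j, k, a_deg, c_deg

  ! Example: tool axis tilted 45 degrees in the XZ plane.
  i = sqrt(0.5_dp); j = 0.0_dp; k = sqrt(0.5_dp)

  a_deg = acos(k) * rad2deg        ! tilt away from the spindle (Z) axis
  c_deg = atan2(i, j) * rad2deg    ! rotation about Z (convention-dependent)

  print '(a,2f10.4)', 'A, C = ', a_deg, c_deg
end program tool_vector_to_ac
```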
I have used Fortran for solving partial differential equations or integral equations arising in fluid mechanics, and for plotting their solutions. Also for plotting contours in the complex plane that avoided branch cuts. Also for testing compilers for suspected bugs.
Fortran is used extensively in areas like structural dynamics/shock physics and CFD. In particular, codes like the Sandia CTH hydrocode and the EPIC dynamic finite element code are used extensively inside US DoD labs and by some NATO partners for blast and penetration analysis; to the best of my knowledge these are still almost entirely Fortran. I think the LS-DYNA and ABAQUS FEM codes are also still mostly Fortran. Two of NASA's workhorse CFD codes, OVERFLOW and FUN3D, are still mostly Fortran, and I think Fortran still has a large footprint in the CWO (Climate, Weather, Ocean) modeling community. The DoD HPC center I worked at a few years ago kept statistics on the types of codes (and the languages they were written in) that burned the most cycles on their systems (very large Cray and SGI systems at the time). In most years, Fortran codes burned 65 to 70 percent of all the cycles on those systems.
Most recently I use it when I need to do some jackknife re-sampling. A naive implementation in Python was too slow, so I wrote Python bindings to the Fortran code. It also gets used directly in some Fortran analysis code for scale setting in lattice quantum chromodynamics (QCD).
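For anyone unfamiliar, the core of delete-1 jackknife is tiny; here is a self-contained sketch (my own toy, not the code from this post) estimating a mean and its jackknife standard error:

```fortran
! Hypothetical sketch: delete-1 jackknife estimate of the mean and its
! standard error for a 1-D sample.
program jackknife_demo
  implicit none
  integer, parameter :: dp = kind(1.0d0)
  real(dp) :: x(5) = [1.0_dp, 2.0_dp, 3.0_dp, 4.0_dp, 5.0_dp]
  real(dp) :: theta(size(x)), mean_jk, se_jk
  integer :: i, n

  n = size(x)
  ! Leave-one-out estimates: mean of the sample with element i removed.
  do i = 1, n
     theta(i) = (sum(x) - x(i)) / real(n - 1, dp)
  end do
  mean_jk = sum(theta) / real(n, dp)
  ! Jackknife standard error uses the (n-1)/n inflation factor.
  se_jk = sqrt(real(n - 1, dp) / real(n, dp) * sum((theta - mean_jk)**2))

  print '(a,f8.4)', 'jackknife mean = ', mean_jk
  print '(a,f8.4)', 'jackknife s.e. = ', se_jk
end program jackknife_demo
```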
A couple of years ago I rewrote some of our old F77 simulation code in modern Fortran. I used coarrays, which was probably a mistake at the time. This code can be thought of as evolving the solution of a differential equation from some initial conditions on some background.
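The basic coarray pattern for that kind of time-stepping looks something like the following toy (my own minimal example, not the rewritten code): each image owns a slab of the solution, exchanges halo cells through the coarray, then takes an explicit step. Compile with, e.g., `gfortran -fcoarray=single` or `-fcoarray=lib` plus OpenCoarrays.

```fortran
! Minimal coarray sketch: explicit time-stepping of du/dt = laplacian(u)
! on a 1-D domain split across images, with periodic halo exchange.
program coarray_euler
  implicit none
  integer, parameter :: dp = kind(1.0d0), nloc = 4
  real(dp) :: u(0:nloc+1)[*]        ! local slab plus two halo cells
  real(dp) :: unew(nloc)
  real(dp), parameter :: dt = 0.1_dp
  integer :: me, np, i, step

  me = this_image(); np = num_images()
  u = real(me, dp)                  ! trivial per-image initial condition

  do step = 1, 10
     sync all
     ! Pull halo values from neighbouring images (periodic for simplicity).
     u(0)      = u(nloc)[merge(np, me - 1, me == 1)]
     u(nloc+1) = u(1)[merge(1, me + 1, me == np)]
     sync all
     ! One explicit Euler step on the local slab.
     do i = 1, nloc
        unew(i) = u(i) + dt * (u(i-1) - 2.0_dp*u(i) + u(i+1))
     end do
     u(1:nloc) = unew
  end do
end program coarray_euler
```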
I have another project for doing lattice QCD-related things which is not just analysis. This is a bit of a side project, mainly to explore writing some modern Fortran from scratch.
My group uses Fortran + MPI to simulate binary alloy solidification at the microstructure level, coupled with fluid flow, magnetohydrodynamics, and structural mechanics. The codebase implements a cellular automaton method for solidification, a D3Q19 lattice-Boltzmann method for fluid flow, and a thermoelectrics solver for TEMHD.
Simulating the microstructure of real industrial components can push the domain size to O(10B) cells, the compute to hundreds up to ~2,000 cores, and the I/O to hundreds of GB per output.
The codebase was written in F90 but borrowed a lot from imperative F77 style. Architecture and design were whatever each developer believed to be good enough at the time.
I’ve spent some time modernizing and improving the codebase: fpm, (some) stdlib, modern interfaces using derived types for MPI and for states/solvers, modern configuration using YAML + pydantic, etc.
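By "modern interfaces using derived types" I mean roughly the following pattern (names are illustrative, not our actual API): a solver type that owns its state, with type-bound procedures instead of long argument lists threaded through every call site.

```fortran
! Illustrative sketch of a derived-type solver interface.
module solver_mod
  implicit none
  private
  public :: lbm_solver

  type :: lbm_solver
     integer :: nx = 0, ny = 0, nz = 0
     real, allocatable :: f(:,:,:,:)   ! D3Q19 distributions, 19 per cell
   contains
     procedure :: init
     procedure :: step
  end type lbm_solver

contains

  subroutine init(self, nx, ny, nz)
    class(lbm_solver), intent(inout) :: self
    integer, intent(in) :: nx, ny, nz
    self%nx = nx; self%ny = ny; self%nz = nz
    allocate(self%f(19, nx, ny, nz), source=0.0)
  end subroutine init

  subroutine step(self)
    class(lbm_solver), intent(inout) :: self
    ! Collide-and-stream would go here; MPI rank and halo handling can
    ! live inside the type instead of being passed to every routine.
  end subroutine step

end module solver_mod
```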
My one attempt to use Fortran involved yet another port (from BASIC) of my pet Trek clone. I stopped upon realizing it was a step back from most other versions and didn’t offer me anything new.
Although now retired, I used Fortran for a wide range of radiotherapy-related models, from the response of tumour and normal tissue to radiation, right up to a national model of cancer incidence and demand for radiotherapy equipment and staff.
I am now working on pressure logging in our local water supply, and a model of flow and pressure distribution to take evidence to our water supply company! All in Fortran, of course.