For over four decades, Dongarra has been the primary implementor or principal investigator for many libraries such as LINPACK, BLAS, LAPACK, ScaLAPACK, PLASMA, MAGMA, and SLATE. These libraries have been written for single processors, parallel computers, multicore nodes, and multiple GPUs per node. His software libraries are used, practically universally, for high performance scientific and engineering computation on machines ranging from laptops to the world's fastest supercomputers.

The current BALLISTIC (Basic ALgebra LIbraries for Sustainable Technology with Interdisciplinary Collaboration) project, for which he is principal investigator, is written in C.

Dongarra was interviewed about his prize.

I think LFortran aims to have some of the functionality of Julia.

While hardware speeds up matrix multiplication, Dongarra is, again, mindful of the needs of the scientists and the software writer. "I grew up writing FORTRAN, and today we have much better mechanisms" such as the Julia programming language and Jupyter Notebooks.

What's needed now, he said, are more ways to "express those computations in an easy way," meaning linear algebra computations such as matrix multiplication. Specifically, more tools are needed to abstract the details. "Making the scientist more productive is the right way to go," he said.

Asked what software programming paradigm should perhaps take over, Dongarra suggested the Julia language is one good candidate, and MATLAB is a good example of the kind of thing that's needed. It all comes back to the tension that Dongarra has navigated for fifty years: "I want to easily express things, and get the performance of the underlying hardware."
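To make that tension concrete, here is a minimal C sketch (my own illustration, not code from any of the libraries mentioned) of the matrix multiplication a scientist might write by hand. It is easy to express, but an optimized routine such as BLAS `dgemm` computes the same thing far faster by adding blocking, vectorization, and threading, which is exactly the detail Dongarra wants abstracted away.

```c
#include <assert.h>

/* Naive triple-loop matrix multiply: C = A * B for n x n matrices
 * stored in row-major order. This is the "easy to express" version;
 * a tuned library routine (e.g. BLAS dgemm) performs the same
 * operation with cache blocking and threading to approach peak. */
static void matmul(int n, const double *a, const double *b, double *c)
{
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++) {
            double s = 0.0;
            for (int k = 0; k < n; k++)
                s += a[i * n + k] * b[k * n + j];
            c[i * n + j] = s;
        }
}
```

High-level languages like Julia or MATLAB let the user write `A * B` and dispatch to the tuned routine underneath, which is the combination of easy expression and hardware performance Dongarra describes.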

As far as what interests him now as a scientist, Dongarra is schooling himself in the various machine-learning and AI techniques built on top of all that linear algebra code. He has a strong belief in the benefits that will come from AI for engineering and science.

"Machine learning is a tremendous tool to help solve scientific problems," said Dongarra. "We're just at the beginning of understanding how we can use AI and machine learning to help with that."

"It's not going to solve our problems," he said, "it's going to be the thing that helps us solve our problems."

We will ensure the longevity of BALLISTIC beyond the individual tasks and will continue to incorporate new software standards and update current software standards by: (1) using the OpenMP standard for multithreading to ensure portability between current and future multi-core architectures (e.g., Intel x86, ARM, IBM POWER); (2) using the OpenMP standard for accelerator offload to guarantee portability between current and future hardware accelerators (e.g., NVIDIA, AMD, and Intel); (3) ensuring that all code is compliant with recent language standards C (C11 and C18) and C++ (C++11, C++14, and C++17) and providing standard-compliant Fortran (2003, 2008, and 2018) interfaces; and (4) maximizing the use of standard numerical library components like BLAS.
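As a rough illustration of item (1), a portable multithreaded loop in C using the OpenMP standard might look like the following (a generic sketch of the technique, not actual BALLISTIC code):

```c
#include <assert.h>

/* y = alpha*x + y (an "axpy" kernel), multithreaded via OpenMP.
 * The pragma is simply ignored by a compiler invoked without
 * OpenMP support, so the identical source runs serially or in
 * parallel on any conforming compiler -- the portability property
 * that item (1) relies on across x86, ARM, and POWER. */
static void daxpy(int n, double alpha, const double *x, double *y)
{
    #pragma omp parallel for
    for (int i = 0; i < n; i++)
        y[i] += alpha * x[i];
}
```

Accelerator offload, item (2), follows the same pattern with `#pragma omp target` directives, letting one source tree cover both host threading and GPU execution.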

I find it slightly paradoxical that Dongarra praises both MATLAB and Julia, while the Julia inventors developed the language partially out of frustration with MATLAB.

The Julia inventors were frustrated with Matlab because it did certain things so well. If it were universally bad, they would never have been frustrated with it in the first place because they wouldn't be thinking about it at all.

By the way, linking directly to HackerNews will cause the linked conversation to get downranked. They have ways to detect people coming into a conversation through a link, and if a bunch of people show up and vote, it gets detected as a voting ring. So it's best to just say "there's a post about this on HackerNews with title ____"

Yeah, this is a good description. Despite some of the truly insane decisions Matlab made (one function per file, out-of-bounds indexing silently expands arrays, etc.), it is one of the cleaner languages for doing linear algebra.

A 1987 paper by Dongarra and Grosse, Distribution of Mathematical Software via Electronic Mail, highlighted in this tweet, may give some ideas for how codes should be distributed today. I am impressed by the R package system. In R, install.packages("foo") not only gets foo but any packages foo depends on.

Well that's another massive PR blow to Fortran, when the creator of some of the most widely-used Fortran libraries in history is telling people they should be using Julia or Matlab nowadays.

One reason Julia and Matlab are recommended is that they have REPLs. I see that @certik has just moved from LANL to a new position at GSI Technology, where he will have more time to work on LFortran, which has a REPL. Best wishes to him.

The last few years I have been working very hard to create a modern Fortran compiler called LFortran. We have designed it so that you can target the intermediate representation, which we call the Abstract Semantic Representation (ASR), from other languages as well. It is a compiler toolchain that any array language can target. I got an opportunity to join GSI Technology to focus full time on this toolchain, to make their APU chips easier to program. We are also developing a new Python frontend called LPython. Both LFortran and LPython are effectively thin frontends that parse to a language-specific Abstract Syntax Tree (AST) and then transform the AST to a language-independent ASR by applying and checking the surface-language semantics.

Both the LPython and LFortran compilers will remain independent open-source projects with no specialization to any particular array chip. I am joining a very good compiler team, and it is an opportunity to deliver on this broader compiler project as well as LFortran and LPython in particular.

If you are interested in collaborating on any of these projects, please let me know. We have intern and other opportunities available.

Thanks @Beliavsky. Indeed I changed jobs to focus on the compiler full time. I think Fortran can be saved, but we need to bring it on par with Julia, Matlab and Python with regards to "I want to easily express things, and get the performance of the underlying hardware." I think LFortran can fix this, once we can compile most projects. If anyone here reading this wants to help, let me know!

It is very unlikely that Fortran will be as interactive as Julia or Matlab. Even if Fortran adds a REPL, I highly doubt that will make a difference. C++ has Cling, but how many people actually use it?
The primary selling point of Fortran is simplicity and speed.

Fortran has been ignoring interactivity since 1980, while Matlab, then Python, and now Julia chip away at the use cases for Fortran. The excuse that Fortran is a compiled language and thus will always be faster than any newfangled REPL is no longer really true. And frankly, being 5% faster is totally irrelevant for many, many use cases. If the goal of Fortran is just to keep the lights on for legacy supercomputer codes at national labs (until they finally replace it all with C++), then let's keep doing what we are doing. But if the goal is to be the go-to language for numerical and scientific computing, then something has to change, because that is not where we are, folks.

Even if, like you said, "The primary selling point of Fortran is simplicity and speed," the way I see it, that exactly lays the foundation for the usefulness of making Fortran interactive. If Fortran were slow and difficult, there would be no point in making it interactive in the first place.

The primary selling point of Fortran is simplicity and speed.

Yes, personally I have found that Fortran is simple compared to C++, and every time I consider switching to C++, this simple argument (one of many points) stops me. Further, doing the science in Fortran rather than C++ does not change the science.

Having simplicity in the code leads to better understanding of the program flow and modification/updates in the future.

TIOBE April 2022: Fortran's rank dropped from 30 to 31 and its rating from 0.39% to 0.35%.

Good old MATLAB is about to drop out of the top 20 for the first time in more than 10 years. The MATLAB programming language is mainly used in the numerical analysis domain. It is often combined with Simulink models, which are from the same MathWorks company. Although MATLAB has a biannual release cycle, the language doesn't evolve that much. And since MATLAB licenses are rather expensive, alternatives are catching up quickly now. Its main competitors are Python (currently number 1) and Julia (moving from position 32 to position 26 this month).