What advantages does Fortran have over alternative languages like Chapel or Julia?

Can you elaborate on your experience attempting numerical computing in C++?

Julia sounds promising, but its array conventions threw me for a loop. Most other languages follow the convention that array access with () is 1-based and access with [] is 0-based. Julia uses [] with 1-based indexing. Almost as bad as VB.Net using () with 0-based indexing.
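
To illustrate the mismatch:

v = [10, 20, 30]   # a Julia vector
v[1]               # first element (10): square brackets, but 1-based; v[0] throws BoundsError
# for comparison, Fortran uses v(1) for the first element, while C/C++ use v[0]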

Overall, Fortran feels a lot more natural to me for creating mathematical models. But for anything else it is still very clunky.

Btw, there is a thread on these topics at the Julia discourse here:

I answered there a few times. To be fair to Julia I think they have improved a lot since Julia 1.0.0.

6 Likes

I don’t see the connection between using () or [] and 0- or 1-based indexing. R uses [] and 1-based indexing.

1 Like

I don’t use C++, but I have seen C++ programmers recommend Armadillo. Statistical algorithms rely on linear algebra, and a substantial fraction of base R code is in FORTRAN. (R is an open-source statistical language.) However, most new R packages that use compiled code use C or C++, using the Rcpp and RcppArmadillo packages. Hardly any R packages use modern Fortran. A 2015 thread in an R developers mailing list asserted that R packages using Fortran 90 are not portable. Machine learning libraries are written in C++ with Python and R interfaces.

Modern Fortran does have better support for multidimensional arrays than C++, but in some domains that heavily use arrays it is ignored.

3 Likes

A factor of two! Nim transpiles to C, and I thought C compilers had caught up with Fortran in speed. But a lot depends on the actual code, of course!

Python, too, survived without corporate backing, until its popularity exploded.

Thanks for linking; this Julia thread provides some interesting outside perspectives on Fortran. The main fault identified with Fortran seems to be only a lack of tooling – fortunately, many in this community are working on improving exactly this :slight_smile:.

The few times I’ve tried out Julia projects, I’ve had to install a specific version of Julia to get the specific project working – as an immature language it’s still moving really fast. Maintaining packages in this fast-moving environment must be a lot of effort!

The other thing about Julia, mentioned in the linked thread, is that you can’t compile to a small stand-alone executable – this is what stopped me when I first considered picking up Julia.

3 Likes

As a Fortran and, recently, Julia programmer who admires both, I will give one example of one of the fundamental qualities of Julia. Let us define a function that computes an “energy” between two sets of points (the sum of 1/d for every pair):

julia> function energy(v1,v2)
         u = 0
         for x in v1
           for y in v2
             u += 1 / norm(x-y)
           end
         end
         return u
       end
energy (generic function with 1 method)

This function is completely agnostic to the type of its input. For example, if we have two one-dimensional sets of points, we would do:

julia> norm(x::Float64) = abs(x)
norm (generic function with 2 methods)

julia> v1 = rand(100); v2 = rand(100);

julia> energy(v1,v2)
309242.371744854

What we did is define what norm means for the type of input we are providing. Now let us do the same for a three-dimensional particle type that we just invented:

julia> struct Particle
         x::Float64
         y::Float64
         z::Float64
       end

julia> norm(p::Particle) = sqrt(p.x^2 + p.y^2 + p.z^2)
norm (generic function with 2 methods)

julia> import Base.-

julia> -(p1::Particle,p2::Particle) = Particle(p1.x-p2.x,p1.y-p2.y,p1.z-p2.z)
- (generic function with 198 methods)

julia> v1 = [ Particle(rand(3)...) for i in 1:100 ];

julia> v2 = [ Particle(rand(3)...) for j in 1:100 ];

julia> energy(v1,v2)
19187.000577283474

What we did there is define the Particle type and define what it means to compute the norm and the subtraction of such particles; that is enough to make the energy function work without any modification. The above code is also performant. (As a side note, the indexing of the arrays is completely irrelevant there.)

That possibility of writing generic code is a very powerful concept. When one writes code for a specific application, where all the types involved are known, that might not be very interesting. But the Julia ecosystem thrives on implementations of algorithms which, though originally written for a specific problem, can be applied with no modification whatsoever to a number of other applications.

As someone who came to admire Julia quite a lot, it is true that I fear that its development still depends on a handful of very talented people, but still only a handful. I would not advise anyone not to learn Fortran if they aim to do high-performance numerical computing. But I would advise also learning Julia. The mental model, issues, and solutions that appear in each language help a lot in becoming a better coder in the other. And their syntaxes are similar (both very natural, IMO).

The strictly objective advantages of Fortran over Julia that I can see now are:

  1. Generate small and standalone binaries.
  2. Once compiled, runs fast (no compilation overhead), which may be important for a program that has to be executed thousands of times.
  3. Better debug and error messages (type declarations surely help in this).
  4. It is likely that a beginner will obtain faster code in their first attempts to write something in Fortran than in Julia. In Julia one needs to learn something of a mental model for how to obtain type-stable code and avoid allocations (for example, in the above, removing the Float64 from the Particle fields results in code that is 60x slower; see the sketch below). That has a learning curve, and it is common for novices to be frustrated with performance on their first attempts. But learning those concepts will help you write better Fortran (and better Python, Java, etc. as well).
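
To make point 4 concrete, here is a minimal sketch (the struct names are illustrative, not from the example above); the only difference is whether the fields have concrete types, which is what lets the compiler specialize energy for them:

struct TypedParticle       # concrete field types: the compiler can infer everything
  x::Float64
  y::Float64
  z::Float64
end

struct UntypedParticle     # fields default to Any: every access boxes the value and dispatches dynamically
  x
  y
  z
end

Defining norm and - for both and timing energy on vectors of each (for example with @btime from BenchmarkTools) exposes the difference.
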
4 Likes

Thanks for the example! There is no doubt that Julia’s high-level features are enticing. But I have the feeling that something comparable (though not identical) to your example is also achievable in Fortran. Unfortunately, I am too constrained by several deadlines right now to explore and offer viable Fortran solutions. Fortran’s metaprogramming features still have a lot of room for improvement. The elemental attribute, static and dynamic polymorphism, and parameterized derived types (as well as the Fortran preprocessor) are the only features I can recall right now that aid generic programming in Fortran. I know a subgroup of the Fortran standards committee is working on this issue right now for the next standard, Fortran 2023 (@certik).

But aside from these examples, there are research problems that require a much more robust and mature language than Julia in its current state. For example, my research applications heavily require MPI parallelism. Fortran, along with C, is the only language that has been officially supported by the MPI standard for the past two decades. Not even C++ managed to enter and remain viable in the standard for more than a few years. There have been several high-quality, stable MPI libraries for Fortran for more than two decades, and coarray one-sided parallel communication in Fortran for more than a decade.

Now compare the status of MPI/coarray parallelism in Fortran with Julia’s MPI: a package developed by users and enthusiasts (not a standards committee) that currently does not even pass its own tests. How can I rely on such a state of parallelism for work that I would hope to last for decades? I understand that this is just the beginning for Julia. But I personally cannot sacrifice my career and time for the sake of helping a new language stand on its feet. (I am not biased against Julia; I simply cannot do it for any language, be it C, Fortran, Python, ….)

Having a stable language and interface means a lot to researchers, at least to me. I have had heated arguments with some Python package developers who carelessly change the API of a package used by over 100K users across the world, without realizing that the minor interface-breaking changes they introduce cost society millions of dollars. It is easy math: in one case, I estimated the cost of a single-character change in the syntax of a Python package at ~$8,000,000, equivalent to ~200 years of developer time (for instance, if each of the ~100K affected users spends just a few hours adapting and re-validating code, that already adds up to hundreds of thousands of hours). Rather than continuing to argue with the Python community, I decided not to implement any serious work in Python and not to rely on any third-party Python package for any work that should last.

What if the same happens in Julia (which seems to be happening; take the @avx macro as an example)?
What if the MPI.jl developers suddenly abandon the project? I know Julia people would say no, that will not happen. But look at the reality of other famous packages in other languages, and look at the reality of the status of MPI.jl on GitHub.

Over the years, I have seriously tried to move on to other languages for software development and daily research. But the experience described by @difference-scheme on this page repeats itself again and again.

7 Likes

I do not disagree with any of that. I certainly think that you are on more solid ground with Fortran. The software I have developed that is most used (Packmol) is written in Fortran (pretty ugly code, by the way), and time passes by; it is almost 20 years old now. There it is, and I am completely sure that every Fortran compiler compiles it.

I always think that the thing the Julia people got wrong is that they could have done what they did with Fortran itself :slight_smile: . It would, of course, be more restrained, but I think that porting the most important feature of Julia (multiple dispatch) to Fortran is not impossible. Fortran would become something else (it would become Julia), but with backwards compatibility, since disabling multiple dispatch on type-annotated code (a request that appears now and then in the Julia community) would make it Fortran-like.

On the other hand, I did move from Fortran to Julia in my research over the last year and a half. The fun part of programming caught me. And the tooling (building documentation, benchmarking, etc.) led me to write code that is faster, cleaner, easier to distribute, and more maintainable (by me) than anything I have ever written in Fortran. But what I most sincerely think is that learning Julia is a wonderful learning experience. The range of things one learns, from high level to low level, has no parallel in any other language, and it results from the nature of Julia itself as a high-level yet performance-oriented language.

4 Likes

[Fortran] is suddenly hot again. But its future is still far from certain
One of the oldest coding languages is going through something of a renaissance. But can it really hold its own against newer options?
by Liam Tung | May 5, 2021
ZDNet

Clune says the two “must have” features the committee is working towards include exception handling and generic programming – features available in other languages.

However, exception handling has been dropped due to some major challenges in how to mesh exceptions with other Fortran features, as well as some disagreements about specific details, according to Clune. And that next F202Y update might not arrive until the end of this decade – by which time Fortran will be 73 years old.

“It is difficult to predict when F202Y will actually happen. My hope is that it will be no later than 2030, and that I will have the opportunity to exploit generic programming in Fortran before I retire. And, yes, nine years is an eternity in the modern software world,” he says.

Fortran has had an incredible run, and still has its admirers. And with decades of Fortran code still running efficiently around the world, it’s not going to disappear any time soon. But unless it can catch up with its newer rivals, it may struggle to gain many new fans, or hold on to the ones it has now.

I pasted the conclusion of the article above. Earlier in the article, @certik is quoted at length.

3 Likes

What’s wrong with Julia’s MPI? I use it and it works fine. MPI is just a library; you can call it from C, Fortran, Julia, or whatever language you want, and it’ll do the same thing. Regarding future-proofing, can’t you just specify a version of the language and of the packages you use? It’ll work the same in ten years.
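
For instance (a sketch, assuming the project ships its Project.toml and Manifest.toml), anyone can recreate the exact environment later with:

using Pkg
Pkg.activate(".")    # use the project in the current directory
Pkg.instantiate()    # install exactly the package versions recorded in Manifest.toml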

Will it implement the new future features? Does it currently have MPI 3? Does it currently pass its own unit tests? GitHub - JuliaParallel/MPI.jl: MPI wrappers for Julia

It appears it mostly does (Add Scatterv example based on #469 (#470) · JuliaParallel/MPI.jl@a584b85 · GitHub). It’s not unusual, when testing on tens of platforms, for some of them to fail (for instance because the platform is badly configured).

MPI.jl is just a set of wrappers that adds some nice functionality (e.g. not having to specify the sizes of the arrays): if you want functionality not supported by MPI.jl, you can just call the MPI library directly, as if you were in Fortran.
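
As a concrete illustration, here is a minimal sketch of typical MPI.jl usage; it assumes only the basic wrapper calls (MPI.Init, MPI.COMM_WORLD, MPI.Comm_rank, MPI.Allreduce) and makes no claim about coverage of the rest of the standard:

using MPI

MPI.Init()
comm = MPI.COMM_WORLD
rank = MPI.Comm_rank(comm)

# each rank contributes one number; the wrapper works out types and sizes
total = MPI.Allreduce(rank + 1, +, comm)

if rank == 0
    println("sum over all ranks = ", total)
end

MPI.Finalize()

You would launch it with mpiexec (or the mpiexecjl helper that MPI.jl can install), e.g. mpiexec -n 4 julia script.jl.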

@shahmoradi , you’re raising excellent points.

However, as you well know, there is so much energy and zeal on the side of new language entrants, particularly in the FOSS domain, that you will get every reply in the affirmative now!!

The unanswerable questions pertain to the situation 5, 10, … 25, … 50, … 75 years from now … before one can even begin comparing with Fortran! Or to whatever else comes along to catch all the attention of the next batch of enthusiasts!

It’s probably worth noting that MPI.jl has been used on one of NERSC’s supercomputers (with 650,000 cores), so it already works well enough for real world work (Parallel Supercomputing for Astronomy - Julia Computing). It is also being used as part of the CLIMA project (next gen climate modeling)

1 Like

I cannot think of any language with a stronger presence and with a more passionate user base than C++. This is the fate of its MPI binding: C++ MPI standard 3 - Stack Overflow
Again, I am delighted to see Julia making progress. But if you ask me right now to move my research projects to Julia, I’d rather wait for Julia to pass the test of time, as @FortranFan explained.

Julia also has a direct way of calling the C MPI bindings. With this mindset, why would one even want to use MPI.jl and risk the fate of the C++ packages that relied on the C++ MPI bindings for a decade (and then had to revert to the C bindings)?

I think the discussion we are having here is never-ending. I believe standardization and stability are the number one priority, and some others on this page believe otherwise. And that is totally fine, from my perspective.

2 Likes

If we could distribute binaries easily, that would be a not-so-bad option. But expecting users to download a years-old version of the interpreter, run the software within it, perhaps track down the documentation of that old version, and “unlearn” whatever is newer is not realistic.

I think Julia is fantastic and, as I mentioned, I recently moved my research to it. But I do expect to have to maintain the packages carefully in the coming years. Even dependency management is still far from ideal, with most packages below a 1.0 release. Still, I think it will pay off; I see many, many advantages, but it is still a bet.

1 Like