What are the advantages of Julia over Fortran?

In the linked issue, LoopVectorization was substituting those definitions with the ones in SLEEFPirates.jl/trig.jl (JuliaSIMD/SLEEFPirates.jl on GitHub), which are fully vectorized. I’ve looked for better scalar sin and cos implementations, but haven’t found much yet.

Here is an interesting post on some of the current drawbacks of Julia for numerical computing, specifically, ML. Although the post is from the perspective of a Python programmer, the Julia weaknesses listed in the comment are rather generic. I mention a few of them below to shed more light on some of Julia’s advantages listed in this thread.

(A) Poor documentation. I see many people complaining about this. Of course, this can happen to any language, but it reflects the state and quality of the “6000” packages in the Julia ecosystem.

(B) Inscrutable compiler error messages, similar to C++.
(D) Low code quality. The original poster lists specific examples. This again points to the current low quality of the existing 6000 Julia packages.

Bugs also seem to be a serious issue plaguing the Julia compiler itself, which argues against Julia’s current philosophy of having a single compiler developed by a single company. Bugs happen in all compilers, but when only one compiler is available, they become a severe issue.
In effect, compiler diversity like that of Fortran and C++ seems to offer more benefits than the alternative of a single compiler, which is prone to single-point-of-failure (SPOF) catastrophes. It may be tempting to believe that such a SPOF cannot happen in a team effort like the Julia compiler development, but reality speaks against Julia’s philosophy: there are already examples like this specific Julia instance, or more catastrophic external examples like the Boeing 737 Max scandal. Diversity is good, even for compilers.

3 Likes

Here is the experience of one of the users of Julia’s Automatic Differentiation toolbox shared on Julia Discourse:

Like @patrick-kidger, I have been bit by incorrect gradient bugs in Zygote/ReverseDiff.jl. This cost me weeks of my life and has thoroughly shaken my confidence in the entire Julia AD landscape. As a result, I now avoid using Julia AD frameworks if I can. At a minimum, I cross-check all of their results against JAX… at which point I might as well just use the JAX implementation. (Excited to check out Diffractor.jl when it’s ready though!)

In all my years of working with PyTorch/TF/JAX I have not once encountered an incorrect gradient bug.

1 Like

The cited post does say

Even in the major well-known well-respected Julia packages, I see obvious cases of unused local variables, dead code branches that can never be reached, etc.

and given that there are 6000 packages I am sure the quality varies, but a general statement that the packages are “low quality” seems subjective and probably unfair. To make a supportable generalization I think one needs to perform a large-scale static analysis of Julia packages and quantify the frequency of various types of code defects.

6 Likes

The whole thread could do with a freeze for now, instead of veering further into the absurd.

Readers would do well to take a pause and check back in X years, say when a redistributable “Hello World!” program is a realistic possibility with Julia and when REPL options for Fortran are more widespread, notwithstanding the fact that such comparisons are ultimately meaningless …

4 Likes

The quality of the Julia packages reflects both a strength and a weakness of the Julia model.

On the one hand, there are many packages in the registry because it is very easy to deploy a new package, and there is only one package manager, which is very good. That makes it easy for a developer to deploy software.

On the other hand, that ease allows many bad or even empty packages to exist in the registry, which can confuse the general user. Being in the registry is not a measure of anything, really, yet users may think that being registered certifies a minimum level of quality. It does not, and sometimes the most “natural” package name was taken by an abandoned package, while the “good” package for the same task has a different name, which is not necessarily easy to discover.

In that thread I think they were specifically referring to important packages of the ML ecosystem, and comparing them to the big packages of the Python world, which are backed by huge companies and have no counterparts in any other language.

1 Like

R/CRAN has a more restrictive policy (but still has 20K+ packages!). I think the Fortran Packages list is trying to be more like CRAN, although anyone can post Fortran code on GitHub with FPM build instructions.

1 Like

Such a model would probably not fit Julia. Because of generics and multiple dispatch, small packages are actually encouraged, because they allow for great composability. Packages can be as small as this one. They sort of play the role of modules in other languages, providing specific functionality that can be used in other packages, independently of the type of object the other package deals with. Other packages are interfaces between different packages, providing some specific bridging functionality. This composability is important in Julia, but it also causes functionality to be spread across more packages, and this is one criticism raised in that ML thread, relative to the large monolithic packages available in Python that provide a self-consistent user experience, even if not really integrated with the language itself.
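To give a flavour of that composability (a minimal sketch; the function and type names here are illustrative, not from the thread): one tiny "package" defines a generic algorithm, another defines an unrelated type, and multiple dispatch lets them compose with no glue code.

```julia
# A hypothetical micro-package: it knows nothing about concrete types,
# only that `distance` is defined for whatever it is given.
nearest(query, points) = argmin(p -> distance(query, p), points)

# Another "package" defines its own type and extends `distance`;
# `nearest` now works for it automatically.
struct City
    name::String
    lat::Float64
    lon::Float64
end

distance(a::City, b::City) = hypot(a.lat - b.lat, a.lon - b.lon)

paris  = City("Paris", 48.86, 2.35)
berlin = City("Berlin", 52.52, 13.40)
rome   = City("Rome", 41.90, 12.50)

nearest(paris, [berlin, rome])  # Berlin is nearest here
```

The point is that neither piece needs to know about the other in advance; this is the "module-sized package" pattern described above.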

Note that IfElse.jl will be unnecessary as of Julia 1.8.

1 Like

This thread on the same topic on the Julia Discourse, created by the OP himself, might be relevant here. I very much liked this answer in that thread, which explains it with a very good analogy.

2 Likes

@rashidrafeek I agree that Julia is currently far ahead of Fortran in terms of available packages and ease of use. However, I think we can completely fix that for Fortran with fpm and the ecosystem of packages that people have been creating (although it will take some time to build the ecosystem), LFortran for interactive use, etc. So right now I think that post is correct about Julia vs Fortran, but things will change once we improve the tooling around Fortran. The differences will then come down to intrinsic language features, pros and cons, and I think it is too early to make a judgement, but let’s just say I would not be surprised if a lot of people find Fortran very attractive.

7 Likes

I personally have not found that Julia has much advantage over Fortran in terms of performance and ease of programming.

The only thing that slightly bothers me about Fortran is that the people who developed the nice ODEPACK (mainly Alan Hindmarsh et al.) decided in the 1990s to use C instead of Fortran for future new solvers. Those new solvers include CVODE and eventually evolved into SUNDIALS.
The main reason they decided to switch to C was that, at the time, Fortran seemed to lack some of the modern features C offered, such as support for modular design.

On page 2 it says,

It may be a lesson for Fortran.
My shallow understanding is that if, in the 80s and 90s, Fortran had been able to quickly absorb modern features (especially modular design) from competing languages such as C, then CVODE, and eventually SUNDIALS, might have remained in Fortran instead of C.

Currently my code includes FLINT (which has DOP54, DOP853, Verner65, Verner97), DVODE, RKC, and several of Hairer’s solvers such as radau5, radau, rodas, seulex, rock4/rock2, and odex. Overall they work well. But I guess if I have to solve a system of hundreds or thousands of ODEs with a sparse Jacobian, I will perhaps need SUNDIALS.

1 Like

I don’t know the specifics, but it is true that it can be tough to write “generic” code in Fortran. That is one of the motivations for generics. In Julia, by contrast, you can write quite generic code and reuse it.

Oftentimes I have found that people simply migrated from Fortran to C or C++ because those languages have been better supported in the past; in modern times the driver is typically GPU support. I am hoping to make LFortran work very well on GPUs down the road.

2 Likes

metaprogramming & multiple dispatch.

3 Likes

As a fan of playing with Lisp for fun but not really using it for real problems, I find it quite appealing that Julia has some Lisp-like metaprogramming and multiple-dispatch/multimethod concepts while also being an array-oriented numerical language.
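To show what the Lisp-like side looks like (a minimal sketch of my own, not from any package): Julia code is itself a data structure that can be inspected and rewritten before it runs.

```julia
# Julia expressions are data (homoiconicity, as in Lisp):
ex = :(a + b * c)
ex.head   # :call — the expression is a function-call node
ex.args   # Any[:+, :a, :(b * c)] — operator and operands as data

# A toy macro that rewrites code before execution: it turns a + into a -.
macro flipsign(e)
    e.args[1] == :+ ? Expr(:call, :-, e.args[2:end]...) : e
end

@flipsign(10 + 3)   # 7, because the macro rewrote it to 10 - 3
```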

Something I find a bit lacking in Fortran as a general math programming language (though perhaps not impacting its primary numerical use cases!) is the ability to work at a higher level of abstraction. E.g., being able to delay evaluation of expressions, use lambdas, write generic functions, etc. would be very cool but isn’t currently (easily) possible in Fortran as far as I’m aware.
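For concreteness, a minimal Julia sketch of those three abstractions (lambdas, delayed evaluation via a closure, and a generic higher-order function); all names are illustrative:

```julia
# A lambda (anonymous function):
square = x -> x^2

# Delayed evaluation via a closure: nothing is computed until we call it.
delayed = () -> sum(square(i) for i in 1:10)
delayed()           # 385

# A generic higher-order function: works for any callables passed in.
compose(f, g) = x -> f(g(x))
inc_then_square = compose(square, x -> x + 1)
inc_then_square(3)  # 16
```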

I also think that, as modern numerical computing increasingly merges with ‘data science’ type problems (e.g. ‘data-centric engineering’), the ability to do both concrete linear algebra/array computations and more abstract but still mathematical computation will become more relevant for Fortran. Julia is currently seeing adoption in both the ‘solve differential equations’ and ‘do data analysis’ areas, also leading to cool (depending on your taste :wink: ) synthesis projects like the ‘scientific machine learning’ stuff, primarily in Julia as far as I can tell (e.g. https://sciml.ai/), though similar initiatives likely exist in the Python ecosystem. I think this ability to work at higher levels of abstraction plays some role in this, though I can’t really back that up beyond personal opinion.

5 Likes