Automatic differentiation of Fortran code, opinions?

Also, this paper was very readable and informative to me for getting a basic understanding of what the forward/reverse AD modes are, etc…

Now I’ve been learning to use AD from time to time, and it seems very useful indeed. (I was amazed that the differentiation of log det(A), for example, can be done very straightforwardly, e.g. by PyTorch, though that may not be surprising to people familiar with AD.)
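To make the log det example concrete, here is a quick numerical check of the identity that makes it so easy for AD, d/dA log det(A) = (A⁻¹)ᵀ. This is a Python/NumPy sketch; the matrix and the entry checked are arbitrary test choices.

```python
import numpy as np

# Identity behind the log det example: d/dA log det(A) = (A^{-1})^T.
# A, i, j, eps are arbitrary test choices for this sketch.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
analytic = np.linalg.inv(A).T          # full gradient of log det(A)

# central finite difference of entry (i, j) of the gradient
i, j, eps = 0, 1, 1e-6
Ap = A.copy(); Ap[i, j] += eps
Am = A.copy(); Am[i, j] -= eps
fd = (np.log(np.linalg.det(Ap)) - np.log(np.linalg.det(Am))) / (2 * eps)
```

The finite-difference value agrees with the corresponding entry of the analytic gradient to roughly machine-limited accuracy.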

Some of my codes needed AD tools, so years ago I developed part of those tools for a specific application. More recently, I’ve started to set up an AD Fortran library:

Right now, this library works for scalar multivariate functions up to third derivatives. It is based on overloading the operators (+, -, *, /, = …) and all intrinsic Fortran functions (sin, cos, exp …).
The library is still part of a code, but it should be relatively easy to extract.
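The operator-overloading idea maps naturally to a derived type. Here is a minimal forward-mode sketch in Python (the `Dual` class and names below are illustrative, not this library's API; a Fortran derived type with overloaded `+`, `*`, `sin`, … works the same way):

```python
import math

# Minimal forward-mode dual number via operator overloading -- the same
# idea as overloading the operators and intrinsics in a Fortran derived
# type. Only the handful of operations used below are implemented.
class Dual:
    def __init__(self, val, der=0.0):
        self.val = val   # function value
        self.der = der   # derivative w.r.t. the seeded variable

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def sin(x):
    # overloaded intrinsic: chain rule applied to math.sin
    return Dual(math.sin(x.val), math.cos(x.val) * x.der)

# d/dx [x * sin(x)] at x = 2 is sin(2) + 2*cos(2)
x = Dual(2.0, 1.0)   # seed dx/dx = 1
y = x * sin(x)
```

Higher derivatives follow the same pattern, with the derived type carrying more derivative slots and the overloaded rules extended accordingly.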

Another library deals with matrices, but it is less complete.


For some matrix/vector operations (addition, multiplication …) it will work, although more efficient procedures exist. However, diagonalization is trickier because of the iterative procedure. Anyway, there are procedures to get the derivatives of the eigenvalues and eigenvectors.
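For reference, the first-order perturbation result for a symmetric matrix is dλ_k = v_kᵀ dA v_k, which is what lets one obtain eigenvalue derivatives without differentiating the iterative algorithm itself. A quick numerical check (a Python/NumPy sketch with random test matrices):

```python
import numpy as np

# First-order eigenvalue perturbation for a symmetric matrix:
# dlambda_k = v_k^T dA v_k.  Random symmetric test data for the sketch.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4));  A = A + A.T    # symmetric matrix
dA = rng.standard_normal((4, 4)); dA = dA + dA.T # symmetric perturbation

w, V = np.linalg.eigh(A)
analytic = np.array([V[:, k] @ dA @ V[:, k] for k in range(4)])

# central finite difference of the (sorted) eigenvalues along dA
eps = 1e-6
fd = (np.linalg.eigvalsh(A + eps * dA)
      - np.linalg.eigvalsh(A - eps * dA)) / (2 * eps)
```

This holds for distinct eigenvalues; degenerate or clustered eigenvalues (and eigenvector derivatives) need the more careful procedures mentioned above.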

@shahmoradi, first of all, welcome to the community!

If you are interested, I am happy to write a prototype in LFortran, we have an issue open for it:

Let me know if you would be interested in working on this together. If so, let’s brainstorm how it should work.


I’ll be happy to be part of the discussion about AD, in particular on adding a new type and overloading operators and intrinsic functions. However, I’m not sure what ASR means!

1 Like

Hi @lanast,

I have been through many of the Fortran options at, and as others have mentioned, the vast majority of them are no longer available or are limited to old subsets of Fortran.

May I ask which form of AD you require, Forward or Backward? For Forward mode, there are several OO libraries available, such as in flibs.

The only option for Backward mode that I am aware of is Tapenade, which I can recommend highly. It is robust and works well with a wide subset of modern Fortran. I can confirm that it integrates with a makefile quite nicely when installed locally. Moreover, the source-code transformation approach to AD will always provide the best performance, since you can exploit compiler optimization of the derivative code. Tapenade used to be closed-source and licensed for commercial use, but it now looks like it’s open source under MIT, which is brilliant!

May I ask which BLAS/LAPACK operations you need derivatives for @ivanpribec? My understanding of this is that it isn’t usually a good idea to differentiate such routines, rather it is better to use the normal BLAS/LAPACK operations to implement the analytic derivative. I would recommend the following report on this topic: An extended collection of matrix derivative results for forward and reverse mode algorithmic differentiation. I’ve applied this successfully in a simple MATLAB potential flow solver to get reverse mode derivatives from the matrix inverse op.
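As an example of the kind of rule collected in that report: for C = inv(A), the reverse-mode update is Abar = -Cᵀ · Cbar · Cᵀ. A finite-difference sanity check (a Python/NumPy sketch; the matrices are arbitrary test data):

```python
import numpy as np

# Reverse-mode rule for the matrix inverse:
# for C = inv(A), the adjoint is Abar = -C^T @ Cbar @ C^T.
# Checked against finite differences of the scalar J = sum(Cbar * C).
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3)) + 3 * np.eye(3)  # well-conditioned test matrix
Cbar = rng.standard_normal((3, 3))               # adjoint seed dJ/dC

C = np.linalg.inv(A)
Abar = -C.T @ Cbar @ C.T

# central finite difference of one entry (i, j) of Abar
i, j, eps = 0, 2, 1e-6
Ap = A.copy(); Ap[i, j] += eps
Am = A.copy(); Am[i, j] -= eps
fd = (np.sum(Cbar * np.linalg.inv(Ap))
      - np.sum(Cbar * np.linalg.inv(Am))) / (2 * eps)
```

The point is exactly the one made above: the derivative of the BLAS/LAPACK-level operation is expressed through a few more BLAS/LAPACK-level operations, not by differentiating the routine's internals.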


I’m working on a dryer optimization problem. To calculate the objective function I need to solve a 1D diffusion equation discretized with finite differences in space and the Crank-Nicolson method in time. This leads to a tridiagonal matrix problem which I solve using dgtsv. Right now I’m using a derivative-free optimization code, but I’m not happy with the convergence rate. Thank you for the recommendation.

I see, so this is an adjoint problem then. I think you just need to solve the transpose system to back-propagate your sensitivities (see sec. 2.3.1 in the above reference); this will also be tridiagonal, so you can still use dgtsv for the backward problem.
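To sketch the recipe: with x = A⁻¹b and xbar = dJ/dx, solve the transpose system Aᵀλ = xbar; then bbar = λ and Abar = -λxᵀ. A minimal Python/NumPy illustration (a dense solve stands in for dgtsv here; for tridiagonal A the transpose system is tridiagonal too, so dgtsv applies to both):

```python
import numpy as np

# Adjoint of a linear solve x = inv(A) @ b:
# given xbar = dJ/dx, solve A^T lam = xbar, then bbar = lam, Abar = -lam x^T.
# A fixed, diagonally dominant, nonsymmetric tridiagonal test matrix.
n = 5
A = (4.0 * np.eye(n)
     + np.diag(np.ones(n - 1), 1)          # superdiagonal
     + 0.5 * np.diag(np.ones(n - 1), -1))  # subdiagonal
rng = np.random.default_rng(2)
b = rng.standard_normal(n)
xbar = rng.standard_normal(n)              # adjoint seed dJ/dx

x = np.linalg.solve(A, b)       # forward (primal) solve
lam = np.linalg.solve(A.T, xbar)  # backward (transpose) solve
bbar = lam                        # dJ/db
Abar = -np.outer(lam, x)          # dJ/dA

# central finite difference of one entry of bbar, with J = xbar . x
k, eps = 1, 1e-6
bp = b.copy(); bp[k] += eps
bm = b.copy(); bm[k] -= eps
fd = (xbar @ np.linalg.solve(A, bp)
      - xbar @ np.linalg.solve(A, bm)) / (2 * eps)
```

So one extra tridiagonal solve per objective evaluation gives the full gradient, regardless of the number of design variables.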

1 Like

Perfect. @gardhor, will you be able to join our monthly Fortran call?

Fortran Monthly Call: October 2020

If so, let’s discuss it there.

When is the October Fortran call due?

It will be in about 45 min if you can make it, see the link above for connection information.

I first wrote an AD implementation about 15 years ago. In the recent lockdown I decided to re-implement it to be more extensible to higher derivatives. Currently it supports derivatives up to and including 4th order:

my autodiff


@simong excellent, thanks for posting it!

We discussed this at our latest call, summary is here: Fortran Monthly Call: October 2020. Can you please also post your code to this issue:

That way we can include it in the new “autodiff” section of our website, which we just agreed at the call to create.

1 Like

Have just posted as requested.


Thanks, @certik for the link. This would be great! Aside from isolated usages, AD is absolutely required to popularize Fortran in the Machine Learning community.


I’ve just added an automatic differentiation library on GitHub.


It uses forward mode, up to the third derivative and any number of independent variables.


The FortranCalculus (FC) compiler is available and free. Download it at my website, . Its roots go back to NASA’s Apollo Space program that got us to the moon and back. The first language, developed around 1960 by TRW, was named Slang. It has 10+ solvers stored in a library that a user can call by name. These solvers used Automatic Differentiation (AD) to calculate the Jacobian and Hessian matrices. For more, please read my profile.

I’m 77 years old and ready to retire. Anyone interested in taking on my website? My website has some 5+ Apps showing the power of FC, plus the FC compiler & manuals. The average customer downloads ~4 Apps per day, and the number of downloads over the last year (2022) ranged from 50 to 90 per customer! China is the number one country of interest, with 5 to 30 times the number of USA customers. They must be involved in Optimization.


There have been 3 AD-based compilers that I’m aware of. NASA’s Apollo Space program needed faster compilers, as their engineers kept changing their minds (see the movie Hidden Figures?). So NASA hired TRW to write a new type of compiler where solvers were stored in a library and required no development. Around 1960, TRW developed the first AD-based compiler, Slang, which cut the time to solve problems to days instead of years. The first commercial AD compiler, PROSE, was made available in 1974 on CDC computers. Today there is a version named FortranCalculus that became available around 1990. All 3 compilers handle Optimization problems nicely.

1 Like

Slang, PROSE, and FortranCalculus are AD-based compilers.

1 Like

Hi Phil,

welcome to Discourse! I couldn’t find anything about AD-supporting compilers on the NASA Technical Reports Server. What do the initials TRW stand for?

Have you compared FortranCalculus with any AD frameworks in languages that are popular today, like Python, MATLAB or Julia? Do you still operate a company based around your (extended) Fortran compiler?

1 Like