Automatic differentiation of Fortran code, opinions?

Hello community,
I am planning to implement automatic differentiation functionality in code which I have developed in Fortran. What are your opinions regarding the numerous packages listed on http://www.autodiff.org? Does anybody have good experience with one or another and would recommend a specific package?

Thank you
Kind regards

2 Likes

I have had a similar question and need for years. Alas, I have not succeeded in my search. There is only one Fortran AD tool listed on the website that you mentioned, and to my knowledge, it is commercial. I have been thinking of starting an open-source project on implementing AD for a while but that has not materialized yet. If you are successful in your search for an AD tool in Fortran, please let us know here. Thanks!

I have used DNAD successfully in the past to compute Jacobian matrices. Here is the link to a paper with the description. I believe a short introduction can also be found in the PhD thesis of Joshua Hodson:

Numerical Analysis and Spanwise Shape Optimization for Finite Wings of Arbitrary Aspect Ratio, Joshua D. Hodson, Utah State University, 2019

A simple usage example is given in the thesis:

program Circle
#ifdef dnad
    use dnadmod
! The macro only rewrites lower-case "real"; cpp macros are case-sensitive,
! so the upper-case REAL below stays the intrinsic type.
#define real type(dual)
#endif
    implicit none
    REAL, parameter :: pi=3.1416
    real :: r,a
    write(*,*) "Enter a radius: "
    read(*,*) r
    a = pi*r**2
    write(*,*) "Area = ",a
end program Circle

The program is then compiled using gfortran -Ddnad -Dndv=1 dnad.F90 Circle.F90, which replaces the lower-case real declarations with variables of type(dual). Running the program produces the output:

$ ./a.out
Enter a radius
5 1
Area =   78.5400009   31.4159985

We see that on printing the area, the program also prints the derivative vector, which in this example equals the circumference, \partial A/\partial r = 2 \pi r \approx 31.416 for r = 5, matching the second number in the output. In my application, evaluating small Jacobian matrices of size 9 x 9, the DNAD code proved just as fast as hand-coded Jacobians.

One downside is that, the way the code is currently structured, the user has to specify the number of design variables at compile time, meaning it cannot be integrated at multiple places where different numbers of design variables are needed. I’ve been thinking about doing some experiments to combine DNAD with the fypp preprocessor to make it more flexible.
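To illustrate where the compile-time constant enters: the derivative components of the dual type are dimensioned by the preprocessor macro ndv. Roughly (a sketch of the idea only, not the exact DNAD source):

type :: dual
    real :: x        ! function value
    real :: dx(ndv)  ! derivative components; ndv is fixed when dnad.F90 is compiled
end type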

A second code I’ve experimented with is Tapenade, an Automatic Differentiation Engine that can be used online or installed locally as a set of Java classes. The local installation allows you to integrate it into a Makefile (I haven’t tried it). Compared to DNAD, which only implements forward-mode automatic differentiation (using operator overloading), Tapenade also supports reverse-mode differentiation. There are, however, some limitations with respect to which language features are supported for differentiation. From scrolling through the documentation, I believe that if you stick to a “procedural style” with subroutines and functions operating on the default Fortran types (reals and integers), it should work quite well.

One of the hurdles I’ve met is how to differentiate code which relies upon BLAS or LAPACK. In principle I believe it should be possible to download the reference libraries from netlib and either replace all real variables with type(dual) and use DNAD, or run BLAS and LAPACK through Tapenade, but it feels kind of daunting. Recently, a set of reverse-mode differentiated BLAS routines was published in TOMS, but I haven’t found time to dig deeper.

6 Likes

While the list of Fortran AD packages on autodiff.org is indeed large, I believe that most of those are “dead” tools.

The biggest users of AD currently seem to be the machine learning frameworks including TensorFlow, MXNet, PyTorch, and Zygote. I found the overview given in the paper Automatic differentiation in ML: Where we are and where we should be going to be quite informative.

The NAG AD Suite, which I believe @shahmoradi has in mind as a commercial solution, seems to have a strong focus on C++. In fact, the NAG website says:

For simulation programs written in Fortran a version of the NAG Fortran Compiler has been extended to serve as a pre-processor to dco.

where dco/c++ is the tool for automatic differentiation.

2 Likes

Also, this paper was very readable and informative to me for getting a basic understanding of forward and reverse AD modes, etc.


I’ve been learning how to use AD from time to time, and it seems very useful indeed. I was amazed that the differentiation of log det(A) and similar expressions can be done very straightforwardly (e.g. by PyTorch), though that may not be surprising for people familiar with AD.
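(For reference, the identity behind this is the textbook result \partial \log\det(A)/\partial A_{ij} = (A^{-1})_{ji}, i.e. the gradient of \log\det(A) with respect to A is the transposed inverse; nothing here is specific to PyTorch, an AD framework just applies this rule for you.)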
2 Likes

For some of my codes, AD tools were needed, so years ago I developed part of those tools for a specific application. More recently, I’ve started to set up an AD Fortran library:

“github.com/lauvergn/QuantumModelLib/tree/OOP_branch/SRC/dnSLib”

Right now, this library works for scalar multivariate functions and up to third derivatives. It is based on overloading the operators (+, -, *, /, = …) and the intrinsic Fortran functions (sin, cos, exp …).
The library is still part of a larger code, but it should be relatively easy to extract.
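For readers unfamiliar with this approach, here is a minimal sketch of how such an operator-overloaded forward-mode type can work (the type and procedure names are illustrative only, not taken from the library above):

module dual_mod
    implicit none
    type :: dual
        real :: f   ! function value
        real :: d   ! derivative with respect to the chosen independent variable
    end type
    interface operator(*)
        module procedure mul_dd
    end interface
    interface sin
        module procedure sin_d
    end interface
contains
    elemental function mul_dd(a, b) result(c)
        type(dual), intent(in) :: a, b
        type(dual) :: c
        c%f = a%f*b%f
        c%d = a%d*b%f + a%f*b%d   ! product rule
    end function
    elemental function sin_d(a) result(c)
        type(dual), intent(in) :: a
        type(dual) :: c
        c%f = sin(a%f)
        c%d = cos(a%f)*a%d        ! chain rule
    end function
end module

program demo
    use dual_mod
    implicit none
    type(dual) :: x, y
    x = dual(2.0, 1.0)   ! value 2, seed derivative dx/dx = 1
    y = x*sin(x)
    print *, y%f, y%d    ! prints x*sin(x) and its derivative sin(x) + x*cos(x)
end program demo

Every operator and intrinsic function the differentiated code touches has to be overloaded in this way, which is essentially what DNAD and the library above do.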

Another library deals with matrices, but it is less complete.

3 Likes

For some matrix/vector operations (addition, multiplication …) it will work, although there are more efficient procedures. However, diagonalization is trickier due to the iterative procedure. In any case, there are procedures to get the derivatives of the eigenvalues and eigenvectors.
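For example, for a symmetric matrix A with a simple eigenvalue \lambda_i and normalized eigenvector u_i, the standard first-order perturbation result is \partial \lambda_i = u_i^T (\partial A) u_i, so eigenvalue derivatives can be assembled from the converged eigenvectors without differentiating through the iterative solver itself; eigenvector derivatives take a little more work.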

@shahmoradi, first of all, welcome to the community!

If you are interested, I am happy to write a prototype in LFortran; we have an issue open for it:

Let me know if you would be interested in working on this together. If so, let’s brainstorm how it should work.

4 Likes

I’ll be happy to be part of the discussion about AD in particular on adding a new type and overloading operator and intrinsic functions. However, I’m not sure about the meaning of ASR!

1 Like

Hi @lanast,

I have been through many of the Fortran options at http://www.autodiff.org, and as others have mentioned, the vast majority of them are no longer available or are limited to old subsets of Fortran.

May I ask which form of AD you require, Forward or Backward? For Forward mode, there are several OO libraries available, such as in flibs.

The only option for Backward mode that I am aware of is Tapenade, which I can recommend highly. It is robust and works well with a wide subset of modern Fortran. I can confirm that it can be integrated with a makefile quite nicely when installed locally. Moreover, the source-code transformation approach to AD will always provide the best performance, since you can exploit compiler optimization of the derivative code. Tapenade used to be closed-source and licensed for commercial use, but it now looks like it’s open source under MIT, which is brilliant!


May I ask which BLAS/LAPACK operations you need derivatives for, @ivanpribec? My understanding is that it isn’t usually a good idea to differentiate through such routines; rather, it is better to use the normal BLAS/LAPACK operations to implement the analytic derivative. I would recommend the following report on this topic: An extended collection of matrix derivative results for forward and reverse mode algorithmic differentiation. I’ve applied this successfully in a simple MATLAB potential flow solver to get reverse-mode derivatives from the matrix inverse operation.
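As an example of the kind of result collected there (quoted from memory, so worth checking against the report itself): for the matrix inverse C = A^{-1}, forward mode gives \dot{C} = -C \dot{A} C and reverse mode gives \bar{A} = -C^T \bar{C} C^T, both of which are evaluated with ordinary matrix products rather than by differentiating the inversion routine line by line.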

2 Likes

I’m working on a dryer optimization problem. To calculate the objective function I need to solve a 1D diffusion equation, discretized with finite differences in space and the Crank-Nicolson method in time. This leads to a tridiagonal system which I solve using dgtsv. Right now I’m using a derivative-free optimization code, but I’m not happy with the convergence rate. Thank you for the recommendation.

I see, so this is an adjoint problem then. I think you just need to solve the transposed system to back-propagate your sensitivities (see Sec. 2.3.1 in the above reference); this will also be tridiagonal, so you can still use dgtsv for the backward problem.
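A minimal sketch of what the backward solve might look like with dgtsv (the transpose of a tridiagonal matrix just swaps the sub- and super-diagonals, and dgtsv overwrites its inputs, hence the work copies; names and precision here are only illustrative):

subroutine solve_adjoint(n, dl, d, du, rhs)
    ! Solve A**T x = rhs, where A is tridiagonal with sub-diagonal dl,
    ! diagonal d and super-diagonal du. On exit rhs holds the solution x.
    implicit none
    integer, intent(in) :: n
    double precision, intent(in) :: dl(n-1), d(n), du(n-1)
    double precision, intent(inout) :: rhs(n)
    double precision :: dl_t(n-1), d_t(n), du_t(n-1)   ! work copies, dgtsv destroys them
    integer :: info
    external :: dgtsv
    dl_t = du   ! sub-diagonal of A**T is the super-diagonal of A
    d_t  = d
    du_t = dl   ! super-diagonal of A**T is the sub-diagonal of A
    call dgtsv(n, 1, dl_t, d_t, du_t, rhs, n, info)
    if (info /= 0) error stop 'dgtsv failed'
end subroutine solve_adjoint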

1 Like

Perfect. @gardhor, will you be able to join our monthly Fortran call?

Fortran Monthly Call: October 2020

If so, let’s discuss it there.

When is the October Fortran call due?

It will be in about 45 min if you can make it, see the link above for connection information.

I first wrote an AD implementation about 15 years ago. In the recent lockdown I decided to re-implement it to be more extensible to higher derivatives. Currently it supports derivatives up to and including 4th order:

my autodiff

3 Likes

@simong excellent, thanks for posting it!

We discussed this at our latest call; the summary is here: Fortran Monthly Call: October 2020. Can you please also post your code to this issue:

That way we can include it in the new “autodiff” section on our website, which we just agreed at the call to create.

1 Like

Have just posted as requested.

2 Likes

Thanks, @certik, for the link. This would be great! Aside from isolated usages, AD is absolutely required to popularize Fortran in the machine learning community.

2 Likes

I’ve just added an automatic differentiation library on GitHub.

AD_dnSVM

It uses forward mode, up to the third derivative, with an arbitrary number of independent variables.

5 Likes