Automatic differentiation of Fortran code, opinions?

I’d be happy to be part of the discussion about AD, in particular on adding a new type and overloading operators and intrinsic functions. However, I’m not sure what ASR means!

1 Like

Hi @lanast,

I have been through many of the Fortran options listed, and as others have mentioned, the vast majority of them are no longer available or are limited to old subsets of Fortran.

May I ask which form of AD you require, Forward or Backward? For Forward mode, there are several OO libraries available, such as in flibs.
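For anyone new to the operator-overloading approach those forward-mode libraries take, here is a minimal sketch of the idea; Python is used for brevity, but a Fortran library does the same thing with a derived type and overloaded operators/intrinsics. All names here are illustrative, not any particular library’s API:

```python
import math

class Dual:
    """Minimal forward-mode AD value: carries f and df/dx together."""
    def __init__(self, val, der=0.0):
        self.val = val   # function value
        self.der = der   # derivative w.r.t. the chosen independent variable

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def sin(x):
    # chain rule for an overloaded intrinsic
    return Dual(math.sin(x.val), math.cos(x.val) * x.der)

# d/dx [x * sin(x)] at x = 2, seeded with der = 1
x = Dual(2.0, 1.0)
y = x * sin(x)
```

Seeding the independent variable with `der = 1` is what selects the direction of differentiation; every overloaded operation then propagates the derivative alongside the value.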

The only option for Backward mode that I am aware of is Tapenade, which I can recommend highly. It is robust and works well with a wide subset of modern Fortran. I can confirm that it integrates quite nicely into a makefile when installed locally. Moreover, the source-code-transformation approach to AD will always provide the best performance, since you can exploit compiler optimization of the derivative code. Tapenade used to be closed-source and licensed for commercial use, but it now looks like it’s open source under the MIT license, which is brilliant!

May I ask which BLAS/LAPACK operations you need derivatives for @ivanpribec? My understanding of this is that it isn’t usually a good idea to differentiate such routines, rather it is better to use the normal BLAS/LAPACK operations to implement the analytic derivative. I would recommend the following report on this topic: An extended collection of matrix derivative results for forward and reverse mode algorithmic differentiation. I’ve applied this successfully in a simple MATLAB potential flow solver to get reverse mode derivatives from the matrix inverse op.
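To illustrate the point, here is a small numerical check (Python/NumPy for brevity; everything here is illustrative) of the reverse-mode rules that report gives for the linear solve x = A⁻¹b: the adjoints are b̄ = A⁻ᵀ x̄ and Ā = −b̄ xᵀ, so you only ever call ordinary solve routines, never differentiate through them:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n)) + n * np.eye(n)  # keep A well-conditioned
b = rng.standard_normal(n)

x = np.linalg.solve(A, b)        # forward op: x = A^{-1} b
xbar = rng.standard_normal(n)    # incoming adjoint dJ/dx

# Reverse-mode rules for the linear solve:
bbar = np.linalg.solve(A.T, xbar)   # dJ/db = A^{-T} xbar
Abar = -np.outer(bbar, x)           # dJ/dA = -bbar x^T

# Finite-difference checks on J = xbar . x
eps = 1e-6
e0 = np.zeros(n); e0[0] = 1.0
J = lambda bb: xbar @ np.linalg.solve(A, bb)
fd_b = (J(b + eps * e0) - J(b - eps * e0)) / (2 * eps)   # ~ bbar[0]

E = np.zeros((n, n)); E[0, 0] = 1.0
JA = lambda AA: xbar @ np.linalg.solve(AA, b)
fd_A = (JA(A + eps * E) - JA(A - eps * E)) / (2 * eps)   # ~ Abar[0, 0]
```

The adjoint of the solve is just another solve with the transposed matrix, which is exactly why differentiating the LAPACK routine itself is unnecessary.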


I’m working on a dryer optimization problem. To calculate the objective function I need to solve a 1D diffusion equation, discretized with finite differences in space and the Crank-Nicolson method in time. This leads to a tridiagonal matrix problem which I solve using dgtsv. Right now I’m using a derivative-free optimization code, but I’m not happy with the convergence rate. Thank you for the recommendation.

I see, so this is an adjoint problem then. I think you just need to solve the transpose system to back-propagate your sensitivities (see sec. 2.3.1 in the above reference); this will also be tridiagonal, so you can still use dgtsv for the backward problem.
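A sketch of that idea, with a plain-Python Thomas algorithm standing in for dgtsv (in the real code you would simply call dgtsv again): since Aᵀ is also tridiagonal, with the sub- and super-diagonals exchanged, the same solver handles the adjoint system.

```python
import numpy as np

def thomas(dl, d, du, b):
    """Solve a tridiagonal system, same data layout as LAPACK's dgtsv:
    dl sub-diagonal (n-1), d diagonal (n), du super-diagonal (n-1)."""
    n = len(d)
    c = du.astype(float).copy()
    dd = d.astype(float).copy()
    x = b.astype(float).copy()
    for i in range(1, n):                  # forward elimination
        m = dl[i - 1] / dd[i - 1]
        dd[i] -= m * c[i - 1]
        x[i] -= m * x[i - 1]
    x[-1] /= dd[-1]
    for i in range(n - 2, -1, -1):         # back substitution
        x[i] = (x[i] - c[i] * x[i + 1]) / dd[i]
    return x

rng = np.random.default_rng(1)
n = 6
dl = rng.standard_normal(n - 1)
du = rng.standard_normal(n - 1)
d = 5.0 + rng.standard_normal(n)          # diagonally dominant
g = rng.standard_normal(n)                # adjoint right-hand side

A = np.diag(d) + np.diag(dl, -1) + np.diag(du, 1)

# Adjoint solve A^T lam = g: pass the swapped diagonals to the same routine.
lam = thomas(du, d, dl, g)
```

So the backward pass costs one more tridiagonal solve per time step, with the roles of `dl` and `du` exchanged.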

1 Like

Perfect. @gardhor, will you be able to join our monthly Fortran call?

Fortran Monthly Call: October 2020

If so, let’s discuss it there.

When is the October Fortran call due?

It will be in about 45 min if you can make it, see the link above for connection information.

I first wrote an AD implementation about 15 years ago. In the recent lockdown I decided to re-implement it to be more extensible to higher derivatives. Currently it supports derivatives up to and including 4th order:

my autodiff
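As a rough illustration of how higher-order forward mode can work (a Python sketch of truncated Taylor-series arithmetic, not the posted library’s actual API), each value carries the scaled derivatives c[k] = f⁽ᵏ⁾/k! and multiplication becomes a truncated Cauchy product:

```python
import math

class Taylor4:
    """Forward-mode value carrying Taylor coefficients c[k] = f^(k)(x0)/k!
    for k = 0..4, so derivatives up to 4th order come out in one sweep."""
    def __init__(self, c):
        self.c = list(c) + [0.0] * (5 - len(c))

    @staticmethod
    def variable(x0):
        return Taylor4([x0, 1.0])        # seed: dx/dx = 1

    def __add__(self, other):
        return Taylor4([a + b for a, b in zip(self.c, other.c)])

    def __mul__(self, other):
        # Cauchy product, truncated at order 4
        return Taylor4([sum(self.c[i] * other.c[k - i]
                            for i in range(k + 1)) for k in range(5)])

    def derivative(self, k):
        return math.factorial(k) * self.c[k]

x = Taylor4.variable(2.0)
f = x * x * x * x                        # f(x) = x^4 at x = 2
```

The nice property is that every intrinsic then needs only its Taylor recurrence, and the order can be raised by extending the coefficient array.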


@simong excellent, thanks for posting it!

We discussed this at our latest call, summary is here: Fortran Monthly Call: October 2020. Can you please also post your code to this issue:

That way we can include it in the new “autodiff” section on our website, which we just agreed at the call to create.

1 Like

Have just posted as requested.


Thanks, @certik, for the link. This would be great! Aside from isolated use cases, AD is absolutely required to popularize Fortran in the Machine Learning community.


I’ve just added an automatic differentiation library on GitHub.


It uses forward mode up to the third derivative and supports an arbitrary number of independent variables.
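For readers wondering how forward mode handles several independent variables at once, the usual trick is to carry a gradient vector and seed each independent with a unit direction. A hedged Python sketch of that pattern (illustrative only, not this library’s API):

```python
import numpy as np

class MDual:
    """Forward-mode value carrying a gradient w.r.t. several independents."""
    def __init__(self, val, grad):
        self.val = val
        self.grad = np.asarray(grad, dtype=float)

    def __add__(self, other):
        return MDual(self.val + other.val, self.grad + other.grad)

    def __mul__(self, other):
        # product rule applied component-wise to the gradient
        return MDual(self.val * other.val,
                     self.grad * other.val + self.val * other.grad)

def independents(*vals):
    """Seed each variable with a unit gradient direction."""
    n = len(vals)
    return [MDual(v, np.eye(n)[i]) for i, v in enumerate(vals)]

x, y = independents(3.0, 5.0)
f = x * y + x * x      # f = xy + x^2, so grad f = (y + 2x, x)
```

One sweep of the program then yields the full gradient, at a cost that grows with the number of independents; that scaling is exactly why reverse mode wins for many-parameter problems.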


The FortranCalculus (FC) compiler is available and free; download it at my website. Its roots start at NASA’s Apollo Space program that got us to the moon and back. The first language, developed around 1960 by TRW, was named Slang. It has 10+ solvers stored in a library that a user can call by name. These solvers used Automatic Differentiation (AD) to calculate the Jacobian and Hessian matrices. For more, please read my profile.

I’m 77 years old and ready to retire. Anyone interested in taking over my website? It has some 5+ apps showing the power of FC, plus the FC compiler & manuals. The average customer downloads ~4 apps per day, and over the last year (2022) the figure ranged from 50 to 90 downloads per customer! China is the number one country of interest; Chinese customers number 5 to 30 times the USA customers. They must be involved in optimization.


There have been 3 AD-based compilers that I’m aware of. NASA’s Apollo Space program had need for faster compilers, as their engineers kept changing their minds (see the movie Hidden Figures?). So NASA hired TRW to write a new type of compiler, where solvers were stored in a library and required no development. Around 1960, TRW developed the first AD-based compiler, Slang, which cut problem-solving time from years to days. The first commercial AD compiler, PROSE, was made available in 1974 on CDC computers. Today there is a version named FortranCalculus that became available around 1990. All 3 compilers handle optimization problems nicely.

1 Like

Slang, PROSE, and FortranCalculus are AD based compilers.

1 Like

Hi Phil,

welcome to Discourse! I couldn’t find anything about AD-supporting compilers on the NASA Technical Reports Server. What do the initials TRW stand for?

Have you compared FortranCalculus with any AD frameworks in languages that are popular today, like Python, MATLAB or Julia? Do you still operate a company based around your (extended) Fortran compiler?

1 Like

The only option for Backward mode that I am aware of is Tapenade which I can recommend highly.

I see from the article (at arxiv:, page 24 of 43) that ADIFOR and NAGWare also work in reverse mode. In addition, a complete list of AD implementations in different languages is given.

1 Like

TRW = Thompson Ramo Wooldridge. Wikipedia will tell you more about it.


Thank you, I’ve found a paper from them:

The Q approach to problem solving (1969) | ACM

According to Google Scholar this paper only has 3 citations! Quite a shame. Too many new ideas at once.

1 Like

Hi @ivanpribec, TRW Systems, Inc. is, I believe, the full name, but sometimes I see it as just TRW, Inc. I believe the main developer was Joseph Thames, who lived in the L.A., CA area, if that helps. Joe gave a SIAM conference presentation on PROSE in 1974 in Palo Alto (?), CA. The conference heads couldn’t believe what he was saying and wouldn’t publish his presentation. Joe sent me a copy, but I’m not sure where it is.

Have you compared FortranCalculus with any AD frameworks in languages that are popular today, like Python, MATLAB or Julia?

No, we are speaking about different worlds. FC / PROSE solve problems that the others have never thought of: for example, nesting of solvers, implicit problems, etc. See my casebook for more. Most problems are solved in less than 50 lines of code. Try it, you’ll love it. Here is my Computer-Aided Designs presentation to high school students that may give you an idea about the FC language. It’s about 30 minutes long, with 8 minutes of questions.

Do you still operate a company based around your (extended) Fortran compiler?

My company, Optimal Designs Enterprise (ODE), has just me. It took me some 22 years just to write my casebook: solving each problem and documenting the code & solution. So today I volunteer my time over Zoom with middle & high school students who are thinking about a STEM career in industry … visit and become a volunteer!

I do have one contract job I’m still thinking about. It’s regarding oil refineries and process control. Know anyone in such arenas? The problem is huge and may require building larger computers … about 10 times today’s, faster and with more memory. It involves solving around 400 PDEs at the same time … nesting solvers. The problem may require about 20 hours per run, but must be done in under 6 hours. Any interest? It would increase an oil refinery’s productivity by 10+%.