Automatic differentiation of Fortran code, opinions?

The FortranCalculus (FC) compiler is available and free. Download it at my website, goal-driven.net. Its roots go back to NASA’s Apollo space program that got us to the moon and back. The first such language, developed around 1960 by TRW, was named Slang. FC has 10+ solvers stored in a library that a user can call by name. These solvers use Automatic Differentiation (AD) to calculate the Jacobian and Hessian matrices. For more, please read my profile.

I’m 77 years old and ready to retire. Anyone interested in taking on my website? It hosts 5+ apps showing the power of FC, plus the FC compiler & manuals. The average customer downloads ~4 apps per day, and over the last year (2022) downloads per customer ranged from 50 to 90! China is the number one country of interest, with 5 to 30 times the number of USA customers. They must be involved in optimization.

2 Likes

There have been 3 AD-based compilers that I’m aware of. NASA’s Apollo space program needed faster compilers as its engineers kept changing their minds (see the movie Hidden Figures?). So NASA hired TRW to write a new type of compiler in which solvers were stored in a library and required no development. Around 1960, TRW developed the first AD-based compiler, Slang, which cut problem-solving time from years to days. The first commercial AD compiler, PROSE, became available in 1974 on CDC computers. Today there is a version named FortranCalculus that became available around 1990. All three compilers handle optimization problems nicely.

1 Like

Slang, PROSE, and FortranCalculus are AD-based compilers.

1 Like

Hi Phil,

welcome to Discourse! I couldn’t find anything about AD-supporting compilers on the NASA Technical Reports Server (https://ntrs.nasa.gov/). What do the initials TRW stand for?

Have you compared FortranCalculus with any AD frameworks in languages that are popular today, like Python, MATLAB or Julia? Do you still operate a company based around your (extended) Fortran compiler?

1 Like

The only option for backward (reverse) mode that I am aware of is Tapenade, which I can recommend highly.

I see from the article (on arXiv: https://arxiv.org/pdf/1502.05767.pdf), page 24 of 43, that ADIFOR and NAGWare also work in reverse mode. In addition, a complete list of AD implementations in different languages is given there.

1 Like

TRW = Thompson Ramo Wooldridge. Wikipedia will tell you more about it.

2 Likes

Thank you, I’ve found a paper from them:

The Q approach to problem solving (1969) | ACM

According to Google Scholar this paper only has 3 citations! Quite a shame. Too many new ideas at once.

1 Like

Hi @ivanpribec, TRW Systems, Inc. is the full name, I believe, but sometimes I see it as just TRW, Inc. I believe the main developer was Joseph Thames, who lived in the L.A., CA area, if that helps. Joe gave a SIAM conference presentation on PROSE in 1974 in Palo Alto (?), CA. The conference heads couldn’t believe what he was saying and wouldn’t publish his presentation. Joe sent me a copy, but I’m not sure where it is.

Have you compared FortranCalculus with any AD frameworks in languages that are popular today, like Python, MATLAB or Julia?

No, we are speaking about different worlds. FC / PROSE solve problems that the others have never thought of, for example nesting of solvers, implicit problems, etc. See my casebook for more. Most problems are solved in less than 50 lines of code. Try it, you’ll love it. Here is my Computer-Aided Designs presentation to high school students that may give you an idea of the FC language. It’s about 30 minutes long, with 8 minutes of questions.

Do you still operate a company based around your (extended) Fortran compiler?

My company, Optimal Designs Enterprise (ODE), has just me. It took me some 22 years just to write my casebook: solving each problem and documenting the code & solution. So today I volunteer my time over Zoom with middle & high school students who are thinking about a STEM career in industry … visit nepris.com and become a volunteer!

I do have one contract job I’m still thinking about. It’s regarding oil refineries and process control. Know anyone in such arenas? The problem is huge and may require building larger computers … about 10 times today’s computers, both faster and with more memory. It involves solving around 400 PDEs at the same time … nesting solvers. The problem may require about 20 hours per run, but must be done in under 6 hours. Any interest? It would increase an oil refinery’s productivity by 10+%.

I think automatic differentiation should be one of the fundamentals of a heavily numeric language like Fortran. Could this be implemented in stdlib? @gardhor’s project seems like one of the more complete and usable projects from what I have seen (there are also some other dual-number modules). But I think autodiff deserves a full representation in stdlib, including both forward and backward modes.
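
For readers less familiar with the forward mode, here is a minimal, hand-written sketch of the dual-number approach such a stdlib module could take (all names here, like dual_t, are invented for illustration and not taken from any existing project):

    module dual_sketch_m
       ! Forward-mode AD via dual numbers: every value carries its derivative,
       ! and overloaded operators propagate both together.
       implicit none
       integer, parameter :: wp = kind(1.0d0)

       type :: dual_t
          real(wp) :: f = 0.0_wp   ! value
          real(wp) :: d = 0.0_wp   ! derivative w.r.t. the seeded input
       end type dual_t

       interface operator(+)
          module procedure add_dd
       end interface
       interface operator(*)
          module procedure mul_dd
       end interface

    contains

       elemental function add_dd(a, b) result(c)
          type(dual_t), intent(in) :: a, b
          type(dual_t) :: c
          c%f = a%f + b%f
          c%d = a%d + b%d
       end function add_dd

       elemental function mul_dd(a, b) result(c)
          type(dual_t), intent(in) :: a, b
          type(dual_t) :: c
          c%f = a%f * b%f
          c%d = a%d * b%f + a%f * b%d   ! product rule
       end function mul_dd

    end module dual_sketch_m

    program dual_demo
       use dual_sketch_m
       implicit none
       type(dual_t) :: x, y
       x = dual_t(3.0_wp, 1.0_wp)   ! seed dx/dx = 1
       y = x*x + x                  ! y = x**2 + x
       print *, y%f, y%d            ! 12.0 and 7.0 (= 2x + 1 at x = 3)
    end program dual_demo

A backward (reverse) mode needs more machinery (a tape recording the operations), which is part of why a stdlib design discussion would be worthwhile.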

3 Likes

Thanks @fedebenelli for your nice comment!
Porting an existing AD library into the Fortran stdlib should be relatively easy, although it will be a lot of work. Personally, I don’t know how to start.
Right now, my library (AD_dnSVM) can be used only in forward mode.

1 Like

I have made a fork of Joshua Hodson’s DNAD where I have refactored it into a set of fypp macros/templates. I have also extended it to support hyper-dual numbers for computing Hessians, currently only for a small subset of the functions/operators that the normal dual number type supports. The approach here is forward-mode autodiff via operator overloading.
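
As a rough, hand-written illustration of the hyper-dual idea (this is not code from my fork, and all names such as hdual_t are made up): a hyper-dual number carries the value, two first-order perturbations and one mixed second-order term, so one forward evaluation seeded along directions i and j yields the Hessian entry H(i,j) alongside two gradient entries.

    module hdual_sketch_m
       ! Illustrative hyper-dual number: value, two first-order parts and one
       ! mixed second-order part. Seeding d1 along x_i and d2 along x_j makes
       ! %d12 come out as d2f/(dx_i dx_j) after evaluating f.
       implicit none
       integer, parameter :: wp = kind(1.0d0)

       type :: hdual_t
          real(wp) :: f   = 0.0_wp
          real(wp) :: d1  = 0.0_wp
          real(wp) :: d2  = 0.0_wp
          real(wp) :: d12 = 0.0_wp
       end type hdual_t

       interface operator(*)
          module procedure mul_hh
       end interface

    contains

       elemental function mul_hh(a, b) result(c)
          type(hdual_t), intent(in) :: a, b
          type(hdual_t) :: c
          c%f   = a%f*b%f
          c%d1  = a%d1*b%f + a%f*b%d1
          c%d2  = a%d2*b%f + a%f*b%d2
          c%d12 = a%d12*b%f + a%d1*b%d2 + a%d2*b%d1 + a%f*b%d12
       end function mul_hh

    end module hdual_sketch_m

    program hdual_demo
       use hdual_sketch_m
       implicit none
       type(hdual_t) :: x, y, f
       x = hdual_t(2.0_wp, 1.0_wp, 0.0_wp, 0.0_wp)   ! seed direction 1 along x
       y = hdual_t(5.0_wp, 0.0_wp, 1.0_wp, 0.0_wp)   ! seed direction 2 along y
       f = x*y*y                                     ! f(x,y) = x*y**2
       print *, f%f, f%d1, f%d2, f%d12               ! 50, 25 (=y**2), 20 (=2xy), 10 (=2y)
    end program hdual_demo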

Using meta-programming for this has some advantages:

  • You can create several dual and/or hyper-dual types and give them names reflecting the design variables that their derivative is taken with respect to! Then the compiler will arrest you if you try to use them later in incompatible ways (see the fypp sketch after this list).
  • You can “inject” the function overloads into the same module where they are used, so that they can be inlined without interprocedural optimization.
  • You can turn on/off “stringent” evaluation at the code generation level.
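
As a purely hypothetical illustration of the first point, a fypp template along these lines can stamp out one dual type per named design vector (the names and layout below are invented for this post, not the actual macros in my fork):

    #! Hypothetical fypp sketch: one dual type per design vector, so mixing
    #! "x" duals with "y" duals becomes a compile-time type error.
    #:set DESIGN_VECTORS = [('x', 3), ('y', 2)]
    #:for dv in DESIGN_VECTORS
    type :: dual_${dv[0]}$_t
       real(wp) :: f               ! function value
       real(wp) :: g(${dv[1]}$)    ! gradient w.r.t. design vector "${dv[0]}$"
    end type dual_${dv[0]}$_t
    #:endfor

The same loop can then generate the matching operator overloads for each generated type.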

I am currently generating the code directly into the example, src and test folders and have included the generated files in the repo, so the examples and tests can be built and run directly (for instance fpm run --example main_example_hdual or fpm test) without having to run the code generator (fypp) first.

test/test_hdual_chain_mod.f90 (generated from test/test_hdual_chain_mod.fypp) shows an example of using two different dual number types together, one with design vector x and another with design vector y. A variable fy of type hdual_y_t can be converted into fx of type hdual_x_t by calling fx = chain_duals(fy, yx), where yx has type hdual_x_t and represents the y design vector.
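
In case it helps to see what the chaining amounts to mathematically, here is a tiny self-contained sketch for first derivatives only (names like chain_gradient are invented; this is not the chain_duals implementation, which also handles the second-order terms):

    module chain_sketch_m
       ! The chain rule that chaining applies for gradients:
       !   df/dx_i = sum_k (df/dy_k) * (dy_k/dx_i)
       implicit none
       integer, parameter :: wp = kind(1.0d0)
    contains
       pure function chain_gradient(dfdy, dydx) result(dfdx)
          real(wp), intent(in) :: dfdy(:)     ! df/dy_k,   size ny
          real(wp), intent(in) :: dydx(:,:)   ! dy_k/dx_i, shape (ny, nx)
          real(wp) :: dfdx(size(dydx, 2))     ! df/dx_i
          dfdx = matmul(dfdy, dydx)
       end function chain_gradient
    end module chain_sketch_m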

Currently, the size of the design vector(s) needs to be known at compile time. It shouldn’t be too difficult to lift this restriction and allow allocatable gradients and Hessians. This is probably something that should be controlled at the code-generation level as well!

3 Likes