As a side effect of the fact that nagfor is not available on CI runners, new features of nagfor are not well tested. For example:
Note that half-precision real has been supported at least since nagfor 7.0, released in 2021, three years ago. But I suspect no one had ever used it in a real project before I did with PRIMA. Otherwise, the compiler would not still contain bugs like evaluating -1.0_REAL16 * 1.0_REAL16 to NaN or 0 after supporting REAL16 for three years.
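A minimal reproducer for this kind of bug can be sketched as follows. The program and the kind-selection call are my own illustration, not taken from PRIMA: REAL16 is obtained here via selected_real_kind, which returns a negative value (and the program fails to compile) on compilers that do not provide half precision.

```fortran
! Hypothetical reproducer for the -1.0_REAL16 * 1.0_REAL16 bug.
! Request a half-precision kind: about 3 decimal digits of precision
! and a decimal exponent range of at least 4, matching IEEE binary16.
program real16_test
implicit none
integer, parameter :: REAL16 = selected_real_kind(p=3, r=4)
real(REAL16) :: x

x = -1.0_REAL16 * 1.0_REAL16
! A correct compiler prints -1.0 here; the buggy nagfor versions
! described above produced NaN or 0 instead.
print *, x
end program real16_test
```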
Why has no one tested it before? Because people are limited by the physical machines available in their offices/labs, they have no incentive to do so until half precision is truly needed in their production code.
However, if the compiler were available on CI runners in the cloud, many people would have added a few lines to their workflow files to test half precision, just for fun or out of curiosity. Then ridiculous bugs like the one shown above would have no chance of surviving for three years.
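Such a workflow addition could look like the sketch below. This is a hypothetical GitHub Actions fragment: the job name, file name, and the assumption that nagfor is on PATH are all my own, since (as argued above) nagfor is not actually available on hosted runners today.

```yaml
# Hypothetical CI job: compile and run a half-precision smoke test.
# Assumes a runner image where nagfor is installed and licensed.
jobs:
  test-half-precision:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Compile and run the REAL16 test
        run: |
          nagfor -o real16_test real16_test.f90
          ./real16_test
```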