One reason for this is that up until f90 there was only one standard integer type, and even if programmers went beyond the standard and declared variables as `integer*2`, `integer*8`, and so on, there was no way to specify constants of those types. F90 introduced the KIND system, which addressed the constant notation issue, but there was still only one integer KIND required by the standard (the default).

Now (as of f2008, I think) the standard requires an integer KIND with a range of at least 18 decimal digits. That could be the default KIND, but on all the compilers I use, INT32 is the default (unless a compiler option changes that) and INT64 is the extended KIND that meets the 18-digit requirement. So there are now, more or less, requirements for those two KINDs, plus a nice KIND system that allows interconversions and lets constants of each kind be specified.

I am finding more and more cases where nondefault integer KINDs are used and where I want to control exactly when the conversions are done, so I'm using constants with KIND qualifiers more now than, say, 10 or 20 years ago.
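A minimal sketch of what this looks like in practice, assuming a compiler that supplies the `INT32` and `INT64` kind constants from `iso_fortran_env` (the program name and variables here are just for illustration):

```fortran
program kinds_demo
   use iso_fortran_env, only: int32, int64
   implicit none
   integer(int32) :: n32
   integer(int64) :: n64

   ! A literal with a KIND suffix has that kind from the outset,
   ! so it can hold values that a default-kind constant could not.
   n64 = 10000000000_int64     ! too big for int32; the suffix is essential
   n32 = 2147483647_int32      ! largest value for a 32-bit integer

   ! The KIND system also lets you say exactly when a conversion
   ! happens: widen first with int(), then do the arithmetic in int64.
   n64 = int(n32, kind=int64) * 2_int64

   print *, n32, n64
end program kinds_demo
```

Without the `_int64` suffixes, a literal like `10000000000` would be a default-kind constant and a standard-conforming compiler could reject it, which is exactly the pre-f90 notation problem described above.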