I remember reading somewhere that the standard only requires a default kind and one double-precision kind for
the real type (though after reading a page from Metcalf's book today, I am not even sure this is correct). Assuming this is the case for
the real type, is there also a requirement to support a default integer kind plus at least one integer kind with a greater range than the default?
If so, what would be the easiest way to infer the kind value for the one higher than the default (e.g.,
int64 given a default of int32)?
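One portable way to do this, if I understand the intrinsics correctly, is `selected_int_kind`: it requests a kind by decimal range, and `range(0)` reports the range of the default integer, so asking for `range(0) + 1` yields a kind strictly larger than the default (or `-1` if the compiler has none). A minimal sketch (the program name is my own invention):

```fortran
program higher_int_kind
  implicit none
  ! Decimal range of the default integer kind (9 for a 32-bit default).
  integer, parameter :: def_range = range(0)
  ! First kind able to hold at least one more decimal digit than the
  ! default; selected_int_kind returns -1 if no such kind exists.
  integer, parameter :: big = selected_int_kind(def_range + 1)
  print *, "default kind:", kind(0), " range:", def_range
  print *, "bigger kind: ", big
end program higher_int_kind
```

On a compiler with a 32-bit default integer this typically selects the 64-bit kind, but I don't think the standard guarantees which kind (if any) comes back.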