We now have --infer which infers the type from the RHS:
/tmp$ lfortran --infer
Interactive Fortran. Experimental prototype, not ready for end users.
LFortran version: 0.63.0-287-g47dbbe6d8e
* Use Ctrl-D to exit
* Use Enter to submit
* Use Alt-Enter or Ctrl-N to make a new line
- Editing (Keys: Left, Right, Home, End, Backspace, Delete)
- History (Keys: Up, Down)
>>> a = 5
>>> b = 6.3
>>> a
5
>>> b
6.3000002
Currently we treat 6.3 as single precision, but we should possibly switch to double. The question is how to handle this for existing codes: I suspect it will be confusing if the interpreted behavior differs from the ahead-of-time-compiled behavior. Right now LFortran's behavior in interpreted mode (especially without --infer) is identical to how it behaves when compiling your existing code.
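To illustrate why the literal's kind matters (a minimal sketch, assuming the usual mapping of real32/real64 to single/double precision):

```fortran
program literal_kinds
   use iso_fortran_env, only: sp => real32, dp => real64
   real(sp) :: a
   real(dp) :: b
   a = 6.3   ! default (single-precision) literal: nearest float is ~6.3000002
   b = 6.3   ! the same single-precision literal, merely widened on assignment,
             ! so the single-precision rounding error is carried into b
   print *, a, b
end program literal_kinds
```

The widened value in b is not the same as the double-precision literal 6.3_dp would give, which is exactly the surprise the REPL output above shows.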
I want to warn by default when the precision is not explicitly specified, and this is related to that. I think we can achieve robust behavior with such warnings.
In LFortran, is there an option to specify the default real kind? In gfortran, one can do that with -fdefault-real-8 (plus -fdefault-double-8 to prevent the accompanying promotion of 8-byte double precision to 16 bytes).
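For comparison, the explicit-kind style that such a warning would nudge users toward sidesteps the default-kind question entirely (a sketch using the standard iso_fortran_env kinds):

```fortran
program explicit_kinds
   use iso_fortran_env, only: dp => real64
   real(dp) :: b
   b = 6.3_dp   ! kind suffix: the literal itself is double precision,
                ! so the value does not depend on any default-real flag
   print *, b
end program explicit_kinds
```

Code written this way behaves the same under any default-real-kind setting, which is what makes a warning on unsuffixed literals viable for existing codes.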