[REQUEST] C-styled int/real literal distinction
nazjun opened this issue · 1 comment
nazjun commented
I requested this informally before, but I would still love to see this as a feature once compatibility is no longer an issue (and personally, I don't expect it to be that big a deal in the end).
Basically, do away with the `i` suffix and have the presence of a decimal point be what distinguishes integers from real numbers.
This would also encompass hexadecimal and binary literals, which... don't really have a reason not to be int types anyway?
Natashi commented
Added support for explicit type declarations in commit 03c028c