calyxir/calyx

Parse error for large constants

I'm evaluating a Calyx binding I created for posits. While testing the accuracy of 128-bit posits, I noticed that Calyx can't parse literals larger than the 64-bit limit.

Error

=====STDERR=====
Error: Failed to parse buffer:   --> 20:28
   |
20 |     const0 = std_const(128,82412135738664784120036037737381363712);
   |                            ^------------------------------------^
   |
   = Expected valid bitwidth

=====STDOUT=====

The large number 82412135738664784120036037737381363712 is the 128-bit unsigned integer representation of 0.5 as a 128-bit posit.
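
For context, that literal is 0x3E followed by 120 zero bits, which decodes to 0.5 assuming the usual es = 4 convention for 128-bit posits. A quick standalone Rust check (the value still fits in a u128):

fn main() {
    // The literal from the error message fits in a u128.
    let v: u128 = 82412135738664784120036037737381363712;
    // 0x3E << 120: sign 0, regime 01 (k = -1), exponent 1111 (e = 15), zero
    // fraction, so the value is 2^15 * (2^16)^-1 = 2^-1 = 0.5.
    assert_eq!(v, 0x3E_u128 << 120);
    println!("{v:#034x}"); // 0x3e000000000000000000000000000000
}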

Source

fn bitwidth(input: Node) -> ParseResult<u64> {
    input
        .as_str()
        .parse::<u64>()
        .map_err(|_| input.error("Expected valid bitwidth"))
}
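
For reference, the underlying failure is plain u64 overflow: parse::<u64>() rejects anything above 2^64 - 1 = 18446744073709551615, and map_err collapses every failure into the same "Expected valid bitwidth" message, which is presumably why the error above points at the constant's value even though the width of 128 itself is fine. A minimal standalone reproduction:

fn main() {
    let s = "82412135738664784120036037737381363712";
    // Overflows u64, so FromStr reports a PosOverflow error.
    println!("{:?}", s.parse::<u64>()); // Err(ParseIntError { kind: PosOverflow })
    assert!(s.parse::<u64>().is_err());
}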

@rachitnigam @sampsyo I mentioned that this would be a problem in #1969. Solving it would likely require some sort of bigint library, which would probably cause a cascade of code changes since the types would have to be updated everywhere. Pulling from this benchmark analysis of bigint libraries (https://github.com/tczajka/bigint-benchmark-rs), something like ibig or dashu would probably be the most straightforward approach, but I'm curious if people have thoughts.
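
To make that concrete, here is a rough sketch (not actual Calyx code) of what a bigint-backed value parser could look like with ibig; constant_value is a made-up name, and Node / ParseResult are assumed to be the same pest_consume aliases the bitwidth function above uses. dashu or num-bigint would look essentially the same.

use ibig::UBig;

// Hypothetical bigint-backed counterpart to bitwidth: the width stays a u64,
// but the constant's value is parsed into an arbitrary-precision UBig.
fn constant_value(input: Node) -> ParseResult<UBig> {
    input
        .as_str()
        .parse::<UBig>()
        .map_err(|_| input.error("Expected valid constant value"))
}

The bitwidth itself can stay a u64; only the value type needs to become arbitrary precision, which is where the downstream type changes would come from.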

Just want to add that these changes would also benefit the allo -> amc -> calyx flow significantly. The previous workaround works for now, but true >64-bit integer support would allow for a lot of interesting test cases. It's certainly not an easy change, though, so I understand the hesitation to take it on now...