Casting an enum that is part of a struct to bits and using it in an expression causes an internal error in a DSLX test
Closed this issue · 2 comments
Describe the bug
When an enum field of a struct is cast to bits and used in an expression inside a DSLX test, an internal error is raised.
To Reproduce
- Write the following code:
enum FOO_ENUM : u2 {
    A = 0,
    B = 1,
    C = 2,
    D = 3,
}

struct FooState {
    foo: FOO_ENUM,
    a: u32,
}

#[test_proc]
proc FooTest {
    terminator: chan<bool> out;

    init { zero!<FooState>() }

    config(terminator: chan<bool> out) {
        (terminator,)
    }

    next(state: FooState) {
        let a = (state.foo as u32) + u32:1;
        // any of the following works:
        // let a = (state.foo as u2 as u32) + u32:1;
        // let a = (state.foo as u32 as u32) + u32:1;
        // let a = (FOO_ENUM::A as u32) + u32:1;
        // let a = (state.foo as u32) / u32:3;
        // let a = state.foo as u32;
        // let a = (state.foo as u32) << u32:1;
        // let a = u32:1 << (state.foo as u32);
        send(join(), terminator, true);
        FooState { a: a, ..state }
    }
}
- Create a DSLX test target and run it:

xls_dslx_library(
    name = "foo_dslx",
    srcs = ["foo.x"],
)

xls_dslx_test(
    name = "foo_dslx_test",
    library = ":foo_dslx",
)

bazel run -- //:foo_dslx_test --logtostderr
- Observe the error:
Executing tests from //:foo_dslx_test
-----------------------------------------------------------------------------
[ RUN UNITTEST ] FooTest
E0812 07:17:03.566510 417226 interp_value.cc:506] INTERNAL: XLS_RET_CHECK failure (xls/dslx/interp_value.cc:506) GetBitCount().value() == other.GetBitCount().value() (2 vs. 32)
0x5888b513f26a: xabsl::StatusBuilder::CreateStatusAndConditionallyLog()
0x5888b3200769: absl::lts_20240116::StatusOr<>::StatusOr<>()
0x5888b36404da: xls::dslx::InterpValue::Add()
0x5888b32abb07: std::__1::__function::__func<>::operator()()
0x5888b329c9a7: xls::dslx::BytecodeInterpreter::EvalBinop()
0x5888b328b959: xls::dslx::BytecodeInterpreter::EvalAdd()
0x5888b3289c81: xls::dslx::BytecodeInterpreter::EvalNextInstruction()
0x5888b3288814: xls::dslx::BytecodeInterpreter::Run()
0x5888b32a7ddf: xls::dslx::ProcInstance::Run()
0x5888b317126f: xls::dslx::ParseAndTest()
0x5888b31115f5: main
0x7bf42e629d90: [unknown]
E0812 07:17:03.566568 417226 run_routines.cc:107] Internal error: INTERNAL: XLS_RET_CHECK failure (xls/dslx/interp_value.cc:506) GetBitCount().value() == other.GetBitCount().value() (2 vs. 32)
[ FAILED ] FooTest: internal error: INTERNAL: XLS_RET_CHECK failure (xls/dslx/interp_value.cc:506) GetBitCount().value() == other.GetBitCount().value() (2 vs. 32)
[===============] 1 test(s) ran; 1 failed; 0 skipped.
Expected behavior
No error is raised.
Note
For multiplication and subtraction, the error is different:
E0812 07:17:38.682057 417582 run_routines.cc:107] Internal error: INVALID_ARGUMENT: Cannot mul different width values: lhs 2 bits, rhs 32 bits
E0812 07:18:00.680689 417934 run_routines.cc:107] Internal error: INVALID_ARGUMENT: Interpreter value sub requires lhs and rhs to have same bit count; got 2 vs 32
Environment (this can be helpful for troubleshooting):
- OS: Arch Linux x86_64
- Kernel version: 6.10.3-arch1-2
- XLS hash: 3aadb2f6c414c8958bac5f6bef005168321d8435
Consider the following:
enum MyEnum : s3 {
    VAL_0 = 0,
}

struct EnumInStruct {
    e: MyEnum,
}

fn cast_enum_to_bits() -> s32 {
    let foo = EnumInStruct { e: MyEnum::VAL_0 };
    let a = (foo.e as s32) - s32:1;
    a
}
The bytecode interpreter creates foo as a tuple containing s3:0 -- the enum has already been evaluated down to its underlying bits type.
xls/xls/dslx/bytecode/bytecode_emitter.cc, line 1460 in aae7ee2
bytecode_interpreter.cc:137] Bytecode: literal s3:0 @ test.x:10:38-10:45
bytecode_interpreter.cc:138] PC: 0 : literal s3:0 @ test.x:10:38-10:45
bytecode_interpreter.cc:140] - stack depth 0 []
interpreter_stack.h:75] PushFormattedValue(s3:0)
bytecode_interpreter.cc:144] - stack depth 1 [s3:0]
bytecode_interpreter.cc:137] Bytecode: create_tuple 1 @ test.x:10:26-10:48
bytecode_interpreter.cc:138] PC: 0x1 : create_tuple 1 @ test.x:10:26-10:48
bytecode_interpreter.cc:140] - stack depth 1 [s3:0]
interpreter_stack.h:70] Push((s3:0))
bytecode_interpreter.cc:144] - stack depth 1 [(s3:0)]
bytecode_interpreter.cc:137] Bytecode: store 0 @ test.x:10:7-10:10
And there is no issue with the interpretation.
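For context, the failing XLS_RET_CHECK is a bit-width equality check performed before the add. Below is a minimal Python sketch of that check (the class and method names are hypothetical illustrations, not the actual XLS API in interp_value.cc):

```python
# Hypothetical sketch of the width check that trips in InterpValue::Add.
# Names and structure are illustrative only, not the real XLS codebase.
class InterpValue:
    def __init__(self, bit_count: int, value: int):
        self.bit_count = bit_count
        self.value = value

    def add(self, other: "InterpValue") -> "InterpValue":
        # Mirrors the XLS_RET_CHECK: both operands must have the same
        # bit count before the addition is performed.
        if self.bit_count != other.bit_count:
            raise RuntimeError(
                f"GetBitCount().value() == other.GetBitCount().value() "
                f"({self.bit_count} vs. {other.bit_count})")
        mask = (1 << self.bit_count) - 1
        return InterpValue(self.bit_count, (self.value + other.value) & mask)

# The failing case from this issue: the enum literal keeps its 2-bit
# width because the cast never widened it, while the other operand is
# a 32-bit u32.
lhs = InterpValue(2, 0)   # state.foo, still 2 bits wide
rhs = InterpValue(32, 1)  # u32:1
try:
    lhs.add(rhs)
except RuntimeError as e:
    print(e)  # width mismatch: 2 vs. 32
```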
In contrast, using the ZeroMacro with the following:
fn cast_enum_to_bits() -> s32 {
    let foo = zero!<EnumInStruct>();
    let a = (foo.e as s32) - s32:1;
    a
}
results in the error from this issue, because the literal is created with the enum type:
typecheck_function.cc:137] Typechecking fn: cast_enum_to_bits
constexpr_evaluator.cc:240] ConstexprEvaluator::HandleBinop : (foo.e as s32) - s32:1
constexpr_evaluator.cc:278] ConstexprEvaluator::HandleCast : (foo.e as s32)
bytecode_interpreter.cc:137] Bytecode: literal (MyEnum:0) @ test.x:11:15-11:17
bytecode_interpreter.cc:138] PC: 0 : literal (MyEnum:0) @ test.x:11:15-11:17
bytecode_interpreter.cc:140] - stack depth 0 []
interpreter_stack.h:75] PushFormattedValue((MyEnum:0))
bytecode_interpreter.cc:144] - stack depth 1 [(MyEnum:0)]
There are two solutions to this:
- Literals in the bytecode interpreter should be created consistently, regardless of whether they came from a ZeroMacro or not
Ref: xls/xls/dslx/bytecode/bytecode_emitter.cc, line 950 in aae7ee2
- Regardless of the above, casts from enum values should correctly extend to the target bit width
Ref: xls/xls/dslx/bytecode/bytecode_interpreter.cc, line 576 in aae7ee2
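To illustrate the second point, the widening a cast from an enum should perform can be sketched as follows. This is a hypothetical Python model of the intended semantics (sign-extend when the enum's underlying type is signed, like s3; zero-extend when it is unsigned, like u2), not code from the XLS interpreter:

```python
# Hypothetical model of enum-to-bits cast semantics: the value must be
# widened to the target bit count, sign-extending iff the enum's
# underlying type is signed. Function name and signature are illustrative.
def cast_enum_to_bits(value: int, from_width: int, is_signed: bool,
                      to_width: int) -> int:
    """Returns `value` reinterpreted at `to_width` bits (two's complement)."""
    mask = (1 << from_width) - 1
    value &= mask
    if is_signed and value >> (from_width - 1):
        # Negative in two's complement: sign-extend.
        value -= 1 << from_width
    return value & ((1 << to_width) - 1)

# u2 enum value C = 2 cast to u32: zero-extended, stays 2.
print(cast_enum_to_bits(2, 2, False, 32))
# s3 enum value -1 (0b111) cast to s32: sign-extended to 0xFFFFFFFF.
print(hex(cast_enum_to_bits(0b111, 3, True, 32)))
```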