Using the LuaJIT extension syntax for ULL numbers causes a parse error (despite already being supported?)
Duckwhale opened this issue · 1 comment
This is a follow-up (of sorts) to Kampfkarren/full-moon#255
As I was looking into adopting selene for use with my project, I discovered that it fails to parse the following LuaJIT-supported syntax:
local wgpu = {
WGPU_ARRAY_LAYER_COUNT_UNDEFINED = 0xffffffffULL,
WGPU_COPY_STRIDE_UNDEFINED = 0xffffffffULL,
WGPU_LIMIT_U32_UNDEFINED = 0xffffffffULL,
WGPU_LIMIT_U64_UNDEFINED = 0xffffffffffffffffULL,
-- ... rest omitted for brevity
}
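For context, under LuaJIT the `ULL`/`LL` suffixes produce boxed 64-bit cdata integers rather than plain doubles. A minimal illustration (requires LuaJIT; PUC Lua 5.1/5.2 rejects these literals at parse time, which is exactly what selene is doing here):

```lua
-- Requires LuaJIT; plain Lua 5.1/5.2 fails to parse these literals.
local u32_max = 0xffffffffULL         -- uint64_t cdata holding 2^32 - 1
local u64_max = 0xffffffffffffffffULL -- uint64_t cdata holding 2^64 - 1

-- LuaJIT formats 64-bit cdata integers with their suffix.
print(tostring(u32_max)) -- "4294967295ULL"
print(type(u32_max))     -- "cdata"

-- Arithmetic stays in 64-bit integer range and wraps like C unsigned
-- arithmetic, instead of losing precision the way a double would.
assert(u64_max + 1 == 0ULL)
```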
Running selene on this with `base: lua51` in the config fails with a parse error:
error[parse_error]: unexpected token `{`
┌─ Runtime/Bindings/webgpu.lua:3:14
│
3 │ local wgpu = {
│ ^ expected expression
I've tried changing the base to `lua51+lua52` and to just `lua52` as well, to no effect. This was using the latest git checkout of selene.
It seems that selene uses a recent version of the parser (according to its Cargo file), which should include support for this extension. Is there anything in particular I need to do to enable it? The PR mentioned `lua52`, so that's what I tried. FWIW, StyLua takes no issue with the code, and it uses the same parser, so I'm a bit confused as to why selene choked on the file :)
Related to #224. A quick workaround would be to fork selene with full-moon's lua52 and build locally.
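A sketch of that workaround, assuming full-moon exposes the Lua 5.2 extensions behind a `lua52` cargo feature (check full-moon's Cargo.toml for the exact feature name and the version selene currently pins):

```toml
# In a local checkout of selene's Cargo.toml, enable full-moon's
# lua52 feature (version number here is illustrative only; match
# whatever selene already depends on).
[dependencies]
full-moon = { version = "0.15", features = ["lua52"] }
```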