large integer types should also be typed as numbers
Opened this issue · 7 comments
Currently, `i64`, `u64`, `i128`, and `u128` are defined to export to `bigint`, such that:

```rust
#[derive(TS)]
pub struct Foo {
    foo: u64,
}
```

creates a TS declaration like:

```ts
export interface Foo { foo: bigint }
```
However, `foo` may be a `number` instead of a `bigint` (e.g. if using `JSON.parse`), so the interface should actually be:

```ts
export interface Foo { foo: bigint | number }
```
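With the widened type, the consumer can narrow explicitly at the call site. A minimal sketch (the `asBigInt` helper is hypothetical, not part of ts-rs):

```typescript
interface Foo { foo: bigint | number }

// Hypothetical helper: normalize either representation to bigint.
function asBigInt(value: bigint | number): bigint {
  return typeof value === "bigint" ? value : BigInt(value);
}

// JSON.parse produces a plain number here, matching the `number` arm.
const parsed: Foo = JSON.parse('{"foo": 5}');
console.log(asBigInt(parsed.foo) === 5n); // true
```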
Worth noting that wasm-bindgen converts `u64` arguments/returns to `BigInt` due to the limitations of JS numbers and the WASM spec around larger integers.
> However, `foo` may be a `number` instead of a `bigint` (e.g. if using `JSON.parse`), so the interface should actually be:
It's worth noting that if you have an `i64`/`u64`/etc. and deserialize it to a `number` in JavaScript, you may get subtle errors, as a `number` cannot represent everything in a 64-bit integer. I think `bigint` is the more correct option here.
I understand that 64+-bit numbers are usually converted to `bigint` when sent to JS across the WASM boundary, but when serializing using e.g. `serde_json`, the value is not specified as a bigint (the number isn't written as `5n`, it's written as `5`, so `JSON.parse` selects `number` for the deserialized value).

So when using `ts-rs` for e.g. an HTTP API, the generated TypeScript bindings are wrong: they assume that the value is always a `bigint`, when in fact it can be a `bigint` or a `number`. Since JS doesn't automatically coerce between the two types and they have different properties (e.g. one is safe to use in cryptography while the other isn't), it should be up to the consumer of the API to decide whether the value should be treated as a `bigint` or a `number`.
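The serialization point above can be checked directly in a JS runtime. A minimal demonstration:

```typescript
// serde_json writes a u64 as a plain JSON number, e.g. `{"foo": 5}`.
// JSON.parse therefore yields a JS number, never a bigint:
const foo: unknown = JSON.parse('{"foo": 5}').foo;
console.log(typeof foo); // "number"

// A bigint literal is not valid JSON, so `5n` can never appear on the wire:
let invalid = false;
try {
  JSON.parse('{"foo": 5n}');
} catch {
  invalid = true; // SyntaxError: bigint literals are rejected
}
console.log(invalid); // true
```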
Sending a `u64` over JSON to JavaScript is not valid in general. Take a look at this for example:
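For instance, a `u64` value just above 2^53 is silently rounded when it comes through `JSON.parse` as a `number` (a minimal demonstration using only built-in `JSON.parse`):

```typescript
// 2^53 + 1 fits in a u64 but not in an IEEE 754 double, which is what
// JSON.parse uses for every JSON number.
const wire = "9007199254740993"; // what serde_json would emit for this u64
const parsedValue = JSON.parse(wire);

// The value is silently rounded to the nearest representable double:
console.log(parsedValue === 9007199254740992); // true
console.log(Number.isSafeInteger(parsedValue)); // false

// Parsing the same text as a bigint is lossless:
console.log(BigInt(wire) === 9007199254740993n); // true
```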
Therefore, I would say that the only good use case for sending a `u64` to JavaScript is via a path that makes it a `bigint`, not a `number`.

IMO, making a `u64` be implicitly converted to a `number` in a TypeScript definition makes it possible for obscure bugs to appear without you noticing. I would recommend only sending `i32`, `u32`, `f32`, or `f64` to JavaScript as numbers, since larger number types cannot be represented.
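The safe ranges behind that recommendation can be checked against `Number.MAX_SAFE_INTEGER` (a quick sketch):

```typescript
// Integers are exact in a JS number only up to 2^53 - 1.
console.log(Number.MAX_SAFE_INTEGER === 2 ** 53 - 1); // true

// Every i32/u32 value fits within the safe range:
console.log(Number.isSafeInteger(2 ** 32 - 1)); // true  (u32::MAX)
console.log(Number.isSafeInteger(-(2 ** 31)));  // true  (i32::MIN)

// The u64 range does not:
console.log(Number.isSafeInteger(2 ** 63)); // false
```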
First, I understand the reasoning behind being dogmatic about `u64` to `bigint`, and I can see it being the default.

However, I think there are some valid use cases where it is acceptable to treat a `u64` as a JS `number`, as 2^53 can be practically unreachable for many use cases.

Having something like a `#[ts_as_number]` attribute would avoid unnecessary JSON/TS bigint gymnastics, rather than being stuck below the `u32` limit.
2022-09-08 - Apologies! I did not realize we could annotate a property with `#[ts(type = "number")]`. I should have read the docs fully before commenting. This is perfect. I think the default is right; soundness should not be opt-in. So, all good. Thanks for this amazing library!
It would be nice if attributes could be added to deal with this, instead of manually annotating with `#[ts(type = "{...}")]`, as the struct can be big.
Personally, I see the value in `bigint | number`, as I and many others use json-bigint, which deserializes a number as `bigint` only if it's too big, thus `bigint | number`.

Edit: yes, I know `alwaysParseAsBig` exists for json-bigint, but it deserializes everything as `bigint`, which most of the time is not what I want.
Hey @NyxCode! What's your opinion on this whole issue?