Different results between int and String
Opened this issue · 1 comment
Sulley93 commented
If I call md5 with an integer parameter, it returns a different value than when I call md5 with a string parameter.
e.g.
md5(1003) = 64e0407ffc70c04a366526a7065cbd05
md5('1003') = aa68c75c4a77c87f97fb686b2f068676
Is this normal behavior?
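For reference, a minimal reproduction sketch, assuming the npm `md5` package this issue is filed against; the digests in the comments are the ones reported above:

```js
const md5 = require('md5');

console.log(md5(1003));   // 64e0407ffc70c04a366526a7065cbd05 (reported above)
console.log(md5('1003')); // aa68c75c4a77c87f97fb686b2f068676 (reported above)
```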
pvorb commented
Yes, this is "normal". Don't use this library. It will happily help you shoot yourself in the foot.
MD5 is an algorithm that is defined on binary input data. This library just does whatever your runtime does to make your input binary. So, if your runtime serializes strings as UTF-16 rather than UTF-8, you'd get different results for the same input string.
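If you need deterministic results, a minimal sketch using Node's built-in `crypto` module (not this library) is to pin down the byte representation yourself before hashing, e.g. by converting the value to a string and encoding it as UTF-8:

```js
const crypto = require('crypto');

// Convert the value to a string, encode it as UTF-8, and hash those bytes,
// so the digest no longer depends on how the runtime serializes the input.
function md5Hex(value) {
  const bytes = Buffer.from(String(value), 'utf8');
  return crypto.createHash('md5').update(bytes).digest('hex');
}

console.log(md5Hex(1003));   // same digest as the line below ...
console.log(md5Hex('1003')); // ... because both hash the UTF-8 bytes of "1003"
```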