elnewfie/lslforge

LSLforge improperly handling hexadecimal integers

Opened this issue · 3 comments

Given a hexadecimal value in your LSL scripts, LSLForge will automatically 
convert it to an integer value.  However, this is breaking some code I was 
testing.

Namely, the hex value 0x80000000 is converted to 2147483648, but when 
entered into a script in SL or InWorldz, such as 
llOwnerSay((string)0x80000000);, the integer value reported is -2147483648.

This is with the 64bit Linux version of LSLforge.  A friend of mine suspects 
that LSLforge (or maybe Eclipse) is converting it to a 64bit integer, when it 
should be 32bit.
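
That suspicion can be checked in a few lines of Haskell, since LSLForge is 
written in Haskell (this is a standalone sketch, not LSLForge's own code; 
`wrap32` is a hypothetical helper name):

```haskell
import Data.Int (Int32)

-- Truncate an arbitrary-precision value to LSL's 32-bit signed integer
-- range, the way a 32-bit build does implicitly.
wrap32 :: Integer -> Int32
wrap32 = fromInteger

main :: IO ()
main = do
  print (0x80000000 :: Int)  -- 2147483648 on a 64-bit GHC build
  print (wrap32 0x80000000)  -- -2147483648, matching SL/InWorldz
```

On a 64-bit GHC, `Int` is 64 bits wide, so 0x80000000 keeps its positive 
value; `Int32` wraps it to -2147483648, which is what the viewer reports.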

Original issue reported on code.google.com by ZauberEx...@gmail.com on 29 Apr 2014 at 5:45

  In my environment it produces -2147483648, because my CPU is still 32-bit...
Line 62 of Type.hs:

data LSLValue a = IVal Int | FVal a | SVal String | VVal a a a 

  I guess you can fix it by replacing Int with Int32, but I can't confirm it...
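
The suggested fix might look like this (a sketch only, not a tested patch; 
the real Type.hs declaration has more constructors than the excerpt above 
shows, and other code in the file may also need `fromIntegral` conversions):

```haskell
import Data.Int (Int32)

-- Proposed change: store integer values as Int32 so literals and
-- arithmetic wrap at 32 bits on every platform, as they do in SL.
data LSLValue a = IVal Int32 | FVal a | SVal String | VVal a a a
  deriving (Show, Eq)

-- A hex literal such as 0x80000000 then wraps on construction:
parsedHex :: LSLValue Double
parsedHex = IVal (fromInteger 0x80000000)  -- IVal (-2147483648)
```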

Original comment by pells...@gmail.com on 10 May 2014 at 2:55

no issue on Win7-64 with
Version: Kepler Service Release 2
Build id: 20140224-0627

java.version=1.6.0_45
java.vm.info=mixed mode
java.vm.name=Java HotSpot(TM) 64-Bit Server VM
java.vm.specification.name=Java Virtual Machine Specification
java.vm.specification.vendor=Sun Microsystems Inc.
java.vm.specification.version=1.0
java.vm.vendor=Sun Microsystems Inc.
java.vm.version=20.45-b01


using 32bit LSLForge win executable from my own repo 
https://github.com/RayZopf/LSLForge_patched/blob/master/lslforge/haskell/dist/build/LSLForge/LSLForge.exe
compiled with 32bit GHC, http://www.haskell.org/ghc/download_ghc_6_10_4#windows


Original comment by sl-z...@postman.homeip.net on 16 May 2014 at 8:16


Yeah, it's not going to show up in 32-bit builds, even if you run them 
on a 64-bit system, because they still execute in 32-bit mode.  It'll only 
show up if you use a 64-bit LSLforge build on a 64-bit system.
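
A quick way to check which mode a given build is using is to look at the 
width of `Int` (this sketch assumes a modern GHC; `finiteBitSize` lives in 
`Data.Bits` from base 4.7 on, so it won't compile on the GHC 6.10.4 
mentioned above, where the deprecated `bitSize` would be needed instead):

```haskell
import Data.Bits (finiteBitSize)

main :: IO ()
main = do
  -- Prints 64 on a 64-bit GHC build, 32 on a 32-bit build.
  print (finiteBitSize (0 :: Int))
```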

Original comment by ZauberEx...@gmail.com on 16 May 2014 at 8:21