I'm writing a Haskell-to-JavaScript code generator, using GHC as a library. Since JavaScript has no integer type and its Number type can only represent integers up to 2⁵³ exactly, I'm representing integers as Numbers and explicitly performing all arithmetic modulo 2³². This works very well with a 32-bit GHC, but rather worse with the 64-bit version.
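The question doesn't show the generated code, but as a reference for the intended semantics, here is a minimal Haskell sketch of the wrap-to-signed-32-bit behaviour the emitted JavaScript is supposed to emulate (the name wrap32 is hypothetical, not part of the actual generator):

    -- Reduce an unbounded result into the signed 32-bit range,
    -- i.e. the semantics the generated JavaScript should have.
    wrap32 :: Integer -> Integer
    wrap32 n = ((n + 2^31) `mod` 2^32) - 2^31

    -- wrap32 0xffffffff       == -1
    -- wrap32 (2147483647 + 1) == -2147483648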
GHC will happily coerce Int64 values to Ints and interpret Int constants as 64-bit values (0xffffffff turns into 4294967295 rather than -1, for example), and that causes all sorts of annoying problems.
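For concreteness, a small check of that behaviour on a 64-bit GHC (the values are the ones from the example above; Int32 is included only to show the contrast):

    import Data.Int (Int32)

    main :: IO ()
    main = do
      -- On a 64-bit GHC, Int is 64 bits wide, so the literal keeps its full value.
      print (0xffffffff :: Int)                         -- 4294967295
      -- Narrowing the same value to 32 bits gives the expected -1.
      print (fromIntegral (0xffffffff :: Int) :: Int32) -- -1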
The compiler works very well for "normal" web stuff even on a 64-bit system, provided that the standard libraries are built on a 32-bit machine, but "please don't use large-ish numbers, OK?" isn't something you want to see in your compiler's manual. Some of the problems (but not all) can be alleviated by compiling with -O0, but that (unsurprisingly) produces code that is not only slow but also way too big.
So, I need to stop GHC from assuming that Int and Int64 are equivalent. Is this even possible?
GHC.Int.Int32? – n. 1.8e9-where's-my-share m.
Int is broken? – n. 1.8e9-where's-my-share m.
Data.Int contains Int32 (it's re-exported from GHC.Int in GHC but that shouldn't matter) – copumpkin
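Following the commenters' pointer, here is a minimal sketch of working with Data.Int.Int32 explicitly; whether it's practical to thread Int32 through the code generator and the libraries it compiles is left open here:

    import Data.Int (Int32)

    -- Int32 pins the width regardless of whether the host GHC is 32- or 64-bit.
    allOnes :: Int32
    allOnes = fromIntegral (0xffffffff :: Integer)  -- wraps to -1 on any platform

    overflow :: Int32
    overflow = maxBound + 1                         -- wraps to -2147483648

    main :: IO ()
    main = mapM_ print [allOnes, overflow]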