
26 Mar 2011
1:20 p.m.
The "base" library provides the "threadDelay" primitive, which takes an Int argument in microseconds. Unfortunately, this means that the longest delay you can get on a 32-bit machine with GHC is just under 36 minutes (2^31 microseconds), and a hypothetical compiler that used only 30-bit integers (the minimum the standard requires) would get under 10 minutes. This makes it tricky to write general-purpose libraries on top of it. I think there should be a "type Delay = Int64" declaration, and that threadDelay and related functions should take that as their argument type.

Paul.
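Until something like that lands, a common workaround is to split a 64-bit delay into chunks that each fit in a machine Int. A minimal sketch (the name "longThreadDelay" is my own invention, not part of base):

```haskell
import Control.Concurrent (threadDelay)
import Data.Int (Int64)

-- Sleep for an arbitrary number of microseconds, even past the
-- Int limit, by issuing threadDelay in maxBound-sized chunks.
longThreadDelay :: Int64 -> IO ()
longThreadDelay us
  | us <= 0   = return ()
  | otherwise = do
      -- Largest chunk that still fits in a machine Int.
      let step = min us (fromIntegral (maxBound :: Int))
      threadDelay (fromIntegral step)
      longThreadDelay (us - step)
```

On a 32-bit machine a one-hour delay (3,600,000,000 µs) would be carried out as two chunks; on a 64-bit machine it degenerates to a single threadDelay call.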