
12 Nov 2014
6:56 a.m.
On 12/11/14 05:21, Peter Simons wrote:
If you wanted to analyze 12 GB of data in Haskell, you'd have to jump through all kinds of hoops just to load that CSV file into memory. It's possible, no doubt, but pulling it off efficiently requires a lot of expertise in Haskell that statistics guys don't necessarily have (and arguably shouldn't have to).
Well, with Haskell you don't have to load the whole data set into memory, as Michael shows. With R, on the other hand, you do. Besides, if you're not an R expert and the analysis you want isn't readily available, it can be quite a pain to implement in R. As a simple example, I still don't know an acceptable way to write something like zipWith f (tail vec) vec in R.

Roman
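For readers less familiar with the idiom Roman cites: zipping a list against its own tail applies a binary function to each pair of adjacent elements. A minimal sketch (the names `pairwise` and `diffs` are my own, not from the thread):

```haskell
-- Apply f to each adjacent pair of elements by zipping the list
-- against its own tail. For xs = [x0, x1, x2, ...] this yields
-- [f x1 x0, f x2 x1, ...].
pairwise :: (a -> a -> b) -> [a] -> [b]
pairwise f xs = zipWith f (tail xs) xs

-- Example instance: first differences of a numeric series.
diffs :: Num a => [a] -> [a]
diffs = pairwise (-)
```

In R, the closest general analogue is presumably something like applying f to the shifted vectors `vec[-1]` and `vec[-length(vec)]` (special cases such as differences have `diff`), which is arguably the kind of indexing gymnastics Roman is objecting to.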