
On 12.11.2014 12:56, Roman Cheplyaka wrote:
> On 12/11/14 05:21, Peter Simons wrote:
>> If you wanted to analyze 12 GB of data in Haskell, you'd have to jump
>> through all kinds of hoops just to load that CSV file into memory.
>> It's possible, no doubt, but pulling it off efficiently requires a lot
>> of expertise in Haskell that statistics folks don't necessarily have
>> (and arguably shouldn't have to).
>
> Well, with Haskell you don't have to load the whole data set into
> memory, as Michael shows. With R, on the other hand, you do.
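Michael's actual streaming code isn't shown in this excerpt, but the constant-memory point can be sketched with lazy I/O from the bytestring package. This is only an illustration, not Michael's code; the file name "data.csv" and the assumption that the first column is an integer are made up for the example. The file is read in chunks and folded strictly, so memory use stays flat regardless of file size:

```haskell
import qualified Data.ByteString.Lazy.Char8 as L
import Data.List (foldl')

-- Sum the first (integer) column of a CSV without ever holding
-- the whole file in memory: lazy I/O streams the file in chunks,
-- and the strict left fold consumes each line as it arrives.
main :: IO ()
main = do
  contents <- L.readFile "data.csv"  -- hypothetical file name
  let firstField = L.takeWhile (/= ',')
      parse line = maybe 0 fst (L.readInt (firstField line))
      total     = foldl' (\acc line -> acc + parse line) (0 :: Int)
                         (L.lines contents)
  print total
```

Using `foldl'` (rather than plain `sum`) matters here: it forces the accumulator at each step, so neither the file contents nor a chain of thunks accumulates in memory.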
That is exactly what came to my mind when thinking about R. I haven't actually used R myself, but based on what I know and what some googling revealed, all analysis would have to happen in memory. PS: I could be wrong of course ;)