On a big enough machine with enough memory it's fine. I used to keep a job queue with a billion rows on MySQL at a gig long ago. You could do it with PostgreSQL pretty easily too. On your personal work machine? I dunno.
Not trying to steer you away from using Haskell here by any means, but if you can process your data in a SQL database efficiently, that's often pretty optimal in terms of speed and ease of use until you start doing more sophisticated analysis. I don't have a lot of experience in data analysis, but I've known people who did some preliminary slicing/dicing in SQL before moving on to building a custom model for understanding the data.
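Just to give a rough idea of the kind of slicing/dicing I mean, here's a minimal sketch (PostgreSQL-flavored; the events table and its columns are made up for illustration):

    -- rough shape of the data: event counts and average duration per day/type
    SELECT
        date_trunc('day', created_at) AS day,
        event_type,
        count(*)                      AS n,
        avg(duration_ms)              AS avg_duration_ms
    FROM events
    WHERE created_at >= now() - interval '30 days'
    GROUP BY 1, 2
    ORDER BY 1, n DESC;

Once simple aggregates like that stop answering your questions, that's usually the point where pulling the data out into Haskell (or whatever you're modeling in) starts to pay off.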