
Patai Gergely wrote:
It is a strength of Haskell that you can separate everything into logical steps and let laziness interleave them. Stream fusion can eliminate the interim lists, and a final conversion to a storable vector using http://hackage.haskell.org/package/storablevector-streamfusion/ can eliminate the lists altogether.
But as I understand it, that elimination is only possible if lists are not used as persistent containers, only to mimic control structures. Currently I rely on samples being stored as lists, so I can represent looping samples as infinite lists and not worry about the wrap-around at all. So in order to have any chance of fusion, I'd have to store samples as vectors and wrap them in some kind of unfold mechanism that turns them into lists which can potentially be fused away. In other words, besides a 'good consumer', I need a 'good producer' too.
Right. The conversion from storablevector to stream-fusion:Stream is such a good producer.
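To illustrate the "good producer" idea without depending on the storablevector-streamfusion API, here is a minimal sketch using only base: `unfoldr` plays the role of the producer, turning an indexed sample table (a plain list here stands in for a storable vector) into an infinite looping stream, so the wrap-around is handled by the unfold step rather than by storing an infinite list. The names `table` and `loopSamples` are invented for the example.

```haskell
import Data.List (unfoldr)

-- A hypothetical fixed sample table, standing in for a storablevector.
table :: [Double]
table = [0.0, 0.5, 1.0, 0.5]

-- 'unfoldr' acts as a good producer: indexing into the table modulo its
-- length yields an infinite looping sample stream.  A consumer such as
-- 'sum . take n' can then fuse with it instead of building the list.
loopSamples :: [Double] -> [Double]
loopSamples xs = unfoldr step 0
  where
    n = length xs
    step i = Just (xs !! (i `mod` n), i + 1)
```

With stream fusion, a composition like `sum (take n (loopSamples table))` can in principle compile down to a loop over indices, with no intermediate list; the real storablevector-to-Stream conversion would replace the list indexing with O(1) vector indexing.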
However, there seems to be a conflict between the nature of mixing and stream processing when it comes to efficiency. As it turns out, it is more efficient to process channels one by one within a chunk than to produce samples one by one. It takes a lot less context switching to first generate the output of channel 1, then generate channel 2 (simultaneously adding it to the mix) and so on, than to mix sample 1 of all channels, then sample 2, etc., since we can write much tighter loops when we only deal with one channel at a time.
Yes, I would also do it this way. So in the end you will have some storablevectors as intermediate data structures.
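The channel-at-a-time mixing scheme can be sketched as follows, again with plain lists standing in for storablevector chunks (the function name `mixChunk` is invented for the example): start from a chunk of silence and add each channel's chunk into the accumulator, one whole channel at a time.

```haskell
import Data.List (foldl')

-- Mix one chunk by processing one whole channel at a time:
-- start from silence and add each channel's chunk into the accumulator.
-- All channel chunks are assumed to have the same length.
mixChunk :: [[Double]] -> [Double]
mixChunk []            = []
mixChunk chans@(c : _) = foldl' addChannel silence chans
  where
    silence             = replicate (length c) 0
    addChannel acc chan = zipWith (+) acc chan
```

Each `addChannel` pass is a tight loop over a single channel's samples; with storable vectors in place of lists, the intermediate accumulators are exactly the storablevector intermediate structures mentioned above.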