
I'll try to clarify the question. Sure, I can traverse the stream items
with a function like `a -> m b`, applying it with `mapM`. But in that
case I will not have access to the return value of the previous stream
node, which I'm planning to use as global state for the whole pipe
workflow:
.--state-----+--state'-------+--state''-->
|            |               |
[e0..eN] ==> [e0'...eN'] ==> [e0''..eN''] =====>
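
(To make the picture concrete - a made-up sketch, with illustrative names
like `Stats`, `stage1`, `stage2`: in Streaming, `>>=` on streams already
concatenates the yielded items of the stages and hands the return value of
one stage to the next, so the previous node's return can play the role of
the incoming "state". It does not yet let `stage2` see `stage1`'s items,
which is the other half of the question:)

import Streaming
import qualified Streaming.Prelude as S

type Stats = [String]  -- illustrative "state": statistics, errors, ...

-- first node: yields items and returns its state
stage1 :: Monad m => Stream (Of Int) m Stats
stage1 = do
  S.yield 1
  S.yield 2
  return ["stage1: 2 items"]

-- second node: receives the previous state, yields more items,
-- and returns the updated state
stage2 :: Monad m => Stats -> Stream (Of Int) m Stats
stage2 st = do
  S.yield 10
  return (st ++ ["stage2: 1 item"])

pipeline :: Monad m => Stream (Of Int) m Stats
pipeline = stage1 >>= stage2  -- items concatenated, state threaded through
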
This "state" will be used for statistics, errors, whatever - through
the whold workflow. This "pipe" will iterate over `eN` items (which
will be streams too), concatenates results... How this can be achieved
with Streaming library? I mean each "node" should have access to stream
items but to "global" state (result of prev. node return?) too.
Is it possible?
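
(A rough sketch of what I mean by every node seeing both the items and a
shared state - the names `Stats`, `gen'`, `step` are made up, and it
assumes the state is kept in a `StateT` layer under the stream:)

import Streaming
import qualified Streaming.Prelude as S
import Control.Monad.Trans.State.Strict (StateT, modify', execStateT)

type Stats = Int  -- illustrative shared state, e.g. a count of processed items

-- a source whose base monad carries the shared state
gen' :: Stream (Of Int) (StateT Stats IO) ()
gen' = S.each [1000, 2000, 3000]

-- a node that sees every item *and* can read/update the shared state
step :: Stream (Of Int) (StateT Stats IO) r
     -> Stream (Of Int) (StateT Stats IO) r
step = S.mapM $ \e -> do
  modify' (+ 1)   -- touch the "global" state
  return (e * 2)  -- transform the item

main :: IO ()
main = do
  finalStats <- execStateT (S.mapM_ (lift . print) (step gen')) 0
  putStrLn ("processed items: " ++ show finalStats)
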
On Sat, 27 May 2017 18:30:28 +0300, aquagnu wrote:
I'm trying to start working with the Streaming package. My first attempt is to simulate a source of stream items (I call the function "gen") and some processor ("proc"). Both should be stateful: able to save some info about the processing steps, etc.
...
import Streaming
import qualified Streaming.Prelude as S
...

gen :: S.Stream (S.Of Int) IO [String]
gen = do
  S.yield 1000
  S.yield 2000
  x <- lift getLine
  return ["a", "b", "c", x] -- results

proc :: S.Stream (S.Of Int) IO [String] -> S.Stream (S.Of Int) IO [String]
proc str = do
  e <- str
  lift $ print "Enter x:"
  x <- lift getLine
  return $ e ++ [" -- " ++ x] -- put stream items in result

main :: IO ()
main = do
  s <- S.mapM_ print $ S.map show gen
  p <- S.mapM_ print $ proc gen
  putStr "s: " >> print s
  putStr "p: " >> print p
And I try to simulate "piping" between "gen" and "proc"; it seems that function application is enough to compose producers and consumers. But this snippet is not correct: "e <- str" extracts not a stream element but the result (of "gen"). Even more, I don't know how to "yield" new items based on the stream items - something like Conduit's "await", or like Python's "for e in str: ... yield modified(e) ...". Is it possible to do this with "do" notation?
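
(Roughly the shape I'm after, if something like S.for works the way I
hope - a made-up sketch, "proc'" is an illustrative name:)

import Streaming
import qualified Streaming.Prelude as S

-- for every element of the input stream, yield one or more new items
proc' :: Monad m => Stream (Of Int) m r -> Stream (Of String) m r
proc' str = S.for str $ \e -> do
  S.yield (show e ++ " -- modified")  -- emit an item derived from e
  S.yield (show (e * 2))              -- several items per input are fine

main :: IO ()
main = S.mapM_ putStrLn (proc' (S.each [1000, 2000]))
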
--- Best, Paul
-- Best regards, Paul a.k.a. 6apcyk