
I'm looking at iteratee as a way to replace my erroneous and really inefficient lazy-IO-based backend for an expect-like monadic DSL I've been working on for about 6 months now, on and off.
The problem is I want something like:
    expect "some String" send "some response"
to block (or perhaps time out, depending on the environment) while looking for "some String" on an input Handle, and it appears that iteratee works with a fixed block size.
Actually, it doesn't. It works with whatever the enumerator gives it. In the case of `enum_fd'[1] that is a fixed-size block, but in general it is a ``value'' of some ``collection''[2], and it is up to the programmer to decide what should count as a value.

[1] http://okmij.org/ftp/Haskell/Iteratee/IterateeM.hs
[2] http://okmij.org/ftp/papers/LL3-collections-enumerators.txt
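
For instance, here is a minimal, self-contained sketch (simplified types of my own, not the actual iteratee library API) showing that the enumerator alone decides how big a chunk is, and that the iteratee's answer does not depend on that choice:

    -- A simplified Stream/Iteratee pair, just for illustration.
    data Stream a = Chunk [a] | EOF

    data Iteratee a b
      = Done b (Stream a)                -- finished: result plus leftover input
      | Cont (Stream a -> Iteratee a b)  -- waiting for the next chunk

    -- Count every element received, regardless of how it is chunked.
    countI :: Iteratee a Int
    countI = go 0
      where
        go n = Cont step
          where
            step (Chunk xs) = go (n + length xs)
            step EOF        = Done n EOF

    -- An enumerator that feeds a list in chunks of size k: the producer,
    -- not the iteratee, decides what a ``chunk'' is.
    enumChunksOf :: Int -> [a] -> Iteratee a b -> Iteratee a b
    enumChunksOf _ [] it            = it
    enumChunksOf _ _  it@(Done _ _) = it
    enumChunksOf k xs (Cont step)   =
      let (now, rest) = splitAt k xs
      in  enumChunksOf k rest (step (Chunk now))

    run :: Iteratee a b -> Maybe b
    run (Done b _)  = Just b
    run (Cont step) = case step EOF of
                        Done b _ -> Just b
                        Cont _   -> Nothing

    -- Both are Just 10: the answer does not depend on the chunk size.
    demo :: (Maybe Int, Maybe Int)
    demo = ( run (enumChunksOf 1 [1..10 :: Int] countI)
           , run (enumChunksOf 4 [1..10 :: Int] countI) )

Both components of `demo' come out as Just 10, whether the producer hands over one element at a time or four.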
A fixed block size would be fine if I could somehow put unused bytes back into the enumerator (I may need to put a LOT back in some cases, but in the common case I won't need to put any back, since most expect-like scripts typically match on the last few bytes of data sent before the peer blocks waiting for a response...)
I don't quite get this ``last few bytes'' thing. Could you explain?

I was about to write that there is no problem with putting data back into the Stream, and to point you at the head/peek functions... But then I thought that the ``not consuming bytes from the stream'' approach may not work well when the number of bytes needed (by your function to accept or reject some rule) exceeds the size of the underlying memory buffer (4K in the current version of the `iteratee' library[3]).

[3] http://hackage.haskell.org/packages/archive/iteratee/0.3.4/doc/html/src/Data...

Do you think that abstracting to the level of _tokens_ - instead of bytes - could help here? (Think of flex and bison.) You know, these enumerator/iteratee things can be layered into _enumeratees_[1][4]... It's just an idea.

[4] http://ianen.org/articles/understanding-iteratees/
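
To make both points a bit more concrete, here is a rough sketch along the same simplified lines as before (hypothetical code, not the real library's API): `peek' shows that ``putting data back'' is just handing the chunk back as leftover, and `toLines' is a toy enumeratee that turns a Char stream into line tokens, so the inner iteratee never sees raw buffer boundaries at all:

    -- Same simplified Stream/Iteratee as in the earlier sketch,
    -- repeated here to keep this self-contained.
    data Stream a = Chunk [a] | EOF

    data Iteratee a b
      = Done b (Stream a)
      | Cont (Stream a -> Iteratee a b)

    -- ``Putting data back'' is just returning the chunk as leftover:
    -- peek reports the next element without consuming anything.
    peek :: Iteratee a (Maybe a)
    peek = Cont step
      where
        step s@(Chunk (x:_)) = Done (Just x) s
        step (Chunk [])      = peek
        step EOF             = Done Nothing EOF

    -- A toy enumeratee: feed complete lines (tokens) to an inner iteratee
    -- over [String], buffering partial lines across Char chunks.
    toLines :: Iteratee String b -> Iteratee Char (Iteratee String b)
    toLines = go ""
      where
        -- inner finished: buffered chars become our leftover
        -- (leftover tokens of the inner iteratee are ignored in this sketch)
        go acc it@(Done _ _) = Done it (Chunk acc)
        go acc it@(Cont k)   = Cont step
          where
            step (Chunk cs) =
              case splitLines (acc ++ cs) of
                ([],  acc') -> go acc' it              -- no complete line yet
                (lns, acc') -> go acc' (k (Chunk lns))
            step EOF
              | null acc  = Done it EOF
              | otherwise = Done (k (Chunk [acc])) EOF -- flush the last partial line

    -- Split off complete lines; return the unfinished remainder.
    splitLines :: String -> ([String], String)
    splitLines s = case break (== '\n') s of
      (l, '\n':rest) -> let (ls, r) = splitLines rest in (l:ls, r)
      (_, _)         -> ([], s)

The real enumeratees are more general than this, of course; the sketch is only meant to show the shape of the idea.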
Otherwise, I'm going to want to roll my own iteratee-style library where I have to say "NotDone howMuchMoreIThinkINeed" so I don't over-consume the input stream.
What's the problem with over-consuming a stream, in your case?

BTW, this `NotDone' is just a ``control message'' to the chunk producer (an enumerator):

    IE_cont k (Just (GimmeThatManyBytes n))
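
Roughly like this (a pure, simplified version of that shape; `CtrlMsg', `takeExactly' and `enumList' are made up for the example): the iteratee hangs its request on the continuation, and the enumerator honours it when deciding how much to read next.

    data Stream a = Chunk [a] | EOF

    -- The control message an iteratee may attach to its continuation.
    data CtrlMsg = GimmeThatManyBytes Int

    data Iteratee a b
      = IE_done b (Stream a)
      | IE_cont (Stream a -> Iteratee a b) (Maybe CtrlMsg)

    -- An iteratee that asks the producer for exactly n elements.
    takeExactly :: Int -> Iteratee a [a]
    takeExactly n = IE_cont (step n []) (Just (GimmeThatManyBytes n))
      where
        step need acc (Chunk xs)
          | length xs >= need =
              IE_done (acc ++ take need xs) (Chunk (drop need xs))
          | otherwise =
              IE_cont (step (need - length xs) (acc ++ xs))
                      (Just (GimmeThatManyBytes (need - length xs)))
        step _ acc EOF = IE_done acc EOF

    -- A pure producer over a list: it honours the request when present,
    -- otherwise it falls back to a default chunk size.
    enumList :: Int -> [a] -> Iteratee a b -> Iteratee a b
    enumList _        _  it@(IE_done _ _) = it
    enumList _        [] (IE_cont k _)    = k EOF
    enumList defChunk xs (IE_cont k req)  =
      let want        = case req of
                          Just (GimmeThatManyBytes n) -> n
                          Nothing                     -> defChunk
          (now, rest) = splitAt want xs
      in  enumList defChunk rest (k (Chunk now))

    -- enumList 4 [1..10 :: Int] (takeExactly 7)
    --   ==> IE_done [1,2,3,4,5,6,7] (Chunk [])
    --       (the remaining [8,9,10] stay with the producer, unread)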
Does that even make any sense? I'm kind of brainstorming in this email unfortunately :-)
What's the problem with brainstorming? :)

Cheers.
-- vvv