Can pipes solve this problem? How?

Consider this code, which takes input from a handle until a special substring is matched:

matchInf a res s | a `isPrefixOf` s = reverse res
matchInf a res (c:cs)               = matchInf a (c:res) cs

hTakeWhileNotFound str hdl = hGetContents hdl >>= return . matchInf str []
It is simple, but the handle is left closed after running it. That is not good, because I want to reuse this function on the same handle.
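One way to keep the rest of the input usable without any streaming library is to return the unconsumed remainder alongside the matched prefix, instead of discarding it. The sketch below is a hypothetical pure variant of matchInf (the name breakOnMatch and its pair-returning interface are my own, not from the original post); the caller can then continue processing the returned remainder:

```haskell
import Data.List (isPrefixOf)

-- Return the input up to (but not including) the first occurrence of
-- the pattern, together with the unconsumed remainder starting at the
-- match. If the pattern never occurs, the remainder is empty.
breakOnMatch :: Eq a => [a] -> [a] -> ([a], [a])
breakOnMatch pat = go []
  where
    go acc s
      | pat `isPrefixOf` s = (reverse acc, s)
      | otherwise = case s of
          []       -> (reverse acc, [])
          (c : cs) -> go (c : acc) cs
```

This does not solve the handle problem by itself (hGetContents still semi-closes the handle), but it shows the shape of the answer that iteratees and pipes give: a consumer that stops at the match and hands the remaining stream back.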
This example is part of one of the Iteratee demonstrations: http://okmij.org/ftp/Haskell/Iteratee/IterDemo1.hs

Please search for:

-- Early termination:
-- Counting the occurrences of the word ``the'' and the white space
-- up to the occurrence of the terminating string ``the end''

The iteratee solution is a bit more general because it creates an inner stream with the part of the outer stream up to the match. Here is a sample application:

run_bterm2I fname =
  print =<< run =<< enum_file fname .|
    take_until_match "the end" (countWS_iter `en_pair` countTHE_iter)

It reads the file until "the end" is found, counting white space and occurrences of a specific word in parallel. All this processing happens in constant space, and we never need to accumulate anything into a string. If you do need to accumulate into a string, there is an iteratee, stream2list, that does that.

The enumeratee take_until_match, like take and take_while, stops when the terminating condition is satisfied or when EOF is detected. In the former case, the stream may contain more data and remains usable.

A part of IterDemo1 is explained in the paper http://okmij.org/ftp/Haskell/Iteratee/describe.pdf

I am not sure, though, whether I answered your question, since you were looking for pipes. I wouldn't call Iteratee pipes.
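To make the idea concrete, here is a minimal, self-contained sketch of the iteratee style described above. The types and names (Stream, Iteratee, countI, runList) are simplified illustrations of mine, not the actual API from IterDemo1; the point is only that the consumer processes chunks in constant space and signals when it is done:

```haskell
-- A stream delivers chunks of input and eventually an EOF marker.
data Stream a = Chunk [a] | EOF

-- An iteratee is either finished (with a result and leftover stream)
-- or waiting for more input.
data Iteratee a b = Done b (Stream a) | Cont (Stream a -> Iteratee a b)

-- Count elements satisfying a predicate, chunk by chunk,
-- without ever accumulating the input.
countI :: (a -> Bool) -> Iteratee a Int
countI p = go 0
  where
    go n = Cont step
      where
        step EOF        = Done n EOF
        step (Chunk xs) = go (n + length (filter p xs))

-- A toy driver: feed the whole list as one chunk, then EOF.
runList :: [a] -> Iteratee a b -> Maybe b
runList xs it = finish (feed it)
  where
    feed (Cont k) = k (Chunk xs)
    feed d        = d
    finish (Cont k) = finish' (k EOF)
    finish d        = finish' d
    finish' (Done b _) = Just b
    finish' (Cont _)   = Nothing
```

A real enumerator would feed chunks from a file instead of a list, and an enumeratee like take_until_match would wrap an inner iteratee, cutting the stream at the match while leaving the rest usable.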

Over the years we have been constructing a collection of Embedded Domain Specific Languages for describing compilers which are assembled from fragments that can be compiled individually. In this way one can gradually ``grow a language'' in a large number of small steps. The technique replaces things like macro extensions or Template Haskell; it has become feasible to extend the language at hand simply by providing extra modules. The nice thing is that existing code does not have to be adapted, nor does it have to be available or recompiled.

Recently we have been using (and adapting) these frameworks to create an entry in the LDTA11 tool challenge (http://ldta.info/tool.html), where one has to show how one's tools can be used to create a compiler for the Oberon0 language, which is used as a running example in Wirth's compiler construction book.

We have uploaded our implementation to Hackage at http://hackage.haskell.org/package/oberon0. More information can be found at the wiki: http://www.cs.uu.nl/wiki/bin/view/Center/CoCoCo

You may take a look at the various Gram modules to see how syntax is defined, and at the various Sem modules to see how we use our first-class attribute grammars to implement the static semantics associated with the various tasks of the challenge.

We hope you like it, and comments are welcome,

Marcos Viera
Doaitse Swierstra
participants (2)
- Doaitse Swierstra
- oleg@okmij.org