
Hello haskell-cafe,

Please help me choose the proper function to use. I need to run an external command with a parameter and get its stdout, something like this:

output <- system "cmd param"

The code should be compatible with Unix and Windows, and it should be possible to execute scripts (so, as far as I understand, the command should be executed via cmd/sh). I use GHC 6.6.1, and it would be great if this function did not require any non-bundled libraries and were compatible with later GHC versions.

-- Best regards, Bulat mailto:Bulat.Ziganshin@gmail.com

Hi Bulat, You wrote:
Please help me choose the proper function to use.
I need to run an external command with a parameter and get its stdout, something like this:
output <- system "cmd param"
The code should be compatible with Unix and Windows, and it should be possible to execute scripts (so, as far as I understand, the command should be executed via cmd/sh). I use GHC 6.6.1, and it would be great if this function did not require any non-bundled libraries and were compatible with later GHC versions.
OK, I'll bite. What's wrong with runInteractiveCommand? -Yitz

Hello Yitzchak, Thursday, December 13, 2007, 4:06:10 PM, you wrote:
OK, I'll bite. What's wrong with runInteractiveCommand?
I have just never used such functionality, and I asked the community to make sure I haven't picked anything wrong. Some time ago I reimplemented a feature that is in the standard library (unechoed input), so this time I'm just trying to use the community's wisdom :) So:

do (stdin, stdout, stderr, ph) <- runInteractiveCommand "script param"
   waitForProcess ph

When should I read stdout? Before or after waitForProcess?

-- Best regards, Bulat mailto:Bulat.Ziganshin@gmail.com

So:
do (stdin, stdout, stderr, ph) <- runInteractiveCommand "script param"
   waitForProcess ph
When should I read stdout? Before or after waitForProcess?
If you are sure that your script will behave nicely (or don't care if it doesn't), how about just:

(_, h, _, _) <- runInteractiveCommand "script params"
output <- hGetContents h

-Yitz
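For reference, here is that suggestion as a self-contained sketch with the imports spelled out, assuming a well-behaved command (the stderr and zombie caveats come up later in the thread):

import System.Process    (runInteractiveCommand)
import System.IO         (hGetContents)
import Control.Exception (evaluate)

-- Minimal sketch: capture stdout of a shell command via a pipe.
-- Relies on lazy hGetContents; nothing here drains stderr or reaps the child.
main :: IO ()
main = do
  (_inp, out, _err, _ph) <- runInteractiveCommand "script params"
  output <- hGetContents out
  _ <- evaluate (length output)  -- force the whole output before using it
  putStr output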

Sorry, folks. I guess it's back to straight Gmail for me. If anyone knows how to use Gmail IMAP with Apple Mail, please let me know off list. -Yitz

Hello Yitzchak, Thursday, December 13, 2007, 4:36:48 PM, you wrote:
waitForProcess ph
When should I read stdout? Before or after waitForProcess?
If you are sure that your script will behave nicely (or don't care if it doesn't), how about just:
(_, h, _, _) <- runInteractiveCommand "script params"
output <- hGetContents h
Do I understand correctly that I don't need waitForProcess and that, if the script behaves correctly, hGetContents will read its whole output?

-- Best regards, Bulat mailto:Bulat.Ziganshin@gmail.com

On Thu, 13 Dec 2007, Bulat Ziganshin wrote:
Hello Yitzchak,
Thursday, December 13, 2007, 4:06:10 PM, you wrote:
OK, I'll bite. What's wrong with runInteractiveCommand?
I have just never used such functionality, and I asked the community to make sure I haven't picked anything wrong.
For older versions of GHC there was: http://hackage.haskell.org/cgi-bin/hackage-scripts/package/shell-pipe-0.1

On Thu, 2007-12-13 at 15:06 +0200, Yitzchak Gale wrote:
Hi Bulat,
You wrote:
Please help me choose the proper function to use.
I need to run an external command with a parameter and get its stdout, something like this:
output <- system "cmd param"
The code should be compatible with Unix and Windows, and it should be possible to execute scripts (so, as far as I understand, the command should be executed via cmd/sh). I use GHC 6.6.1, and it would be great if this function did not require any non-bundled libraries and were compatible with later GHC versions.
OK, I'll bite. What's wrong with runInteractiveCommand?
It requires threads to use correctly, and it is only available in GHC, not in Hugs, nhc98, etc. It requires threads because you have to pull from both stdout and stderr to prevent blocking. You could do it with non-blocking reads, but not without busy-waiting. Duncan

Hello Duncan, Thursday, December 13, 2007, 4:51:20 PM, you wrote:
OK, I'll bite. What's wrong with runInteractiveCommand?
It requires threads because you have to pull from both stdout and stderr to prevent blocking. You could do it with non-blocking reads, but not without busy-waiting.
Maybe this will be OK (with -threaded)?

(_, stdout, stderr, _) <- runInteractiveCommand "script params"
forkIO (hGetContents stderr >>= evaluate.length)
result <- hGetLine stdout
hGetContents stdout >>= evaluate.length

Awkward, but still shorter than the code from Cabal.

To Yitzchak: and you've asked what's wrong with runInteractiveCommand? :)))

-- Best regards, Bulat mailto:Bulat.Ziganshin@gmail.com

On Thu, 2007-12-13 at 17:08 +0300, Bulat Ziganshin wrote:
Hello Duncan,
Thursday, December 13, 2007, 4:51:20 PM, you wrote:
OK, I'll bite. What's wrong with runInteractiveCommand?
It requires threads because you have to pull from both stdout and stderr to prevent blocking. You could do it with non-blocking reads, but not without busy-waiting.
Maybe this will be OK (with -threaded)?
No need for -threaded.
(_, stdout, stderr, _) <- runInteractiveCommand "script params"
forkIO (hGetContents stderr >>= evaluate.length)
result <- hGetLine stdout
hGetContents stdout >>= evaluate.length
Yep, that'll work.
Awkward, but still shorter than the code from Cabal.
The Cabal code has to work with ghc-6.2 - 6.8, hugs, nhc98 and jhc. It cannot use threads. Duncan

Hello Duncan, Thursday, December 13, 2007, 5:38:24 PM, you wrote:
Yep, that'll work.
Spencer Janssen added that I should use waitForProcess - otherwise on Unix it will leave a zombie. Yitzchak, Duncan, gwern and Spencer: thank you very much for all your answers; I'm sure I would have had a much longer learning curve without your help!
The Cabal code has to work with ghc-6.2 - 6.8, hugs, nhc98 and jhc. It cannot use threads.
You've complained that there is no such functionality in the standard libraries. Maybe it's worth making a new Hackage library, something like Process?

-- Best regards, Bulat mailto:Bulat.Ziganshin@gmail.com
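Putting the advice in this thread together, a hedged sketch of one way to package it up: drain stderr in a background thread, force stdout, then waitForProcess so no zombie is left. The helper name readCommandOutput is made up here, not a library function:

import System.Process     (runInteractiveCommand, waitForProcess)
import System.IO          (hGetContents)
import System.Exit        (ExitCode)
import Control.Concurrent (forkIO)
import Control.Exception  (evaluate)

-- Sketch only: run a shell command, return its stdout and exit code,
-- silently draining stderr so the child cannot block on a full pipe.
-- Usage: (output, code) <- readCommandOutput "script param"
readCommandOutput :: String -> IO (String, ExitCode)
readCommandOutput cmd = do
  (_inp, out, err, ph) <- runInteractiveCommand cmd
  _ <- forkIO (hGetContents err >>= evaluate . length >> return ())
  output <- hGetContents out
  _ <- evaluate (length output)   -- force stdout before waiting
  code <- waitForProcess ph       -- reap the child; avoids a zombie on Unix
  return (output, code)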

On Thu, 2007-12-13 at 15:48 +0300, Bulat Ziganshin wrote:
Hello haskell-cafe,
Please help me choose the proper function to use.
I need to run an external command with a parameter and get its stdout, something like this:
output <- system "cmd param"
The code should be compatible with Unix and Windows, and it should be possible to execute scripts (so, as far as I understand, the command should be executed via cmd/sh). I use GHC 6.6.1, and it would be great if this function did not require any non-bundled libraries and were compatible with later GHC versions.
There is rawSystemStdout in Cabal which you might like to copy. It is portable to Windows and several Haskell compilers, but it does use cpp to achieve that. It's also rather inefficient, as it has to create a temporary file. It seems it is not possible to use pipes to get the stdout and have the resulting code be portable between Haskell implementations (it's a favourite peeve of mine).

As the name suggests, rawSystemStdout uses rawSystem, so it does not necessarily do what you want with cmd/sh, but you could easily adapt it to use system rather than rawSystem.

http://darcs.haskell.org/cabal/Distribution/Simple/Utils.hs
http://darcs.haskell.org/cabal/Distribution/Compat/TempFile.hs

Duncan
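A rough illustration of the "adapt it to use system" idea, not the Cabal code itself: redirect stdout into a temporary file at the shell level, then read the file back. It assumes System.IO.openTempFile is available (Cabal carries its own compat version for other compilers) and that system is imported from System.Process (on GHC 6.6 it lives in System.Cmd instead); it is also naive about quoting and redirection portability, so treat it as a sketch only:

import System.Process    (system)
import System.IO         (openTempFile, hClose)
import System.Directory  (getTemporaryDirectory, removeFile)
import Control.Exception (evaluate)

-- Sketch: capture a command's stdout via shell redirection into a temp file.
systemStdout :: String -> IO String
systemStdout cmd = do
  tmpDir <- getTemporaryDirectory
  (tmpName, tmpHandle) <- openTempFile tmpDir "cmdstdout"
  hClose tmpHandle                       -- we only need the name; the shell writes to it
  _ <- system (cmd ++ " > " ++ tmpName)  -- goes through sh/cmd; quoting is naive
  output <- readFile tmpName
  _ <- evaluate (length output)          -- force before removing the file
  removeFile tmpName
  return output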

Hello Duncan, Thursday, December 13, 2007, 4:10:26 PM, you wrote:
I need to run an external command with a parameter and get its stdout, something
temporary file. It seems it is not possible to use pipes to get the stdout and have the resulting code be portable between Haskell implementations (it's a favourite peeve of mine).
I don't need to interact with the program, just to get its whole output (one line) after it has finished. By the way, as far as I remember, there were problems on Windows with redirection of cmd.exe output (in particular when running executables from command files).

Separate Windows and Unix versions are OK for me, I just need them both. And I need only GHC support.

Taking all this into account, where should I look?

-- Best regards, Bulat mailto:Bulat.Ziganshin@gmail.com

On Thu, 2007-12-13 at 16:27 +0300, Bulat Ziganshin wrote:
Hello Duncan,
Thursday, December 13, 2007, 4:10:26 PM, you wrote:
I need to run an external command with a parameter and get its stdout, something
temporary file. It seems it is not possible to use pipes to get the stdout and have the resulting code be portable between Haskell implementations (it's a favourite peeve of mine).
I don't need to interact with the program, just to get its whole output (one line) after it has finished. By the way, as far as I remember, there were problems on Windows with redirection of cmd.exe output (in particular when running executables from command files).
Separate Windows and Unix versions are OK for me, I just need them both. And I need only GHC support.
Taking all this into account, where should I look?
Use just the GHC bit from the code I pointed at:

bracket
  (liftM2 (,) (openTempFile tmpDir "cmdstdout")
              (openFile devNull WriteMode))
  -- We need to close tmpHandle or the file removal fails on Windows
  (\((tmpName, tmpHandle), nullHandle) -> do
      hClose tmpHandle
      removeFile tmpName
      hClose nullHandle)
  $ \((tmpName, tmpHandle), nullHandle) -> do
      cmdHandle <- runProcess path args Nothing Nothing
                     Nothing (Just tmpHandle) (Just nullHandle)
      exitCode <- waitForProcess cmdHandle
      output   <- readFile tmpName
      evaluate (length output)
      return (output, exitCode)

Hello Duncan, Thursday, December 13, 2007, 4:43:17 PM, you wrote:
Use just the GHC bit from the code I pointed at:
Thank you, Duncan. Are there any objections against the simplest code proposed by Yitzchak? I.e.:

(_, h, _, _) <- runInteractiveCommand "script params"
output <- hGetContents h

taking into account that badly behaved scripts are not my headache?

-- Best regards, Bulat mailto:Bulat.Ziganshin@gmail.com

Bulat Ziganshin wrote:
Hello Duncan,
Thursday, December 13, 2007, 4:43:17 PM, you wrote:
Use just the GHC bit from the code I pointed at:
Thank you, Duncan. Are there any objections against the simplest code proposed by Yitzchak? I.e.:
(_, h, _, _) <- runInteractiveCommand "script params"
output <- hGetContents h
taking into account that badly behaved scripts are not my headache?
It could deadlock if the script produces enough stderr to fill up its pipe buffer, because the script will stop, waiting for your program to empty the pipe.

It's been said several times, but we should really have a higher-level abstraction over runInteractiveProcess.

Cheers, Simon
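To make the failure mode concrete, a small hedged reproduction: the child floods stderr, the parent reads only stdout, and both end up stuck once the stderr pipe buffer fills. The shell loop assumes a POSIX sh behind runInteractiveCommand:

import System.Process    (runInteractiveCommand)
import System.IO         (hGetContents)
import Control.Exception (evaluate)

-- Deadlock sketch: nobody reads stderr, so after roughly one pipe buffer of
-- "noise" the child blocks on its stderr write, never prints "done" or closes
-- stdout, and the attempt to force stdout below never finishes.
main :: IO ()
main = do
  (_inp, out, _err, _ph) <- runInteractiveCommand
    "i=0; while [ $i -lt 100000 ]; do echo noise 1>&2; i=$((i+1)); done; echo done"
  output <- hGetContents out      -- only stdout is read
  _ <- evaluate (length output)   -- expected to hang here
  putStrLn output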

2007/12/13, Simon Marlow
Bulat Ziganshin wrote:
Hello Duncan,
Thursday, December 13, 2007, 4:43:17 PM, you wrote:
Use just the GHC bit from the code I pointed at:
Thank you, Duncan. Are there any objections against the simplest code proposed by Yitzchak? I.e.:
(_, h, _, _) <- runInteractiveCommand "script params"
output <- hGetContents h
taking into account that badly behaved scripts are not my headache?
It could deadlock if the script produces enough stderr to fill up its pipe buffer, because the script will stop, waiting for your program to empty the pipe.
It's been said several times, but we should really have a higher-level abstraction over runInteractiveProcess.
I think Don started something: http://www.cse.unsw.edu.au/~dons/code/newpopen/

David

Bulat Ziganshin wrote:
...are there any objections against...
(_, h, _, _) <- runInteractiveCommand "script params"
output <- hGetContents h
taking into account that badly behaved scripts are not my headache?
"Badly behaved scripts are not my headache" is not the same as "behave nicely". Or I guess they are the same - if they don't behave nicely, then they are your headache whether you like it or not. :)

Simon Marlow wrote:
It could deadlock if the script produces enough stderr to fill up its pipe buffer
If we need to worry about that, then what about this:

(_,h,e,_) <- runInteractiveCommand "script params"
forkIO (hGetContents e >>= evaluate . length)
output <- hGetContents h

It requires -threaded in the case of a huge amount of output to both stdout and stderr; maybe that isn't good for Bulat. If that is a problem, you can read a chunk at a time and call yield - but that is getting messier.

Thanks, Yitz

(_,h,e,_) <- runInteractiveCommand "script params"
clearStderr
output <- hGetContents h

where clearStderr is one of:

1. hClose e
2. hGetContents e >>= evaluate . last
3. forkIO (hGetContents e >>= evaluate . last >> return ())

All seem to work for me on Mac OS. Only 2 hangs on Debian testing. What happens on Windows?

I ran the following command:

"for((i=0;i<10000;++i));do cat non_existant; echo aaaaaaaaaaaaaaa; done; date"

(varying the number of iterations to clog up both pipe buffers)

Is this really such a problem?

Thanks, Yitz
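For anyone who wants to rerun this experiment (e.g. on Windows), here is a hedged sketch of it as a complete program; the clogging command is taken verbatim from above and assumes a bash-compatible /bin/sh:

import System.Process     (runInteractiveCommand)
import System.IO          (Handle, hGetContents, hClose)
import Control.Concurrent (forkIO)
import Control.Exception  (evaluate)

-- The test command quoted above; it needs a bash-compatible shell.
clogCommand :: String
clogCommand =
  "for((i=0;i<10000;++i));do cat non_existant; echo aaaaaaaaaaaaaaa; done; date"

-- The three clearStderr variants under discussion.
clearStderr1, clearStderr2, clearStderr3 :: Handle -> IO ()
clearStderr1 e = hClose e                                          -- variant 1
clearStderr2 e = hGetContents e >>= evaluate . last >> return ()   -- variant 2
clearStderr3 e = forkIO (hGetContents e >>= evaluate . last >> return ())
                   >> return ()                                    -- variant 3

main :: IO ()
main = do
  (_inp, h, e, _ph) <- runInteractiveCommand clogCommand
  clearStderr3 e                   -- swap in variant 1 or 2 to compare
  output <- hGetContents h
  _ <- evaluate (length output)
  putStrLn ("read " ++ show (length output) ++ " characters of stdout")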

On Dec 13, 2007, at 13:07 , Yitzchak Gale wrote:
(_,h,e,_) <- runInteractiveCommand "script params"
clearStderr
output <- hGetContents h
where clearStderr is one of:
1. hClose e
2. hGetContents e >>= evaluate . last
3. forkIO (hGetContents e >>= evaluate . last >> return ())
All seem to work for me on Mac OS. Only 2 hangs on Debian testing. What happens on Windows?
PIPE_MAX is only 512 on (older?) Solaris. It's fairly large on Linux and possibly OSX. No clue about Windows.

--
brandon s. allbery [solaris,freebsd,perl,pugs,haskell] allbery@kf8nh.com
system administrator [openafs,heimdal,too many hats] allbery@ece.cmu.edu
electrical and computer engineering, carnegie mellon university KF8NH

On Thu, 2007-12-13 at 19:38 +0200, Yitzchak Gale wrote:
Simon Marlow wrote:
It could deadlock if the script produces enough stderr to fill up its pipe buffer
If we need to worry about that, then what about this:
(_,h,e,_) <- runInteractiveCommand "script params"
forkIO (hGetContents e >>= evaluate . length)
output <- hGetContents h
It requires -threaded in the case of a huge amount
It does not require -threaded. GHC's single-threaded RTS has always been able to cope with multiple Haskell threads that want to do file/network IO.
of output to both stdout and stderr; maybe that isn't good for Bulat. If that is a problem, you can read a chunk at a time and call yield - but that is getting messier.
Calling yield ends up busy-waiting if there is no output available from either stdout or stderr. It also requires threads, so it's not a portable solution.

Something simple would be to allow attaching a pipe to just stdout and redirecting stderr elsewhere, or connecting both stdout and stderr to the same output pipe. runProcess allows substituting any of stdin/stdout/stderr for other Handles, and runInteractiveProcess substitutes them all for pipes. What we need is something in between that allows substituting some for given Handles and connecting others to pipes.

Duncan
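For what it's worth, this is essentially what the process package's later createProcess API ended up providing: each of stdin/stdout/stderr can independently be inherited, attached to an existing Handle, or connected to a fresh pipe. A sketch against that newer API (not part of GHC 6.6.1's bundled libraries):

import System.Process
  ( createProcess, shell, waitForProcess
  , CreateProcess(std_out, std_err), StdStream(CreatePipe, Inherit) )
import System.IO          (hGetContents)
import Control.Exception  (evaluate)

-- Pipe only stdout; let stderr go wherever the parent's stderr goes.
main :: IO ()
main = do
  (_, Just out, _, ph) <-
    createProcess (shell "script params") { std_out = CreatePipe
                                          , std_err = Inherit }
  output <- hGetContents out
  _ <- evaluate (length output)
  _ <- waitForProcess ph
  putStr output

The same library also grew readProcess and readProcessWithExitCode, which do the capturing and draining without any of this plumbing.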

On Thu, 13 Dec 2007, Duncan Coutts wrote: ...
Something simple would be to allow attaching a pipe to just the stdout and redirecting stderr elsewhere, or connecting both stdout and stderr to the same output pipe. runProcess allows substituting any of stdin/stdout/stderr for other Handles and runInteractiveProcess substitutes them all for pipes. What we need is something in between that allows substituting some for given Handles and connecting others to pipes.
Right. Even if there weren't this logistical problem when you divert both stdout & stderr, it's not the right thing to do often enough even to make it the default, let alone the only way. Maybe half the time, it would be better for stderr to go wherever it has been going. Maybe 2/3 of the remaining half would be better served by stderr & stdout merged on the same stream (which the caller might arrange with a shell redirection in the command, "script params 2>&1").

I imagine it would be easy enough to create the pipes to use with runProcess, but I don't know how portable this would be outside the UNIX / POSIX world.

Donn Cave, donn@drizzle.com
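The merge-at-the-shell idea as a tiny sketch, assuming the command interpreter understands 2>&1: with stderr folded into stdout there is only one pipe left to drain.

import System.Process    (runInteractiveCommand, waitForProcess)
import System.IO         (hGetContents)
import Control.Exception (evaluate)

-- Sketch: merge stderr into stdout at the shell level, so a single
-- hGetContents sees both streams interleaved.
main :: IO ()
main = do
  (_inp, out, _err, ph) <- runInteractiveCommand "script params 2>&1"
  output <- hGetContents out
  _ <- evaluate (length output)
  _ <- waitForProcess ph
  putStr output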
participants (8)
- Brandon S. Allbery KF8NH
- Bulat Ziganshin
- David Waern
- Donn Cave
- Duncan Coutts
- Henning Thielemann
- Simon Marlow
- Yitzchak Gale