
By the way, as I've been thinking about Cabal, I remembered this question to which I haven't found an answer. I realize that I haven't been party to the initial discussions, but John Meacham didn't know the answer either, so I will ask it.

As I understand it, Cabal is meant to provide people with a compiler-independent way to build Haskell packages. So it (a) builds packages while (b) abstracting over compiler interfaces. Why was the decision made to couple these two functions into a single project? It seems to me that maybe having two separate projects would be more manageable and useful.

- The first project would provide a standard command-line interface to Haskell compilers. There would be operations such as: compile a file into an object file (may be a no-op); link a bunch of object files into a package; install a package into the following package database; create an empty package database; merge two package databases; find dependencies of a module. And options like: use the following other packages, search for packages in the following package database. Basically, the things that compilers can do, but given a standard interface.

- The second project would provide a build system. I guess there would be a lot of options here. There seems to be a demand for a system with some main declarative specification file which can be processed by external tools to find out what dependencies a project has on other projects, etc., which Cabal is responding to. So perhaps people would write something like this. However, if there were two separate projects, then developers would also have the freedom to use a build system like 'make', or something more sophisticated and unforeseen, with the generic compiler interface as well.

By decoupling the standard compiler interface and the build system, one would be able to make improvements independently to each one. Design and development would be faster, users would have more choices, and it would be easier to experiment with new versions of each project.

So, back to my question. Why couple the two?

Frederik

-- http://ofb.net/~frederik/

The reason to combine them is that the point of abstraction between the compilers / interpreters lives in areas like "preprocess this file", "build this file", "install this library", not in operations like you mentioned:
There would be operations such as: compile a file into an object file (may be a no-op); link a bunch of object files into a package; install a package into the following package database; create an empty package database; merge two package databases; find dependencies of a module. And options like: use the following other packages, search for packages in the following package database.
Each of these operations would be a no-op for Hugs, and yet the abstraction layer that Cabal provides works quite well for Hugs. I think that if we had built a standard command-line interface between compilers, GHC and Hugs would have almost disjoint operations, and Cabal would still have to perform the same amount of work, since it couldn't actually use this interface to get its job done. peace, isaac

On Sun, Sep 04, 2005 at 08:01:48PM -0700, Isaac Jones wrote:
The reason to combine them is that the point of abstraction between the compilers / interpreters lives in areas like "preprocess this file", "build this file", "install this library", not in operations like you mentioned:
I mentioned the second two of those, but not the first one. Preprocessing files, however, doesn't sound like an operation that needs to be done in a compiler/interpreter-specific way.
There would be operations such as: compile a file into an object file (may be a no-op); link a bunch of object files into a package; install a package into the following package database; create an empty package database; merge two package databases; find dependencies of a module. And options like: use the following other packages, search for packages in the following package database.
Each of these operations would be a no-op for Hugs, and yet the abstraction layer that Cabal provides works quite well for Hugs. I think that if we had built a standard command-line interface between compilers, GHC and Hugs would have almost disjoint operations, and Cabal would still have to perform the same amount of work, since it couldn't actually use this interface to get its job done.

Am Montag, 5. September 2005 08:13 schrieb Frederik Eaton:
On Sun, Sep 04, 2005 at 08:01:48PM -0700, Isaac Jones wrote:
The reason to combine them is that the point of abstraction between the compilers / interpreters lives in areas like "preprocess this file", "build this file", "install this library", not in operations like you mentioned:
I mentioned the second two of those, but not the first one. Preprocessing files, however, doesn't sound like an operation that needs to be done in a compiler/interpreter-specific way. [...]
At least in a restricted sense it *is* compiler/interpreter-specific, because one has to #define some magic things like __HUGS__, __GLASGOW_HASKELL__, etc. to get conditional compilation right. Cheers, S.
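The kind of conditional compilation Sven describes looks like the following minimal module — the CPP branch that survives depends entirely on which of those macros the build tool defines, so passing the right -D flags is compiler-specific work:

```haskell
{-# LANGUAGE CPP #-}
-- Minimal illustration: which definition of compilerName survives
-- preprocessing depends on the macros defined for the current compiler.
module Main where

compilerName :: String
#if defined(__GLASGOW_HASKELL__)
compilerName = "GHC"
#elif defined(__HUGS__)
compilerName = "Hugs"
#else
compilerName = "unknown"
#endif

main :: IO ()
main = putStrLn compilerName
```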

Sven.Panne:
Am Montag, 5. September 2005 08:13 schrieb Frederik Eaton:
On Sun, Sep 04, 2005 at 08:01:48PM -0700, Isaac Jones wrote:
The reason to combine them is that the point of abstraction between the compilers / interpreters lives in areas like "preprocess this file", "build this file", "install this library", not in operations like you mentioned:
I mentioned the second two of those, but not the first one. Preprocessing files, however, doesn't sound like an operation that needs to be done in a compiler/interpreter-specific way. [...]
At least in a restricted sense it *is* compiler/interpreter-specific, because one has to #define some magic things like __HUGS__, __GLASGOW_HASKELL__, etc. to get conditional compilation right.
Which reminds me, 'runhaskell Setup.hs haddock' doesn't seem to generate output for code inside #if defined(__GLASGOW_HASKELL__). Poking around in the source I see:

Distribution.PreProcess:

    hcDefines :: Compiler -> [String]
    hcDefines Compiler { compilerFlavor=NHC, compilerVersion=version }
        = ["-D__NHC__=" ++ versionInt version]
    hcDefines Compiler { compilerFlavor=Hugs } = ["-D__HUGS__"]
    hcDefines _ = []

Which seems to be missing a case? -- Don

dons@cse.unsw.edu.au (Donald Bruce Stewart) writes:
Sven.Panne:
Am Montag, 5. September 2005 08:13 schrieb Frederik Eaton:
On Sun, Sep 04, 2005 at 08:01:48PM -0700, Isaac Jones wrote:
The reason to combine them is that the point of abstraction between the compilers / interpreters lives in areas like "preprocess this file", "build this file", "install this library", not in operations like you mentioned:
I mentioned the second two of those, but not the first one. Preprocessing files, however, doesn't sound like an operation that needs to be done in a compiler/interpreter-specific way. [...]
At least in a restricted sense it *is* compiler/interpreter-specific, because one has to #define some magic things like __HUGS__, __GLASGOW_HASKELL__, etc. to get conditional compilation right.
Which reminds me, 'runhaskell Setup.hs haddock' doesn't seem to generate output for code inside #if defined(__GLASGOW_HASKELL__). Poking around in the source I see:
Distribution.PreProcess:

    hcDefines :: Compiler -> [String]
    hcDefines Compiler { compilerFlavor=NHC, compilerVersion=version }
        = ["-D__NHC__=" ++ versionInt version]
    hcDefines Compiler { compilerFlavor=Hugs } = ["-D__HUGS__"]
    hcDefines _ = []
I'm guessing this is because usually cpp is run w/ ghc -cpp, so there's no need for this case. Probably the haddock generator uses hscpp or something, and so this case should be added. Any objections? peace, isaac
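The case being discussed might look something like the sketch below. This is a self-contained illustration, not the actual fix that went into Cabal: the `Compiler` type is simplified to two fields, and `versionInt` is a stand-in for the real helper in Distribution.PreProcess (assumed here to turn e.g. 6.4 into "604", following the __GLASGOW_HASKELL__ convention):

```haskell
module Main where

-- Simplified stand-ins for the real Cabal types (illustration only).
data CompilerFlavor = GHC | NHC | Hugs | OtherCompiler deriving (Eq, Show)

data Compiler = Compiler
  { compilerFlavor  :: CompilerFlavor
  , compilerVersion :: [Int]
  }

-- Assumed behaviour: [6,4] -> "604", padding the minor component.
versionInt :: [Int] -> String
versionInt (major : minor : _) = show major ++ pad minor
  where pad m | m < 10    = '0' : show m
              | otherwise = show m
versionInt [major] = show major ++ "00"
versionInt []      = "0"

hcDefines :: Compiler -> [String]
hcDefines Compiler { compilerFlavor = GHC, compilerVersion = version }
  = ["-D__GLASGOW_HASKELL__=" ++ versionInt version]   -- the missing case
hcDefines Compiler { compilerFlavor = NHC, compilerVersion = version }
  = ["-D__NHC__=" ++ versionInt version]
hcDefines Compiler { compilerFlavor = Hugs } = ["-D__HUGS__"]
hcDefines _ = []

main :: IO ()
main = print (hcDefines (Compiler GHC [6,4]))
```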

dons@cse.unsw.edu.au (Donald Bruce Stewart) writes:
Sven.Panne:
Am Montag, 5. September 2005 08:13 schrieb Frederik Eaton:
On Sun, Sep 04, 2005 at 08:01:48PM -0700, Isaac Jones wrote:
The reason to combine them is that the point of abstraction between the compilers / interpreters lives in areas like "preprocess this file", "build this file", "install this library", not in operations like you mentioned:
I mentioned the second two of those, but not the first one. Preprocessing files, however, doesn't sound like an operation that needs to be done in a compiler/interpreter-specific way. [...]
At least in a restricted sense it *is* compiler/interpreter-specific, because one has to #define some magic things like __HUGS__, __GLASGOW_HASKELL__, etc. to get conditional compilation right.
Which reminds me, 'runhaskell Setup.hs haddock' doesn't seem to generate output for code inside #if defined(__GLASGOW_HASKELL__). Poking around in the source I see:
Care to try this again w/ the CVS or darcs version? Me & Ross simultaneously fixed it ;) peace, isaac

Frederik Eaton
On Sun, Sep 04, 2005 at 08:01:48PM -0700, Isaac Jones wrote:
The reason to combine them is that the point of abstraction between the compilers / interpreters lives in areas like "preprocess this file", "build this file", "install this library", not in operations like you mentioned:
I mentioned the second two of those, but not the first one.
You mentioned taking a .hs file and producing a .o file; that's not actually the same thing. Hugs doesn't use .o files, but if you look at Distribution.Simple.Build, you'll see that "building" for Hugs (that is, preparing for installation) is far from a no-op. You mentioned "taking a package and installing it into a package database", but of course Hugs doesn't have a package database, as we've discussed, and actually "installing" (that is, making libraries and executables available to end users) for Hugs is also not just a matter of copying some files over.

So in short, the reason we don't put the level of abstraction at the operations you mention is that they aren't shared between the compilers / interpreters, so there's nothing to abstract. It's probably the case, though, that actual compilers like ghc, jhc, and nhc have much more similar models, and there could be some useful abstractions there.
Preprocessing files, however, doesn't sound like an operation that needs to be done in a compiler/interpreter-specific way.
That's true, there's almost no compiler-specific code in Distribution.PreProcess, except to pass some definitions to CPP. peace, isaac

To clarify: Maybe I should have pointed out more explicitly in this message that I *do* understand that Hugs is an interpreter, and doesn't produce object files or compiled packages. I understand how Cabal works with GHC and Hugs. My question, why a build system is needed to abstract away their differences, has yet to be addressed. It could be that the answer is simply that it was thought that the current way of doing things would be more convenient, but in any case I think the question should be addressed and not evaded.

A build system is something that manages dependencies between commands and schedules their execution. I see no reason why all Haskell compilers and interpreters could not be given a standard interface which (1) provides a way to find dependencies of sources and (2) provides a way to execute all the necessary commands supported by Cabal, so that any build system could be used with any compiler to build any Haskell code, and do what Cabal does now.

Furthermore, I would like to point out that this issue may have much greater importance in Haskell than in other languages, due to the fact that it is hard to imagine a standard Haskell ABI. In C, one can compile ELF libraries with different compilers (I think) and still use them in the same application - this is the layer where the standardization exists. Haskell, on the other hand, needs to have libraries and executables compiled with the same compiler, or interpreted with the same interpreter. It is good of the Cabal people to recognize this need and address it, but I'm inclined to think that a light-weight compiler interface would be more appropriate as a standard than one which is tightly coupled with a build system.

Note that I'm not suggesting Cabal be abandoned - it would be possible to develop a compiler interface standard and modify Cabal to use it. Cabal would become much simpler; conforming to the standard would become the task of individual compiler developers.
Frederik

On Sun, Sep 04, 2005 at 02:15:12PM -0700, Frederik Eaton wrote:
By the way, as I've been thinking about Cabal, I remembered this question to which I haven't found an answer. I realize that I haven't been party to the initial discussions, but John Meacham didn't know the answer either, so I will ask it.
As I understand it, Cabal is meant to provide people with a compiler-independent way to build Haskell packages. So it (a) builds packages while (b) abstracting over compiler interfaces. Why was the decision made to couple these two functions into a single project? It seems to me that maybe having two separate projects would be more manageable and useful.
- The first project would provide a standard command-line interface to Haskell compilers. There would be operations such as: compile a file into an object file (may be a no-op); link a bunch of object files into a package; install a package into the following package database; create an empty package database; merge two package databases; find dependencies of a module. And options like: use the following other packages, search for packages in the following package database. Basically, the things that compilers can do, but given a standard interface.
- The second project would provide a build system. I guess there would be a lot of options here. There seems to be a demand for a system with some main declarative specification file which can be processed by external tools to find out what dependencies a project has on other projects, etc., which Cabal is responding to. So perhaps people would write something like this. However, if there were two separate projects, then developers would also have the freedom to use a build system like 'make', or something more sophisticated and unforeseen, with the generic compiler interface as well.
By decoupling the standard compiler interface and the build system, one would be able to make improvements independently to each one. Design and development would be faster, users would have more choices, it would be easier to experiment with new versions of each project.
So, back to my question. Why couple the two?
Frederik
participants (4)
- dons@cse.unsw.edu.au
- Frederik Eaton
- Isaac Jones
- Sven Panne