
Another issue is that it turns out to be hard to enable certain extensions without enabling others, because it requires you to pass around more state in the compiler, and in particular in the parser (the number of different languages being parsed increases), which is why we lump all the extensions together in GHC. However, I agree the situation isn't ideal.
So here's a concrete suggestion:
1) We standardise the definition of the various extensions (multi-parameter type classes, unboxed types, etc.) so that if two compilers both claim to support feature X, then they accept the same code. (We're pretty close to this already.)
Definitely - this has been a goal for a while. I think there was a plan to have a section on www.haskell.org listing all the extensions with their specifications.
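As a concrete illustration of what such a specification would pin down, here is a small module using multi-parameter type classes; under a standardised definition of the extension, any compiler claiming to support it should accept exactly this code. (The LANGUAGE pragma spelling is GHC's later convention; at the time of this thread extensions were typically enabled by command-line flags, so treat the pragma as a convenience here.)

```haskell
{-# LANGUAGE MultiParamTypeClasses #-}

-- A two-parameter type class: the extension under discussion.
module Convert where

class Convert a b where
  convert :: a -> b

-- One concrete instance: widening an Int to an Integer.
instance Convert Int Integer where
  convert = fromIntegral
```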
2) We extend the package definition with a list of features used in the package.
Package { name       = "net",
          extensions = [ "FFI", "ExceptionHandling", ... ],
          ... }
Let's be clear: these extensions are those required by the *consumer* of the package, right? The implementation of the package will likely require a different (probably super-) set of extensions. For example: "FFI" isn't typically required by the consumer of the "net" package, but it is certainly used in the implementation.

Also, this doesn't mesh well with the portability notion in the new libraries proposal. The idea there is that certain libraries would be optional, depending on whether the compiler implemented certain extensions; with your scheme this would have to be done at the package level. I wanted to lump all the core libraries into a single package in GHC, but this would mean that package "core" for GHC would require a different set of extensions than the same package for NHC.

Cheers, Simon

Simon Marlow
Let's be clear: these extensions are those required by the *consumer* of the package, right?
Blink! I hadn't been thinking of it that way. For a binary package, I guess that is all it means, but it seems that a source package could/should distinguish between the flags you need to compile the code and the flags you need to compile against it.
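One way to make that distinction concrete would be to split the extensions field of the Package record sketched earlier in two (the field names here are invented for illustration, not part of any proposal):

```haskell
-- Hypothetical refinement of the Package record: separate the
-- extensions needed to *build* the package from those needed to
-- *compile against* its interface.
Package { name            = "net",
          buildExtensions = [ "FFI", "ExceptionHandling", ... ],  -- used in the implementation
          useExtensions   = [ "ExceptionHandling", ... ],         -- exposed through the interface
          ... }
```

A binary package would only need the second list; a source package would need both.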
Also, this doesn't mesh well with the portability notion in the new libraries proposal. The idea there is that certain libraries would be optional depending on whether the compiler implemented certain extensions - with your scheme this would have to be done at the package level.
Hmmm, my mental model of packages was a group of tightly coupled modules so I wasn't making this distinction.
I wanted to lump all the core libraries into a single package on GHC, but this would mean that package "core" for GHC would require a different set of extensions than the same package for NHC.
That's more because of conditional compilation than the library/package distinction, right? Hmmm, conditional compilation complicates the story for source packages. Choices:

  o one package per combination of cpp flags
  o one package with some sort of conditionals inside it:

      deps = case COMPILER of
               __GHC__  => net, lang
               __NHC__  => lang
               __HUGS__ => greencard

Neither is very appealing. Is there an existing story for this sort of thing?

-- Alastair Reid reid@cs.utah.edu http://www.cs.utah.edu/~reid/

ps I've been thinking that we should add conditional compilation to the list of extensions (but with the intention that all compilers would support it). I'm thinking all we need and want is conditional compilation, but #define's would be restricted to a config file (maybe) and macro expansion would be limited to the expression part of #if. I think this is what Marcin put in hsc2hs, so maybe he can comment?
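A sketch of what that restricted style of conditional compilation might look like in a source file (the module name and the __NHC__ symbol are illustrative assumptions; __GLASGOW_HASKELL__ is the symbol GHC actually predefines):

```haskell
{-# LANGUAGE CPP #-}
-- Sketch only: #define would live in a separate config file, and
-- #if would be limited to simple expressions over predefined symbols.
module Net.Compat (compilerName) where

#if defined(__GLASGOW_HASKELL__)
compilerName :: String
compilerName = "GHC"
#elif defined(__NHC__)
compilerName :: String
compilerName = "NHC"
#else
compilerName :: String
compilerName = "unknown"
#endif
```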
participants (2): Alastair David Reid, Simon Marlow