
Another issue is that it turns out to be hard to enable certain extensions without enabling others, because doing so requires passing more state around the compiler, and in particular through the parser: each independently switchable extension multiplies the number of distinct languages being parsed. That's why we lump all the extensions together in GHC. However, I agree the situation isn't ideal; there's a sketch of the parser problem below.
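To make the state-threading point concrete, here is a minimal, hypothetical sketch (the flags record, field names, and functions are all invented; this is not GHC's actual parser). Each independently toggleable extension becomes a flag the parser has to consult, so every combination of flags is in effect a different language:

    import Data.List (isPrefixOf)

    -- Invented flags record: one field per extension that changes
    -- the grammar.
    data ExtFlags = ExtFlags
      { extFFI  :: Bool   -- accept "foreign" declarations?
      , extMPTC :: Bool   -- accept multi-parameter class heads?
      }

    -- Every syntactic decision an extension affects now takes the
    -- flags as an argument, so they must be threaded through the
    -- whole parser.
    classifyDecl :: ExtFlags -> String -> Either String String
    classifyDecl flags src
      | "foreign " `isPrefixOf` src =
          if extFFI flags
            then Right "foreign declaration"
            else Left "foreign declarations not enabled"
      | otherwise = Right "ordinary declaration"

    main :: IO ()
    main = do
      let haskell98 = ExtFlags { extFFI = False, extMPTC = False }
          glasgow   = ExtFlags { extFFI = True,  extMPTC = True  }
      print (classifyDecl haskell98 "foreign import ccall mySin :: Double -> Double")
      print (classifyDecl glasgow   "foreign import ccall mySin :: Double -> Double")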
So here's a concrete suggestion:
1) We standardise the definition of various extensions (multi-parameter type classes, unboxed types, etc.) so that if two compilers both claim to support feature X, then they accept the same code. (We're pretty close to this already.)
Definitely - this has been a goal for a while. I think there was a plan to have a section on www.haskell.org listing all the extensions with their specifications.
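As a concrete example of (1): a standard definition of multi-parameter type classes should mean that every compiler claiming the feature accepts a declaration like the one below. (The class is invented for illustration, and the LANGUAGE pragma is a later spelling; compilers of the time enabled such features with flags like GHC's -fglasgow-exts.)

    {-# LANGUAGE MultiParamTypeClasses #-}

    -- Any two compilers claiming multi-parameter type classes
    -- should agree on accepting this declaration.
    class Collection c e where
      empty  :: c e
      insert :: e -> c e -> c e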
2) We extend the package definition with a list of features used in the package.
Package { name = "net", extensions = [ "FFI", "ExceptionHandling", ... ], ... }
Let's be clear: these extensions are those required by the *consumer* of the package, right? The implementation of the package will likely require a different (probably super-) set of extensions. For example, "FFI" isn't typically required by the consumer of the "net" package, but it is certainly used in the implementation.

Also, this doesn't mesh well with the portability notion in the new libraries proposal. The idea there is that certain libraries would be optional, depending on whether the compiler implemented certain extensions; with your scheme this would have to be done at the package level. I wanted to lump all the core libraries into a single package on GHC, but this would mean that package "core" for GHC would require a different set of extensions than the same package for NHC.

Cheers, Simon
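To illustrate the consumer/implementation split raised above, here is a hypothetical sketch (the field names are invented and not part of the actual proposal): separating the extensions a client needs from those the implementation needs would let "net" use the FFI internally without imposing it on its consumers.

    data Package = Package
      { pkgName             :: String
      , interfaceExtensions :: [String]  -- required to *use* the package
      , implExtensions      :: [String]  -- required to *build* the package
      }

    net :: Package
    net = Package
      { pkgName             = "net"
      , interfaceExtensions = ["ExceptionHandling"]
      , implExtensions      = ["FFI", "ExceptionHandling"]
      }

    main :: IO ()
    main = putStrLn (pkgName net ++ " builds with " ++ show (implExtensions net))

A split like this might also ease the "core" packaging concern, since the implementation-side list could differ between GHC and NHC while the interface list stays fixed; whether it fits the portability notion in the libraries proposal is another question.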