haddock-2.3.0 literate comments discarded from .lhs input

I have this test case for Haddock (2.3.0):

--------------------------------------------------
| Module      : Test.Haddock
  Copyright   : (c) 2009 Alistair Bayley
  License     : BSD-style
  Maintainer  : alistair@abayley.org
  Stability   : stable
  Portability : portable

  Test case for Haddock.

module Test.Haddock
  ( -- $named_block
    Fail(..)
  ) where

data Fail = Fail | Succeed

$named_block

This is some haddock documentation.
--------------------------------------------------

This fails with:

[1 of 1] Compiling Test.Fail ( Test\Fail.hs, Test\Fail.o )

Test\Fail.hs:11:26:
    Can't make a derived instance of `Typeable Fail'
    (You need -XDeriveDataTypeable to derive an instance for this class)
    In the data type declaration for `Fail'

If I manually unlit, then Haddock is happy:

--------------------------------------------------
-- |
-- Module      : Test.Haddock2
-- Copyright   : (c) 2009 Alistair Bayley
-- License     : BSD-style
-- Maintainer  : alistair@abayley.org
-- Stability   : stable
-- Portability : portable
--
-- Test case for Haddock.

module Test.Haddock2
  ( -- $named_block
    Fail(..)
  ) where

data Fail = Fail | Succeed

-- $named_block
--
-- This is some haddock documentation.
--------------------------------------------------

so it looks as though it's discarding the literate comments. Is this intended? I was under the impression that because it uses the ghc parser, it could now properly handle .lhs input.

On a related note, we have a nice unlitter in Cabal that would preserve these comments before invoking haddock. Is this still used with haddock2, or does it now assume haddock2 will do the right thing?

Alistair

2009/2/6 Alistair Bayley
I have this test case for Haddock (2.3.0):
--------------------------------------------------
| Module      : Test.Haddock
  Copyright   : (c) 2009 Alistair Bayley
  License     : BSD-style
  Maintainer  : alistair@abayley.org
  Stability   : stable
  Portability : portable
Test case for Haddock.
module Test.Haddock
  ( -- $named_block
    Fail(..)
  ) where

data Fail = Fail | Succeed
$named_block
This is some haddock documentation.
--------------------------------------------------
This fails with:
[1 of 1] Compiling Test.Fail ( Test\Fail.hs, Test\Fail.o )
Test\Fail.hs:11:26:
    Can't make a derived instance of `Typeable Fail'
    (You need -XDeriveDataTypeable to derive an instance for this class)
    In the data type declaration for `Fail'
Are you processing the above module but it is called Test.Fail in reality? Have you stripped out a deriving statement from the example above? I'm very confused by this message :)

David

[1 of 1] Compiling Test.Fail ( Test\Fail.hs, Test\Fail.o )
Test\Fail.hs:11:26:
    Can't make a derived instance of `Typeable Fail'
    (You need -XDeriveDataTypeable to derive an instance for this class)
    In the data type declaration for `Fail'
Are you processing the above module but it is called Test.Fail in reality? Have you stripped out a deriving statement from the example above? I'm very confused by this message :)
Sorry, my mistake. I pasted the error message from a different problem. This is the error I get from haddock:

C:\bayleya\eclipse\workspace\takusen\src>haddock -h --odir=doc Test/Haddock.lhs
Cannot find documentation for: $named_block

Alistair

2009/2/6 Alistair Bayley
[1 of 1] Compiling Test.Fail ( Test\Fail.hs, Test\Fail.o )
Test\Fail.hs:11:26:
    Can't make a derived instance of `Typeable Fail'
    (You need -XDeriveDataTypeable to derive an instance for this class)
    In the data type declaration for `Fail'
Are you processing the above module but it is called Test.Fail in reality? Have you stripped out a deriving statement from the example above? I'm very confused by this message :)
Sorry, my mistake. I pasted the error message from a different problem. This is the error I get from haddock:
C:\bayleya\eclipse\workspace\takusen\src>haddock -h --odir=doc Test/Haddock.lhs
Cannot find documentation for: $named_block
Okay, then I understand. My guess is (without looking at ghc code) that ghc just throws the literate comments away before lexing the file. This means that the Haddock comments won't be recognized.

As you say, there is also an unlitter in Cabal. I don't remember if it is invoked when using Haddock 2, but if it is, it would solve this problem.

David
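For comparison with the Cabal behaviour mentioned here, the comment-preserving kind of unlitting can be sketched in a few lines. This is a minimal sketch assuming bird-track style only; `unlitPreserve` is an illustrative name, not Cabal's actual API. The point is that literate prose becomes `--` comments rather than blank lines, so Haddock still sees it:

```haskell
-- A sketch of a comment-preserving bird-track unlitter. GHC's own unlit
-- blanks out non-code lines before lexing, which is why Haddock comments
-- written as literate prose disappear; keeping prose as "--" comments
-- preserves them. (unlitPreserve is a hypothetical name.)
unlitLine :: String -> String
unlitLine ('>' : ' ' : code) = code            -- bird-track code line
unlitLine ('>' : code)       = code            -- ">" with no space after it
unlitLine ""                 = ""              -- keep blank lines blank
unlitLine prose              = "-- " ++ prose  -- prose survives as a comment

unlitPreserve :: String -> String
unlitPreserve = unlines . map unlitLine . lines
```

Running this over the test case above would turn the `| Module ...` prose line into `-- | Module ...`, which is exactly what the manual unlit in the first message produced by hand.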

On Fri, 2009-02-06 at 11:48 +0100, David Waern wrote:
2009/2/6 Alistair Bayley
[1 of 1] Compiling Test.Fail ( Test\Fail.hs, Test\Fail.o )

Test\Fail.hs:11:26:
    Can't make a derived instance of `Typeable Fail'
    (You need -XDeriveDataTypeable to derive an instance for this class)
    In the data type declaration for `Fail'
Are you processing the above module but it is called Test.Fail in reality? Have you stripped out a deriving statement from the example above? I'm very confused by this message :)
Sorry, my mistake. I pasted the error message from a different problem. This is the error I get from haddock:
C:\bayleya\eclipse\workspace\takusen\src>haddock -h --odir=doc Test/Haddock.lhs
Cannot find documentation for: $named_block
Okay, then I understand.
My guess is (without looking at ghc code) that ghc just throws the literate comments away before lexing the file. This means that the Haddock comments won't be recognized.
As you say, there is also an unlitter in Cabal. I don't remember if it is invoked when using Haddock 2, but if it is, it would solve this problem.
Yes, against my better judgement the code in Cabal for haddock-2.x does not run cpp or unliting like it does for haddock-0.x. Instead it assumes that haddock-2.x will do all the cpp and unliting itself. Obviously this means the special unliting mode that Cabal provides is not usable with haddock-2.x.

The solution is to do the pre-processing the same for haddock-0.x and 2.x. Generally the haddock code in Cabal is a horrible inconsistent mess. I believe Andrea Vezzosi has been looking at rewriting it, which is good news.

Duncan

2009/2/6 Duncan Coutts
Yes, against my better judgement the code in Cabal for haddock-2.x does not run cpp or unliting like it does for haddock-0.x. Instead it assumes that haddock-2.x will do all the cpp and unliting itself. Obviously this means the special unliting mode that Cabal provides is not usable with haddock-2.x.
The solution is to do the pre-processing the same for haddock-0.x and 2.x. Generally the haddock code in Cabal is a horrible inconsistent mess. I believe Andrea Vezzosi has been looking at rewriting it, which is good news.
In Distribution.Simple.Haddock, in the haddock function we have:

    withLib pkg_descr () $ \lib -> do
        let bi = libBuildInfo lib
            modules = PD.exposedModules lib ++ otherModules bi
        inFiles <- getLibSourceFiles lbi lib
        unless isVersion2 $ mockAll bi inFiles

So I guess the easiest thing to do right now is remove the "unless isVersion2 $". I'm testing this at the moment, so when I get it working (or not) I'll let you know, and maybe send a patch.

Alistair

I did work on this and I simplified the code a lot, fixing inconsistencies and making more explicit how each component contributes to the arguments to haddock.
Aside from this, should we also do the unliting and cpp from Cabal on
the sources passed to HsColour?
On Fri, Feb 6, 2009 at 11:27 PM, Duncan Coutts
On Fri, 2009-02-06 at 11:48 +0100, David Waern wrote:
2009/2/6 Alistair Bayley
[1 of 1] Compiling Test.Fail ( Test\Fail.hs, Test\Fail.o )

Test\Fail.hs:11:26:
    Can't make a derived instance of `Typeable Fail'
    (You need -XDeriveDataTypeable to derive an instance for this class)
    In the data type declaration for `Fail'
Are you processing the above module but it is called Test.Fail in reality? Have you stripped out a deriving statement from the example above? I'm very confused by this message :)
Sorry, my mistake. I pasted the error message from a different problem. This is the error I get from haddock:
C:\bayleya\eclipse\workspace\takusen\src>haddock -h --odir=doc Test/Haddock.lhs
Cannot find documentation for: $named_block
Okay, then I understand.
My guess is (without looking at ghc code) that ghc just throws the literate comments away before lexing the file. This means that the Haddock comments won't be recognized.
As you say, there is also an unlitter in Cabal. I don't remember if it is invoked when using Haddock 2, but if it is, it would solve this problem.
Yes, against my better judgement the code in Cabal for haddock-2.x does not run cpp or unliting like it does for haddock-0.x. Instead it assumes that haddock-2.x will do all the cpp and unliting itself. Obviously this means the special unliting mode that Cabal provides is not usable with haddock-2.x.
The solution is to do the pre-processing the same for haddock-0.x and 2.x. Generally the haddock code in Cabal is a horrible inconsistent mess. I believe Andrea Vezzosi has been looking at rewriting it, which is good news.
Duncan

On Sun, 2009-02-08 at 19:18 +0100, Andrea Vezzosi wrote:
I did work on this and I simplified the code a lot, fixing inconsistencies and making more explicit how each component contributes to the arguments to haddock.
Much appreciated.
Aside from this, should we also do the unliting and cpp from Cabal on the sources passed to HsColour?
Hmm. I thought it did already :-) Well, I know it runs happy, hsc2hs etc. Someone was complaining the other day that the hscolour output run on the result of happy is not really readable, but then it's not clear if running it on the happy input would be any better.

For the particular case of .lhs and cpp, I hope we'd get better hscolour output by not running unlit or cpp first. Malcolm says it'll at least do something. So it seems worth checking which ends up looking more useful.

Duncan
On Fri, Feb 6, 2009 at 11:27 PM, Duncan Coutts
Yes, against my better judgement the code in Cabal for haddock-2.x does not run cpp or unliting like it does for haddock-0.x. Instead it assumes that haddock-2.x will do all the cpp and unliting itself. Obviously this mean the special unliting mode that Cabal provides is not usable with haddock-2.x.
The solution is to do the pre-processing the same for haddock-0.x and 2.x. Generally the haddock code in Cabal is a horrible inconsistent mess. I believe Andrea Vezzosi has been looking at rewriting it, which is good news.

Duncan Coutts
Someone was complaining the other day that the hscolour output run on the result of happy is not really readable,
To clarify, what he said was that hscolouring Happy output did not _enhance_ its readability. In other words, you can put lipstick on a pig, but it's still a pig.
but then it's not clear if running it on the happy input would be any better.
Try it! I reckon it looks pretty good actually. Lexically, the difference between Happy and H'98 sources is negligible.
For the particular case of .lhs and cpp, I hope we'd get better hscolour output by not running unlit or cpp first. Malcolm says it'll at least do something. So it seems worth checking which ends up looking more useful.
It seems likely that preserving the literate comments is the sensible thing to do, since we are linking together documentation here (haddock/source). HsColour has -lit and -lit-tex options, to avoid colouring the literate comments from a .lhs.

Regards,
Malcolm

On Mon, 2009-02-09 at 15:36 +0000, Malcolm Wallace wrote:
Duncan Coutts wrote:

Someone was complaining the other day that the hscolour output run on the result of happy is not really readable,
To clarify, what he said was that hscolouring Happy output did not _enhance_ its readability. In other words, you can put lipstick on a pig, but it's still a pig.
Yep :-)
but then it's not clear if running it on the happy input would be any better.
Try it! I reckon it looks pretty good actually. Lexically, the difference between Happy and H'98 sources is negligible.
Aye. The difficulty is we have to have a list of which pre-processors to run or not run before using hscolour. Or perhaps we just hope that all original sources are ok going through hscolour, even if they're not Haskell.
For the particular case of .lhs and cpp, I hope we'd get better hscolour output by not running unlit or cpp first. Malcolm says it'll at least do something. So it seems worth checking which ends up looking more useful.
It seems likely that preserving the literate comments is the sensible thing to do, since we are linking together documentation here (haddock/source). HsColour has -lit and -lit-tex options, to avoid colouring the literate comments from a .lhs.
I'm not sure what we can do here. We don't know if the file is bird track or tex style. Can we get away with always using one option?

Duncan
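One way around not knowing the style would be to guess it from the file contents. A rough sketch, under the assumption that any file opening a \begin{code} block is TeX-style and everything else is bird-track; `detectLitStyle` and `hscolourLitFlag` are hypothetical helpers, not real Cabal or HsColour code:

```haskell
import Data.List (isPrefixOf)

-- Guess the literate style of a .lhs source: TeX-style files use
-- \begin{code} blocks, bird-track files prefix code lines with "> ".
-- (Hypothetical helper; a real file could in principle mix both.)
data LitStyle = Bird | Tex deriving (Eq, Show)

detectLitStyle :: String -> LitStyle
detectLitStyle src
  | any ("\\begin{code}" `isPrefixOf`) (lines src) = Tex
  | otherwise                                      = Bird

-- Map the guess to the corresponding HsColour flag.
hscolourLitFlag :: LitStyle -> String
hscolourLitFlag Bird = "-lit"
hscolourLitFlag Tex  = "-lit-tex"
```

A heuristic like this would remove the need for the user-visible option discussed below, at the cost of guessing wrong on unusual files.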

Aye. The difficulty is we have to have a list of which pre-processors to run or not run before using hscolour. Or perhaps we just hope that all original sources are ok going through hscolour, even if they're not Haskell.
If the pre-processor is going to turn it into Haskell, then I think it is a fair bet that the majority of the source code will be lexically compatible with Haskell, no matter which pp we are talking about. And in general, the original source will be more readable than the source after processing.
HsColour has -lit and -lit-tex options, to avoid colouring the literate comments from a .lhs.
I'm not sure what we can do here. We don't know if the file is bird track or tex style. Can we get away with always using one option?
I'm guessing that a particular library author is going to stick with the same style throughout a project, so they could choose to use an option in the .cabal file:

    hscolour-literate-option: -lit-tex

The semantics would be that Cabal adds the "hscolour-literate-option" to the HsColour commandline, only for {.lhs, .ly, .lx} files. If the hscolour-literate-option is not specified, then it defaults to "-lit". (The opposite default would of course be possible - I don't know which style is the majority preference.)

Regards,
Malcolm
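In .cabal-file terms the suggestion might look like this. Note that this field is only a proposal from this thread, not an actual part of the Cabal spec:

```cabal
name:    my-package
version: 0.1
-- proposed field from this thread, not an accepted Cabal field;
-- it would apply only to {.lhs, .ly, .lx} files and default to -lit
hscolour-literate-option: -lit-tex
```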

2009/2/8 Andrea Vezzosi
I did work on this and i simplified the code a lot fixing inconsistencies and making more explicit what how each component contributes to the arguments to haddock.
Is this in HEAD? If so then I have some changes. Attached is a patch.

Alistair

2009/2/12 Alistair Bayley
2009/2/8 Andrea Vezzosi
I did work on this and I simplified the code a lot, fixing inconsistencies and making more explicit how each component contributes to the arguments to haddock.
Is this in HEAD? If so then I have some changes. Attached is a patch.
No comments. Is this patch OK? Are you likely to apply/fix this for the next Cabal release? Would you like me to do some more work on it? etc etc

Alistair

On Thu, 2009-02-19 at 13:19 +0000, Alistair Bayley wrote:
2009/2/12 Alistair Bayley
2009/2/8 Andrea Vezzosi
I did work on this and I simplified the code a lot, fixing inconsistencies and making more explicit how each component contributes to the arguments to haddock.
Is this in HEAD? If so then I have some changes. Attached is a patch.
No comments.
Ah, I was hoping you and Saizan could work this out together.
Is this patch OK? Are you likely to apply/fix this for the next Cabal release? Would you like me to do some more work on it? etc etc
It will not be in 1.6.0.2 because that is very nearly released already. I'm hoping that it can go into the HEAD version, but I'd like some comment from Saizan on whether this would get in the way of his changes, or if perhaps he's already integrated this in his changes.

Duncan

Sorry for the delay; the patch "rewrite of Distribution.Simple.Haddock" includes these changes among others. I'm not really satisfied by the readability of some parts of the code, though, but it should be a considerable improvement anyhow. It probably won't merge into the 1.6 branch as-is because of changes in Distribution.Simple.Utils.
On Thu, Feb 19, 2009 at 3:12 PM, Duncan Coutts
On Thu, 2009-02-19 at 13:19 +0000, Alistair Bayley wrote:
2009/2/12 Alistair Bayley
2009/2/8 Andrea Vezzosi
I did work on this and I simplified the code a lot, fixing inconsistencies and making more explicit how each component contributes to the arguments to haddock.
Is this in HEAD? If so then I have some changes. Attached is a patch.
No comments.
Ah, I was hoping you and Saizan could work this out together.
Is this patch OK? Are you likely to apply/fix this for the next Cabal release? Would you like me to do some more work on it? etc etc
It will not be in 1.6.0.2 because that is very nearly released already.
I'm hoping that it can go into the HEAD version, but I'd like some comment from Saizan on whether this would get in the way of his changes, or if perhaps he's already integrated this in his changes.
Duncan

On Thu, 2009-02-19 at 16:53 +0100, Andrea Vezzosi wrote:
Sorry for the delay; the patch "rewrite of Distribution.Simple.Haddock" includes these changes among others. I'm not really satisfied by the readability of some parts of the code, though, but it should be a considerable improvement anyhow. It probably won't merge into the 1.6 branch as-is because of changes in Distribution.Simple.Utils.
Thanks very much for working on this. I'll try and review it soon.

I'm happy for it to stay on the HEAD branch and not go to the 1.6 branch. If Alistair thinks it's important to get a minor change on the 1.6 branch then I'll also consider that. It's a bit difficult though; changing how we pre-process stuff for haddock is not really a minor change. It makes me a bit nervous changing that stuff on the stable 1.6 branch.

Duncan

On Thu, 2009-02-12 at 09:21 +0000, Alistair Bayley wrote:
2009/2/8 Andrea Vezzosi
I did work on this and I simplified the code a lot, fixing inconsistencies and making more explicit how each component contributes to the arguments to haddock.
Is this in HEAD? If so then I have some changes. Attached is a patch.
Alistair,

Ok, all of Andrea's haddock changes are in Cabal HEAD now, so if you'd like to look at the .lhs situation now that'd be great.

Duncan

Ok, all of Andrea's haddock changes are in Cabal HEAD now, so if you'd like to look at the .lhs situation now that'd be great.
I've finally tested this. It looks like the Distribution.Simple.Haddock module does what I want; now I just have to deal with the haddock problems.

BTW, would it be a good idea to add (yet) another cabal flag that retains temp files/directories? This would make it a lot easier to diagnose issues with preprocessed source without having to modify and rebuild cabal.

The 2 haddock problems I have now are:

(1) When cabal calls haddock it passes:

--read-interface=$httptopdir/doc/libraries/base,c:\ghc\ghc-6.10.1/doc/libraries/base\base.haddock

This results in links in the html docs like this (e.g. for Either):

file:///C:/bayleya/eclipse/workspace/takusen/src/dist/doc/html/Takusen/$httptopdir/doc/libraries/base/Data-Either.html#t%3AEither

i.e. the $httptopdir isn't being expanded, or whatever is meant to happen to it. Is this a haddock issue, or a cabal issue?

(2) Also, there seems to be a bug in Haddock where preformatted text in the HTML output has an extra CR inserted before the CRLF. Presumably this only affects Windows users. I assume I should raise this with Gwern.

Alistair

2009/3/18 Alistair Bayley
Ok, all of Andrea's haddock changes are in Cabal HEAD now, so if you'd like to look at the .lhs situation now that'd be great.
I've finally tested this. It looks like Distribution.Simple.Haddock module does what I want; now I just have to deal with the haddock problems.
BTW, would it be a good idea to add (yet) another cabal flag that retains temp files/directories? This would make it a lot easier to diagnose issues with preprocessed source without having to modify and rebuild cabal.
The 2 haddock problems I have now are:
(1) When cabal calls haddock it passes:
--read-interface=$httptopdir/doc/libraries/base,c:\ghc\ghc-6.10.1/doc/libraries/base\base.haddock
This results in links in the html docs like this (e.g. for Either): file:///C:/bayleya/eclipse/workspace/takusen/src/dist/doc/html/Takusen/$httptopdir/doc/libraries/base/Data-Either.html#t%3AEither
i.e. the $httptopdir isn't being expanded, or whatever is meant to happen to it.
Is this a haddock issue, or a cabal issue?
Haddock doesn't expand this variable. I remember seeing code to expand it in Cabal, though. Maybe it's just not used for Haddock 2?

Could be something we want to support in Haddock in the future, though. For example, to allow a --package flag, to avoid having to specify files directly using --read-interface. Then Haddock needs to be able to expand this variable.
(2) Also, there seems to be a bug in Haddock where preformatted text in the HTML output has extra CR inserted before the CRLF. Presumably this only affects Windows users. I assume I should raise this with Gwern.
This is fixed in the soon-to-be-released GHC 6.10.2.

David

2009/3/18 David Waern
Could be something we want to support in Haddock in the future, though. For example to allow a --package flag, to avoid having to specify files directly using --read-interface. Then Haddock needs to be able to expand this variable.
Here, I just mean that since the expanding code might be needed in the future, we could expand the parameter to --read-interface too.

David

On Wed, 2009-03-18 at 23:24 +0100, David Waern wrote:
Haddock doesn't expand this variable. I remember seeing code to expand it in Cabal, though. Maybe it's just not used for Haddock 2?
Could be something we want to support in Haddock in the future, though. For example to allow a --package flag, to avoid having to specify files directly using --read-interface. Then Haddock needs to be able to expand this variable.
It certainly used to have a -package flag; Cabal used to use it. However we decided it was better to switch to passing the files directly because it gave us more control. For example we can check if the files really exist, or we could, if we wanted to, work with docs for packages that are not yet registered.

Duncan

On Wed, Mar 18, 2009 at 10:54 PM, Alistair Bayley
Ok, all of Andrea's haddock changes are in Cabal HEAD now, so if you'd like to look at the .lhs situation now that'd be great.
I've finally tested this. It looks like Distribution.Simple.Haddock module does what I want; now I just have to deal with the haddock problems.
BTW, would it be a good idea to add (yet) another cabal flag that retains temp files/directories? This would make it a lot easier to diagnose issues with preprocessed source without having to modify and rebuild cabal.

To be honest there isn't a compelling reason to delete the intermediate files/directories; we could just use a fixed location under dist/ to store them. It can only lead to bugs if we don't overwrite some relevant files with up-to-date versions the next time the user calls cabal haddock, but that's already the case for dist/build.
The 2 haddock problems I have now are:
(1) When cabal calls haddock it passes:
--read-interface=$httptopdir/doc/libraries/base,c:\ghc\ghc-6.10.1/doc/libraries/base\base.haddock
This results in links in the html docs like this (e.g. for Either): file:///C:/bayleya/eclipse/workspace/takusen/src/dist/doc/html/Takusen/$httptopdir/doc/libraries/base/Data-Either.html#t%3AEither
i.e. the $httptopdir isn't being expanded, or whatever is meant to happen to it.
Is this a haddock issue, or a cabal issue?

Those variables are supposed to be expanded by Cabal, and here on linux I get:

--read-interface=/usr/local/share/doc/ghc/libraries/base,/usr/local/share/doc/ghc/libraries/base/base.haddock

I'm going to try on windows to see if that's the problem.
(2) Also, there seems to be a bug in Haddock where preformatted text in the HTML output has extra CR inserted before the CRLF. Presumably this only affects Windows users. I assume I should raise this with Gwern.
Alistair

i.e. the $httptopdir isn't being expanded, or whatever is meant to happen to it.
Those variables are supposed to be expanded by Cabal, and here on linux I get:

--read-interface=/usr/local/share/doc/ghc/libraries/base,/usr/local/share/doc/ghc/libraries/base/base.haddock
Really? I searched the cabal source for httptopdir and found nothing, so I was wondering where it was meant to happen (of course it could be that my search was incorrectly specified). Do you know which module is meant to expand this? Distribution.Simple.InstallDirs.defaultInstallDirs looked a likely candidate, but no sign of $httptopdir there.

Alistair

Yeah, sorry, I was commenting on that general kind of variable, not on this one in particular; I don't really know which code it comes from.
I can see it in "ghc-pkg describe $pkg" where $pkg is a core package
but not in packages installed with cabal-install, so there's probably
something wrong with the ghc build system or installer on windows.
On Thu, Mar 19, 2009 at 12:01 PM, Alistair Bayley
i.e. the $httptopdir isn't being expanded, or whatever is meant to happen to it.
Those variables are supposed to be expanded by Cabal, and here on linux i get: --read-interface=/usr/local/share/doc/ghc/libraries/base,/usr/local/share/doc/ghc/libraries/base/base.haddock
Really? I searched the cabal source for httptopdir and found nothing, so I was wondering where it was meant to happen (of course it could be that my search was incorrectly specified). Do you know which module is meant to expand this? Distribution.Simple.InstallDirs.defaultInstallDirs looked a likely candidate, but no sign of $httptopdir there.
Alistair

It turns out that those variables are there to allow relocation; in fact $topdir is expanded by Distribution.Simple.GHC.getInstalledPackages, and it seems that $httptopdir has been overlooked. I'd be tempted to say that it's ghc-pkg dump/describe's responsibility to expand those vars instead, like it does for ghc-pkg field.
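The expansion involved is simple string substitution. A sketch of what handling both variables could look like; `expandDirVars` is a hypothetical helper shown only to illustrate the idea, not the code in Distribution.Simple.GHC:

```haskell
import Data.List (isPrefixOf)

-- Substitute ghc's relocation variables in a path taken from
-- "ghc-pkg describe" output: $topdir is ghc's library directory,
-- $httptopdir the root of its HTML documentation tree.
-- (Hypothetical helper, not the actual Cabal implementation.)
expandDirVars :: FilePath -> FilePath -> String -> String
expandDirVars topdir httptopdir = go
  where
    go s | "$httptopdir" `isPrefixOf` s = httptopdir ++ go (drop 11 s)
         | "$topdir"     `isPrefixOf` s = topdir     ++ go (drop 7 s)
    go (c:cs) = c : go cs
    go []     = []
```

With a helper like this, the `--read-interface` argument built from the `$httptopdir/doc/libraries/base` path above would come out as a concrete directory instead of leaking the variable into the generated links.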

Andrea,
2009/3/19 Andrea Vezzosi
It turns out that those variables are there to allow relocation, in fact $topdir is expanded by Distribution.Simple.GHC.getInstalledPackages, it seems that $httptopdir has been overlooked. I'd be tempted to say that it's ghc-pkg dump/describe responsibility to expand those vars instead, like it does for ghc-pkg field.
Do you (or anyone else) intend to work on this? If not, I'd like to fix it, but I'll need some guidance. Like, is Distribution.Simple.GHC.getInstalledPackages where the variable expansion code should go, or should it be somewhere else?

Alistair

On Wed, 2009-05-27 at 15:10 +0100, Alistair Bayley wrote:
Andrea,
2009/3/19 Andrea Vezzosi
It turns out that those variables are there to allow relocation, in fact $topdir is expanded by Distribution.Simple.GHC.getInstalledPackages, it seems that $httptopdir has been overlooked. I'd be tempted to say that it's ghc-pkg dump/describe responsibility to expand those vars instead, like it does for ghc-pkg field.
Do you (or anyone else) intend to work on this? If not, I'd like to fix it, but I'll need some guidance. Like, is Distribution.Simple.GHC.getInstalledPackages where the variable expansion code should go, or should it be somewhere else?
I don't think we should be hacking around this in Cabal without any discussion with the ghc folks on what is supposed to be there and what variables are allowed. We need a clear spec on what variables tools are expected to handle and how they are to be interpreted. The output of ghc-pkg describe/dump is not just for ghc to define and play around with. It's supposed to be defined by the Cabal spec.

Supporting relocatable sets of packages is a good idea. We should aim to have something that is usable by each compiler, not just ghc, so interpreting paths relative to ghc's libdir doesn't seem ideal.

How about this: a way to specify paths in the package registration info that are relative to the location of the package db they are in. That makes sense beyond just ghc and would even allow other sets of relocatable packages, not just those installed with ghc.

Then perhaps as a compat hack we should get Cabal to handle older ghc versions that do use these funny vars.

Duncan
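Duncan's suggestion amounts to resolving registration-info paths against the directory holding the package db itself. A possible sketch; `resolveAgainstDb` is a hypothetical helper, not part of Cabal or the spec:

```haskell
import System.FilePath ((</>), isRelative, takeDirectory)

-- Interpret a path from the package registration info relative to the
-- directory containing the package db file, so a db-plus-docs tree can
-- be relocated as a unit. Absolute paths pass through unchanged.
-- (Hypothetical helper illustrating the proposal above.)
resolveAgainstDb :: FilePath -> FilePath -> FilePath
resolveAgainstDb pkgDbFile path
  | isRelative path = takeDirectory pkgDbFile </> path
  | otherwise       = path
```

This sidesteps variables like $topdir entirely: no tool has to know ghc-specific expansion rules, only where the package db it is reading lives.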
participants (5)
- Alistair Bayley
- Andrea Vezzosi
- David Waern
- Duncan Coutts
- Malcolm Wallace