advance warning of Cabal-1.6.0.2 and package breakages

All,
I'll be releasing Cabal-1.6.0.2 soon. I've been testing with packages on
hackage for regressions. However, even with those regressions fixed,
there will be a small handful of packages that will break with the new
release. This is because they are already incorrect; it's just that this
was previously hidden.
All these errors get picked up now because the new Cabal version checks
that header files and C libs can be found at configure time. This check
is generally a great help to users. Unfortunately, in the case of these
packages, it picks up issues that did not always cause build failure
before (though they could cause failure in some configurations).
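For reference, the check looks at the fields that name C headers and
libraries in the .cabal file. A purely illustrative fragment (Foo.Bar,
foo.h and foo are made-up names, not taken from any of the packages
below):

    library
      exposed-modules: Foo.Bar
      build-depends:   base
      includes:        foo.h
      extra-libraries: foo

At configure time the new Cabal verifies that foo.h can be found and
that the C library foo can be linked against.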
So I'm sorry that these 6 packages now break. They are all genuine
pre-existing errors, though, and fixing them seems a small price to pay
for a better user experience for the other 1000 packages on hackage.
I'm cc'ing the maintainers of the 6 packages. There is no need to
release immediately; in fact, you may like to wait for the release and
test against it yourself to confirm whatever fixes you make.
These are the packages and their errors:
digest-0.0.0.2:
    This lists zutil.u as an include file. This is just a spelling
    error. It should be zutil.h of course. This does break with
    ghc-6.8 using -fvia-C. The behaviour of ghc-6.10 masks this
    error.
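Assuming the header is declared via the .cabal includes field (an
illustrative guess; the package's actual stanza may differ), the fix
is just the one-word spelling change:

    includes: zutil.h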
hopenssl-1.0:
    This lists

On Tue, 2009-02-17 at 00:24 +0000, Duncan Coutts wrote:
These are the packages and their errors:
Oh, one more:

cedict-0.2.5:
    The tarball is missing c/data.h. It happens not to fail with
    ghc-6.10 but will almost certainly fail with 6.8. sdist also fails.

Duncan
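A hedged sketch of the usual fix, assuming c/data.h really does live in
the source tree: list the header so that sdist packs it into the
tarball, e.g.

    extra-source-files: c/data.h

in the .cabal file, alongside whatever include-dirs/includes settings
the library stanza already has.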

On Tue, 2009-02-17 at 00:24 +0000, Duncan Coutts wrote:
I should like to point out that the fact that Cabal now does the checks for C header files and libs means we can generally do a better job now, if the packages themselves do not try to do anything clever. We get more consistent error messages, but we also take into account extra flags passed in by the user, which a custom check may omit (which means the user may not be able to install at all if they have the C libs in a non-standard location).
For example:
jack-0.5: Searching for jack/jack.h...setup: user error (ERROR: jack/jack.h not found)
And it ignores the --extra-include-dirs flag.
If you maintain an FFI binding package, take a look to see if it has redundant checks in the ./configure or Setup.hs for C headers and libs.
While we're on the topic of good practice in configure scripts: try to avoid, if possible, grabbing random environment variables such as $CPPFLAGS or $LDFLAGS and putting them into a .buildinfo file. The current way for users to specify non-standard install locations for C libs is via --extra-include-dirs and --extra-lib-dirs.
If you want to argue that env vars are the better user interface then make the case, and if people agree then we can do it in Cabal itself. Having it different in each package just confuses users.
The other problem with putting the env vars into .buildinfo files is that the cc-options and ld-options don't just get used to build your package; they get put into the package registration info and used when building every dependent package.
More advice:
* Don't make configure set buildable: False. Just fail.
* Do declare the key header files your package needs in the .cabal file rather than just in .hs or .hsc files. That way Cabal can check for them, and tools that convert to native distro packages will notice the dependencies on foreign packages.
Looking at the failing packages on hackage at the moment, we've got about 100 failing at build time and about 50 failing at configure time. A bunch more fail because they depend on other things that failed. The biggest cause of build failure seems to be incorrectly specified dependencies. That is, they probably worked against one version and now fail to compile against some different version. Quite often that is the base package, but it's not the majority.
Duncan
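As a concrete illustration of the flags mentioned above, a user who has, say, JACK installed under a non-standard prefix would configure with something like the following (the /opt/jack paths are hypothetical):

    runhaskell Setup.hs configure --extra-include-dirs=/opt/jack/include --extra-lib-dirs=/opt/jack/lib

A custom ./configure or Setup.hs check that ignores these flags rejects exactly this kind of setup, even though Cabal's own check would have handled it.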

Thanks for the heads-up, Duncan. For the BLAS package, the situation is slightly complicated, and I will still be sticking with autoconf for now. The reasons are twofold:
1. The functions I am binding to are written in Fortran, and have different calling conventions on different architectures. On some architectures, the equivalent C function names have a trailing underscore (e.g. "dgemm_"), and on some architectures they don't (e.g. "dgemm").
2. There is no standard name for a BLAS shared library. The autoconf macro I use checks in libraries named atlas, blas, mkl, cxml, dxml, sunperf, essl, and a few others.
For me to switch to get rid of autoconf, I would need Cabal to provide the following:
1. A way to find out the C calling convention for Fortran functions.
2. A way to check if a specific function exists in a library.
3. An option to try a different library name if a check for a library fails.
For reference, here is the autoconf macro I am using: http://github.com/patperry/blas/blob/master/m4/ax_blas.m4
Patrick
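To make point 1 concrete, here is a minimal sketch of the kind of binding affected, assuming CPP is enabled and that some configure-time step defines a hypothetical BLAS_UNDERSCORE flag on platforms where Fortran symbols carry the trailing underscore (ddot is used purely as an illustration; this is not code from the blas package):

    {-# LANGUAGE CPP, ForeignFunctionInterface #-}
    module BlasDotSketch (ddot) where

    import Foreign.C.Types (CDouble, CInt)
    import Foreign.Ptr (Ptr)

    -- Fortran passes every argument by reference, so even the scalar
    -- arguments are pointers. Only the C-level symbol name differs
    -- between platforms.
    #ifdef BLAS_UNDERSCORE
    foreign import ccall unsafe "ddot_"
      ddot :: Ptr CInt -> Ptr CDouble -> Ptr CInt -> Ptr CDouble -> Ptr CInt -> IO CDouble
    #else
    foreign import ccall unsafe "ddot"
      ddot :: Ptr CInt -> Ptr CDouble -> Ptr CInt -> Ptr CDouble -> Ptr CInt -> IO CDouble
    #endif

Plain Cabal currently has no portable way to decide which branch to take, which is what request 1 above is asking for.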

On Wed, 2009-02-18 at 08:54 -0800, Patrick Perry wrote:
Thanks for the heads-up, Duncan. For the BLAS package, the situation is slightly complicated, and I will still be sticking with autoconf for now. The reasons are twofold:
Sorry, I was not actually suggesting that your blas package can drop its configure script yet. The only suggestion that applied to your packages (blas and gsl-random) was:
More advice:
* Don't make configure set buildable: False. Just fail.
I should have made that clearer when I added you to the cc list. The reason behind this advice is that it pinpoints the location and cause of failure more precisely. If the configure script discovers the prerequisites are not met it should print a helpful error and fail there and then.
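For what it's worth, here is a minimal sketch of that pattern using only the standard libraries; the header path and wording are hypothetical, and a real check would search the configured include directories rather than one fixed location:

    import Control.Monad (unless)
    import System.Directory (doesFileExist)
    import System.Exit (exitFailure)
    import System.IO (hPutStrLn, stderr)

    -- If a prerequisite is missing, print a clear error and stop,
    -- rather than quietly writing buildable: False into a generated
    -- .buildinfo file.
    requireFile :: String -> FilePath -> IO ()
    requireFile what path = do
      ok <- doesFileExist path
      unless ok $ do
        hPutStrLn stderr ("ERROR: " ++ what ++ " not found at " ++ path)
        hPutStrLn stderr "Install the corresponding C library and headers, then reconfigure."
        exitFailure

    main :: IO ()
    main = requireFile "gsl/gsl_rng.h" "/usr/include/gsl/gsl_rng.h"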
For me to switch to get rid of autoconf, I would need Cabal to provide the following:
1. A way to find out the C calling convention for Fortran functions.
2. A way to check if a specific function exists in a library.
3. An option to try a different library name if a check for a library fails.
I think we'll be able to do 3 automatically in the future. For 2, we should probably supply a library function that Setup.hs scripts can use. As for 1, that probably just needs a custom test in the Setup.hs. Don't worry about it for now, though.
Duncan