why is ghci trying to load hsc file ??

Howdy,

I worked out a small hdf5 binding using cabal and bindings-DSL, with sqlite3 as my example. Time to try it!

    ghci -idist/build/ dist/build/Bindings/HDF5.o -lhdf5 -lhdf5_hl hdf5_pkg_test.hs
    GHCi, version 6.12.1: http://www.haskell.org/ghc/  :? for help
    Loading package ghc-prim ... linking ... done.
    Loading package integer-gmp ... linking ... done.
    Loading package base ... linking ... done.
    Loading object (static) dist/build/Bindings/HDF5.o ... done
    Loading object (dynamic) hdf5 ... done
    Loading object (dynamic) hdf5_hl ... done
    final link ... done
    [1 of 2] Compiling Bindings.HDF5    ( dist/build/Bindings/HDF5.hs, interpreted )

    src/Bindings/HDF5.hsc:49:8: parse error on input `import'
    Failed, modules loaded: none.

Huh? Why is it trying to read HDF5.hsc? What's even more interesting is that line 49 of that file doesn't have an import on it, so something is fubar. I have no idea how this could be happening. I've included a copy of my cabal file.

BTW, I have to specify the hdf5 libraries, i.e. libhdf5 and libhdf5_hl, on the command line. It seems like the build process should have taken care of that in some way, maybe...? Certainly when I use something like sqlite3, I'm not specifying libsqlite3 on the command line.

Thanks,
Brian
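[On the linking question: the usual way to get cabal (and ghci, once the package is installed) to link the C libraries automatically is the extra-libraries field. A hypothetical sketch of the relevant .cabal fragment; the module and dependency names here are assumptions, not taken from Brian's actual cabal file:]

    -- hypothetical fragment of an hdf5-binding .cabal file; the
    -- extra-libraries field tells cabal to link libhdf5 and libhdf5_hl,
    -- so no -lhdf5 -lhdf5_hl is needed on the command line
    library
      exposed-modules: Bindings.HDF5
      build-depends:   base, bindings-DSL
      extra-libraries: hdf5, hdf5_hl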

I assume there's a LINE directive in the file it's actually reading. Run ghci with -v to see what file it's actually trying to read.

On Sat, 26 Feb 2011 18:18:27 -0800 (PST), Brandon Moore wrote:
I assume there's a LINE directive in the file it's actually reading. Run ghci with -v to see what file it's actually trying to read.
Here's the relevant output with the -v flag:

    compile: input file dist/build/Bindings/HDF5.hs
    *** Checking old interface for main:Bindings.HDF5:
    [1 of 2] Compiling Bindings.HDF5    ( dist/build/Bindings/HDF5.hs, interpreted )
    *** Parser:
    src/Bindings/HDF5.hsc:49:8: parse error on input `import'

This is very weird. The HDF5.hs file has LINE directives scattered throughout, but they are in comments:

    {-# LINE 15 "src/Bindings/HDF5.hsc" #-}

or are they? I assumed the purpose of these was line-number annotation, to let you know where a line in the .hs file comes from in the .hsc file. Regardless, there is NO "LINE 49" directive, and the HDF5.hs file is blank on line 49. The first line with an import (at line 165) is this:

    foreign import ccall "H5Dcreate2" c'H5Dcreate2
      :: CInt -> CString -> CInt -> CInt -> CInt -> CInt -> CInt -> IO

I'm trying to figure out whether that's legal syntax. Very strange. Reporting errors on lines that don't exist makes it harder to debug :-(

Brian

On Sat, 2011-02-26 at 21:36 -0800, briand@aracnet.com wrote:
[1 of 2] Compiling Bindings.HDF5 ( dist/build/Bindings/HDF5.hs, interpreted ) *** Parser:
src/Bindings/HDF5.hsc:49:8: parse error on input `import'
So it's in HDF5.hs ultimately, but LINE directives are telling it to report a different location.
HDF5.hs file has LINE scattered throughout, but they are in comments:
{-# LINE 15 "src/Bindings/HDF5.hsc" #-}
Those {-# ... #-} things are pragmas. As far as the language spec goes they are comments, but actually, compilers read them and interpret their contents. In this case, it causes the compiler to report a different location for errors.
regardless, there is NO "LINE 49" directive, and the HDF5.hs file is blank on line 49.
Line 49 of HDF5.hs doesn't matter. What's on line 49 of the hsc file? If you don't want to debug using the hsc file (which is the way this is designed), you'll have to find the LINE directive in the .hs file nearest to (but before) 49, and count lines from there. -- Chris Smith
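[Chris's point can be seen in a tiny self-contained example; the filename and line number in the pragma are made up for illustration:]

```haskell
module Main where

{-# LINE 100 "Original.hsc" #-}
-- From the pragma onward, GHC pretends the source is Original.hsc
-- starting at line 100, so any error below here would be reported
-- against Original.hsc, not this file.
main :: IO ()
main = putStrLn "ok"
```

This is exactly how hsc2hs arranges for errors in the generated .hs file to point back into the .hsc source.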

On Sat, 26 Feb 2011 22:42:15 -0700, Chris Smith wrote:
On Sat, 2011-02-26 at 21:36 -0800, briand@aracnet.com wrote:
[1 of 2] Compiling Bindings.HDF5 ( dist/build/Bindings/HDF5.hs, interpreted ) *** Parser:
src/Bindings/HDF5.hsc:49:8: parse error on input `import'
So it's in HDF5.hs ultimately, but LINE directives are telling it to report a different location.
HDF5.hs file has LINE scattered throughout, but they are in comments:
{-# LINE 15 "src/Bindings/HDF5.hsc" #-}
Those {-# ... #-} things are pragmas. As far as the language spec goes they are comments, but actually, compilers read them and interpret their contents. In this case, it causes the compiler to report a different location for errors.
regardless, there is NO "LINE 49" directive, and the HDF5.hs file is blank on line 49.
Line 49 of HDF5.hs doesn't matter. What's on line 49 of the hsc file?
If you don't want to debug using the hsc file (which is the way this is designed), you'll have to find the LINE directive in the .hs file nearest to (but before) 49, and count lines from there.
Aaaaargh! This is needed in the .hs file generated from the .hsc; it's not good enough to put it in the source code which uses the library:

    {-# LANGUAGE ForeignFunctionInterface #-}

What I don't understand is why the hsc processing and/or cabal build doesn't automagically handle this. Maybe a ghc version thing? I'm using 6.12.1. The bindings-DSL examples do NOT use the above pragma anywhere in the code.

Brian
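[For reference, a minimal self-contained module showing the pragma in action; it imports sin from the C math library rather than anything from HDF5, just so it runs without hdf5 installed:]

```haskell
{-# LANGUAGE ForeignFunctionInterface #-}
module Main where

import Foreign.C.Types (CDouble)

-- Without the LANGUAGE pragma above, GHC rejects this declaration
-- with "parse error on input `import'", which is exactly the
-- symptom in this thread.
foreign import ccall unsafe "sin" c_sin :: CDouble -> CDouble

main :: IO ()
main = print (c_sin 0)
```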

I worked out a small hdf5 binding using cabal and bindings-DSL and sqlite3 as my example.
Hi,

I just wanted to add that I also started an HDF5 binding recently (using hsc2hs only). It does more than enough for me ATM, so I don't develop it actively, but if you want to pursue this (and I think it would be a useful addition to Hackage), we may share experience and code. My binding is part of a bigger project, but I meant to split it out anyway.

-- Regards, Feri.

What an interesting coincidence, that makes at least three of us. Apparently it's an idea whose time has come.
Mine is also an incomplete low-level binding but is currently under semi-active development and I aim to make it cover the entire hdf5.h interface.
If anyone is interested in it I've put it on github at:
https://github.com/mokus0/bindings-hdf5
-- James
On Mar 2, 2011, at 5:12 AM, Ferenc Wagner writes:

I worked out a small hdf5 binding using cabal and bindings-DSL and sqlite3 as my example.
Hi,
I just wanted to add that I also started an HDF5 binding recently (using hsc2hs only). It does more than enough for me ATM, so I don't develop it actively, but if you want to pursue this (and I think it would be a useful addition to Hackage), we may share experience and code. My binding is part of a bigger project, but I meant to split it out anyway. -- Regards, Feri.
_______________________________________________
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe

James Andrew Cook wrote:
What an interesting coincidence, that makes at least three of us. Apparently it's an idea whose time has come.
Mine is also an incomplete low-level binding but is currently under semi-active development and I aim to make it cover the entire hdf5.h interface.
If anyone is interested in it I've put it on github at: https://github.com/mokus0/bindings-hdf5
Hi,

This is fairly extensive indeed! I got nowhere near this, but I also took a somewhat different angle, especially by using StorableArrays for passing arrays around (I used HDF5 in conjunction with LAPACK). I also experimented with going a little higher level here and there. Attributes aren't implemented yet, because that would require making location ids a type class.

An unsolved problem is the safe representation of ranks: I went for generality by using lists for indexing, but it would be nice to express dimensionality constraints in the types (with sane syntax). Maybe there's a handy technique for this; I didn't explore the field. Talking about indexing, choosing the Fortran convention seems to have been a mistake in retrospect, but that's no big deal.

I attach my code so you can get a better idea what I'm talking about; maybe you can find some usable pieces. Separating the generic hid type into specific newtypes worked out to some extent, but maybe isn't a good idea at the lowest level (where the FFI makes it automatic). I'd need broader experience with the HDF5 API to tell.

-- Regards, Feri.

On Mar 4, 2011, at 8:25 AM, Ferenc Wagner wrote:
Hi,
This is fairly extensive indeed! I got nowhere near this, but I also took a somewhat different angle, especially by using StorableArrays for passing arrays around (I used HDF5 in conjunction with LAPACK). I also experimented with going a little higher level here and there. Attributes aren't implemented yet, because that would require making location ids a type class. An unsolved problem is the safe representation of ranks: I went for generality by using lists for indexing, but it would be nice to express dimensionality constraints in the types (with sane syntax). Maybe there's a handy technique for this; I didn't explore the field. Talking about indexing, choosing the Fortran convention seems to have been a mistake in retrospect, but that's no big deal.
I've skimmed through your code, and it looks good. It's definitely a bit higher level than mine; mine is (intentionally) little more than a bunch of types and foreign imports with names changed minimally, mostly just uppercasing or lowercasing the "H5?" prefixes as needed. That approach seems to work really well for large C interfaces (like OpenGLRaw, etc.), if for no other reason than that it exposes the C interface via Haskell's much better type system and documentation tools. I'll read through yours in more detail; it looks like there are some good ideas there that I can apply when I start working on something higher-level.
I attach my code so you can get a better idea what I'm talking about, maybe you can find some usable pieces. Separating the generic hid type into specific newtypes worked out to some extent, but maybe isn't a good idea at the lowest level (where the FFI makes it automatic). I'd need broader experience with the HDF5 API to tell.
My experience isn't especially broad either, but from what I've seen, a type-level approach for distinguishing hid_t usages ought to work, at least most of the time. One thought I have is to use a phantom type parameter at the lowest level, so that foreign imports can either constrain it or not, as the situation seems to call for.

Thanks -- James
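[A sketch of the phantom-type idea; the type and function names here are hypothetical illustrations, not the real bindings-hdf5 API:]

```haskell
module Main where

import Foreign.C.Types (CInt)

-- Hypothetical: one newtype with a phantom tag, instead of a
-- separate newtype per kind of identifier.
newtype HId t = HId CInt deriving (Eq, Show)

-- Empty tag types naming the kinds of identifier.
data File
data Dataset

-- An operation that knows what it takes can constrain the tag ...
closeFile :: HId File -> IO CInt
closeFile (HId _) = return 0  -- stub standing in for a foreign call

-- ... while a generic operation stays polymorphic in it.
isValid :: HId t -> Bool
isValid (HId h) = h >= 0

main :: IO ()
main = do
  let f = HId 1 :: HId File
  print (isValid f)
  closeFile f >>= print
```

A foreign import can use `HId t` directly (it is representationally a CInt), so constraining or not constraining the tag is a per-import choice.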

I worked out a small hdf5 binding using cabal and bindings-DSL and sqlite3 as my example.
I just wanted to add that I also started an HDF5 binding recently (using hsc2hs only). It does more than enough for me ATM, so I don't develop it actively, but if you want to pursue this (and I think it would be a useful addition to Hackage), we may share experience and code. My binding is part of a bigger project, but I meant to split it out anyway.
What an interesting coincidence, that makes at least three of us. Apparently it's an idea whose time has come. Mine is also an incomplete low-level binding but is currently under semi-active development and I aim to make it cover the entire hdf5.h interface. If anyone is interested in it I've put it on github at:
Bindings to the full hdf5 were supposed to be in the example set for bindings-DSL. It doesn't use pkg-config, though, and hdf5 developers didn't like the idea of adding support. I wanted reference bindings-* libraries to be free of linking problems some users might not be able to solve or understand, so I gave up. Best, Maurício

On Apr 3, 2011, at 10:43 PM, mauricio.antunes@gmail.com wrote:
I worked out a small hdf5 binding using cabal and bindings-DSL and sqlite3 as my example.
I just wanted to add that I also started an HDF5 binding recently (using hsc2hs only). It does more than enough for me ATM, so I don't develop it actively, but if you want to pursue this (and I think it would be a useful addition to Hackage), we may share experience and code. My binding is part of a bigger project, but I meant to split it out anyway.
What an interesting coincidence, that makes at least three of us. Apparently it's an idea whose time has come. Mine is also an incomplete low-level binding but is currently under semi-active development and I aim to make it cover the entire hdf5.h interface. If anyone is interested in it I've put it on github at:
Bindings to the full hdf5 were supposed to be in the example set for bindings-DSL. It doesn't use pkg-config, though, and hdf5 developers didn't like the idea of adding support. I wanted reference bindings-* libraries to be free of linking problems some users might not be able to solve or understand, so I gave up.
That seems strange to me; pkg-config is such a useful system, and "support" for it is incredibly easy to add and practically zero-maintenance. Was it that they didn't find it worth the programmer time to figure out how to add pkg-config support, or did they have specific objections? All it seems to take is to generate a file with about 10 lines of text and install it to the right place.

In any case, the fact that current versions don't support it means that a Haskell binding package has to work around that for several years to come, since most "stable" Linux distros won't pick up an updated version for quite some time. Currently I've got a "template" hdf5.pc file in the source tree which can be customized and dropped into the appropriate directory. It's a lot more manual than it ought to be, but it's at least a lot less ugly than hard-coding my development machine's include and lib paths. Eventually my plan is to use a cabal flag to control whether it looks for hdf5 using "pkgconfig-depends" or just "extra-libraries", with some Setup.hs logic to check whether the former will work and set the flag appropriately.

Incidentally, I've thought many times before that it sure would be great if cabal's backtracking search would consider more than just Haskell package dependencies. In this case, it would be really nice if it would backtrack on an unsatisfiable pkg-config dependency. In the past I've come across cases where it would have been very helpful to support backtracking on "buildable: False". Maybe I'll take a look one of these days at how hard that would be to change. I suspect that doing so in a backward-compatible way would take a lot of work, though, because of the way Cabal exposes its internal types to Setup.hs files.

-- James
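[For reference, a pkg-config file really is about 10 lines. A hypothetical hdf5.pc along the lines of the template James describes; the prefix and version are placeholders to be customized per installation:]

    # hypothetical hdf5.pc template; prefix and Version must be customized
    prefix=/usr/local
    exec_prefix=${prefix}
    libdir=${exec_prefix}/lib
    includedir=${prefix}/include

    Name: HDF5
    Description: Hierarchical Data Format 5 library
    Version: 1.8.0
    Libs: -L${libdir} -lhdf5 -lhdf5_hl
    Cflags: -I${includedir}

Dropped into a directory on PKG_CONFIG_PATH, this is enough for cabal's pkgconfig-depends to resolve the library.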

Bindings to the full hdf5 were supposed to be in the example set for bindings-DSL. It doesn't use pkg-config, though, and hdf5 developers didn't like the idea of adding support. [...]
That seems strange to me - pkg-config is such a useful system, and "support" for it is incredibly easy to add and practically zero- maintenance. [...]
I have to start by asking for forgiveness: in fact, it was the PETSc (http://www.mcs.anl.gov/petsc) developers who refused pkg-config, not HDF5. I investigated several libraries for numerical and massive data processing, and most didn't have pkg-config support. I started asking about it on the pkg-scicomp-devel debian list, and then asked Tollef (maintainer of pkg-config):

http://lists.alioth.debian.org/pipermail/pkg-scicomp-devel/2009-September/00...
http://thread.gmane.org/gmane.comp.package-management.pkg-config/345

After learning from them that pkg-config files belong to upstream packages, not distribution packages, I asked the PETSc guys. Unfortunately, their petsc-maint mailing list doesn't seem to have an external archive. But Matthew Knepley, from PETSc, said that "we do not use that system. It's unreliable, and in my opinion, one of the worst designs possible." They (Matthew and Satish Balay) suggested that we "use the makefile targets which give the include and library information." I'm not sure how practical that is with cabal, but I gave up before trying harder and decided to focus on packages with pkg-config already available.

I think the 'hmatrix' package on Hackage uses a script to check for dependencies, but I didn't read enough to check how portable that method can be.
Currently I've got a "template" hdf5.pc file in the source tree which can be customized and dropped into the appropriate directory. It's a lot less manual than it ought to be, but it's at least a lot less ugly than hard-coding my development machine's include and lib paths.
Maybe it's worth contacting HDF5 guys about that. Best, Maurício
participants (7)

- Brandon Moore
- briand@aracnet.com
- Chris Smith
- Ferenc Wagner
- James Andrew Cook
- James Cook
- mauricio.antunes@gmail.com