
Oops - didn't reply to the list!
So the other problem is that it only makes a .h file, not a .c file. Really you want to be able to inject things into the header that c2hs reads, and separately into a .c file. In a Cabal build you'd then want Cabal to compile the .c file and link it into the result. For your project you obviously have to handle the .c file yourself.
Yep - none of this will get polished to a Cabal level and none of it would be easy to polish.
Though actually that might not help for the kernel headers, because I think they do not declare the regparm calling convention and just rely on gcc flags when compiling.
You are correct - the calling convention has to be declared manually. There is no reasonable way for c2hs to learn it.
In the end the solution I have is to manually write the above C section and the foreign import call using a regparm3 calling convention. This isn't to say c2hs isn't useful - it's still hugely helpful (so long as the headers remain easy to convert to ANSI C); this is just my quick experience.
BTW, what do you mean about ANSI C?
I meant there is at least one aspect of (perhaps a bug in) the kernel headers that causes an error when c2hs tries to parse them. In the FC11 2.6.30 headers an enum is used when declaring an extern function prototype - without the enum itself having been declared (timers.h). The solution is to #include
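The problematic pattern is roughly this shape (a hypothetical reconstruction - the names are made up, the real declaration lives in the kernel's timers.h):

```c
/* An enum type used in a prototype without any prior declaration of
   the enum itself. A strict ISO C parser (like the one c2hs uses)
   rejects this, while the kernel build, which sees the enum declared
   elsewhere before this point, accepts it. */
extern int set_timer_mode(enum timer_mode mode);
```

Pulling in the header that actually declares the enum before this prototype is what makes the file parse cleanly.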
So yes, I think all these issues are solvable. As usual the difficulty is finding enough people with enough time to actually do the work. I think c2hs could become the standard Haskell ffi binding tool, taking over from hsc2hs, but it needs a bit of love.
FWIW, hsc2hs is entirely unusable for this work, as it builds a .c file that generates the .hs file. I couldn't get this to work for kernel bindings because one build (the generating .c file) needs things like the standard libraries while the other (the kernel) absolutely cannot have them. All sorts of conflicts occur, such as redefinition of basic types by the kernel headers.

Thomas
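For anyone unfamiliar with the hsc2hs model being described: a .hsc file like the sketch below (module and binding names are illustrative; TASK_COMM_LEN is a real kernel constant) is translated into a C program, which hsc2hs compiles with the host toolchain and runs to emit the final .hs. Any #include in the .hsc file is therefore compiled in the same translation unit as the host's standard headers - which is exactly the clash with kernel headers described above.

```haskell
-- Example.hsc (sketch): hsc2hs turns this into a C program, compiles
-- it against the host's standard headers, and runs it to produce
-- Example.hs. The kernel header below then collides with the host's
-- stdlib definitions during that intermediate compile.
module Example where

#include <linux/sched.h>

taskCommLen :: Int
taskCommLen = #const TASK_COMM_LEN
```

c2hs, by contrast, parses the headers itself rather than compiling and running a generated C program, which is why it can work at all here - modulo the parsing issues above.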