One compromise GHC could make would be to abstract the notion of a
module away from the Haskell source file (multiple modules per file,
or no file involved at all), but ultimately, as Brandon says, the
file system has to get involved for the linker, so the concept of a
module needs to remain compatible with the act of writing it out as
a file.
As long as we're using system tools for assembling and linking, we must abide by their constraints. Not using them isn't really viable; linking in particular is a nightmare. The bytecode interpreter has a partial linker because it can't use the system one, and it's easily the nastiest part of ghc. It's also completely nonportable, by definition: every target has its own notion of what relocations are available and how they work.
In a recent project that compiles Haskell source held as data (i.e. values of type Text from Data.Text), it would be useful to decouple GHC's notion of where modules live from the file system. This doesn't currently seem to be programmatically controllable.
How tenable is this? Would it be useful to anyone else for compilation itself to be more first class in the language? Thinking of languages such as Lisp, Racket, and Clojure, there's a certain flexibility there that Haskell lacks, and it's not apparent why, other than for historical reasons. Would this imply moving compiling and linking into a monad other than IO?
At the moment, to compile some source that exists as Text, my system has to write a pile of temp files (including the Text that contains the main module), place the other modules in directories named the right way, run the compiler across them by exec'ing GHC or stack externally, and then read the resulting executable back off disk to store it in its final destination.
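For concreteness, the round trip above can be sketched roughly as follows. This is a minimal, hypothetical version of the workflow, not the actual project code: the build directory, module layout (flat, non-hierarchical module names), and GHC invocation are all assumptions.

```haskell
{-# LANGUAGE OverloadedStrings #-}
module CompileFromText where

import qualified Data.Text as T
import qualified Data.Text.IO as TIO
import System.Directory (createDirectoryIfMissing, copyFile)
import System.FilePath ((</>), (<.>))
import System.Process (callProcess)

-- | Write each (module name, source) pair into a build directory,
-- shell out to GHC, and copy the resulting executable to its final
-- destination. GHC insists on one module per file, on disk, so the
-- Text has to hit the file system before anything can be compiled.
compileFromText :: FilePath -> [(String, T.Text)] -> FilePath -> IO ()
compileFromText buildDir modules dest = do
  createDirectoryIfMissing True buildDir
  mapM_ writeModule modules
  -- Hierarchical module names (A.B.C) would additionally need the
  -- matching directory structure; this sketch assumes flat names.
  callProcess "ghc" ["-o", buildDir </> "prog", buildDir </> "Main.hs"]
  copyFile (buildDir </> "prog") dest
  where
    writeModule (name, src) =
      TIO.writeFile (buildDir </> name <.> "hs") src
```

The point is that every step is mediated by the file system and an external process, even though the source only ever existed as Text.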
It might be useful to be able to do this from within Haskell code directly, somewhat like how the hint library works. In this case, though, it would almost certainly also require being able to have two versions of GHC loaded at once, which would in turn imply being able to have multiple or different versions of libraries loaded simultaneously, possibly also sourced purely from data, i.e. not from disk. It feels like a massive, massive project at that point, though, like we'd be putting an entire dependency system into a first-class, programmable context. I'm still interested in what folks think about these ideas, even though this may never eventuate.
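To show what "from within Haskell code directly" already looks like today, here is a small example using hint's Language.Haskell.Interpreter, which evaluates an expression given as a String in-process (assuming the hint package is installed). Note that hint only interprets expressions against already-installed modules; it doesn't cover the compile-and-link-from-Text case described above.

```haskell
module HintExample where

import Language.Haskell.Interpreter
  (InterpreterError, as, interpret, runInterpreter, setImports)

-- | Evaluate a Haskell expression supplied as a String, in-process,
-- using hint. The expression and expected type here are arbitrary.
evalExpr :: IO (Either InterpreterError Int)
evalExpr = runInterpreter $ do
  setImports ["Prelude"]     -- modules visible to the expression
  interpret "succ 41" (as :: Int)
```

Something analogous for full compilation to an executable, polymorphic over where module sources come from, is the part that doesn't exist.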
Does it seem to anyone else like abstracting the library- and module-access capabilities of compilation, so that it's polymorphic over where it gets its data from, might be useful? Is this just ridiculous? Does this step into Backpack's territory? From memory, the Haskell report doesn't specify that modules need to be tied to the file system, but I think GHC imposes one module per file and requires that the file live on the FS.
Julian
_______________________________________________
Haskell-Cafe mailing list
To (un)subscribe, modify options or view archives go to:
http://mail.haskell.org/cgi-bin/mailman/listinfo/haskell-cafe
Only members subscribed via the mailman list are allowed to post.
--
brandon s allbery kf8nh