
Daniel Fischer wrote:
So back to square one. What then _is_ run-time compilation?
In the virtual machine community, run-time compilation refers to the translation of program code at run-time, for example the compilation of Java byte code to machine code in a JIT (just-in-time) compiler. Other uses include dynamic binary translators (e.g. for running IA-32 binaries on Alpha processors), dynamic binary optimizers, or security monitors for binary programs. Actually, I work on dynamic analysis and transformation of functional languages at the moment and was quite surprised by how this term is used in the Haskell Wiki. I would rather call it pre-computing or something (but maybe I misunderstood the RunTimeCompilation page on my first reading).
Bulat said that in

  n <- readIO
  flip mapM [1 .. 100] $ \x -> print (x^n)
the program could insert (if, e.g., the value read is 2) a concrete, faster algorithm for x^n. Isn't that some sort of run-time code generation? And a) how far might one expect that sort of thing to be done,
In a prototype implementation (or rather a proof of concept) I have done exactly that: compile the code of a lambda-expression at the time the closure is constructed, and inline the values of all enclosing lexical variables. Of course, an optimizer is needed that exploits the additional information about data values, for example by constant folding, strength reduction, etc. I have only done very primitive optimizations so far. The biggest problem is probably limiting code generation to the cases where a performance gain can be expected.
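To make the idea concrete, here is a pure-Haskell sketch (my own illustration, not the prototype itself): partially applying power to a known exponent walks the recursion on n once, when the closure is constructed, and returns a chain of closures specialized to that exponent:

```haskell
-- Specialize x^n by recursing on n before the base x is known.
-- `power 2` builds (essentially) \x -> x * x once; every later
-- application only pays for the multiplications.
power :: Int -> Integer -> Integer
power 0 = \_ -> 1
power n
  | even n    = let f = power (n `div` 2)
                in \x -> let y = f x in y * y
  | otherwise = let f = power (n - 1)
                in \x -> x * f x
```

With n read at run time, `let f = power n in mapM_ (print . f) [1 .. 100]` shares the specialized closure across all hundred calls.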
b) how could one cajole the system to do that.
There are two options I can see:

 - operate on an intermediate representation of the program and interpret it, or
 - compile the program to machine code.

The former would probably lose most of the performance expected from specialization, and the latter is difficult to integrate into existing systems: the run-time system, and especially the garbage collector, needs to know about dynamically generated code and how to handle it properly - you don't want to throw away code to which a pointer still exists on the run-time stack, for example :-)
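A toy version of the first option (entirely my own sketch) shows the trade-off: an interpreter re-traverses the intermediate representation on every call, while a "compile to closures" variant traverses it once and returns an ordinary function, staying inside the existing run-time system:

```haskell
-- A tiny expression IR over one free variable.
data Expr = Lit Integer | Var | Add Expr Expr | Mul Expr Expr

-- Interpretation: the Expr tree is walked again on every application.
interp :: Expr -> Integer -> Integer
interp (Lit k)   _ = k
interp Var       x = x
interp (Add a b) x = interp a x + interp b x
interp (Mul a b) x = interp a x * interp b x

-- "Compilation" to closures: the tree is walked once, yielding a
-- plain closure that the garbage collector already knows how to handle.
compile :: Expr -> Integer -> Integer
compile (Lit k)   = \_ -> k
compile Var       = \x -> x
compile (Add a b) = let f = compile a; g = compile b in \x -> f x + g x
compile (Mul a b) = let f = compile a; g = compile b in \x -> f x * g x
```

This middle road avoids the machine-code integration problems, at the cost of still paying one indirect call per IR node.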
Last, reverting to the search/replace example, if I have the general algorithm and also declare the function
  work :: String -> String
  work = searchReplace "pattern" "substitution"
the compiler would produce specialised object code for that, or wouldn't it?
Yes, this is just a more elaborate version of the x^n example above, so the same kind of specialization applies. The compiler would have to know how to work with strings, of course. If you're interested in this kind of thing, good search terms are "virtual machines", "JIT compilation", "partial evaluation" and "meta-programming".

Cheers,
Martin
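P.S. For illustration, here is a minimal sketch of such a searchReplace (the definition is my own assumption, not an existing library function), whose partial application shares the pattern and substitution across all calls to work:

```haskell
import Data.List (isPrefixOf)

-- A naive, hypothetical searchReplace: replaces every non-overlapping
-- occurrence of pat by sub.  Assumes a non-empty pattern.
searchReplace :: String -> String -> String -> String
searchReplace pat sub = go
  where
    skip = length pat
    go [] = []
    go s@(c:cs)
      | pat `isPrefixOf` s = sub ++ go (drop skip s)
      | otherwise          = c : go cs

work :: String -> String
work = searchReplace "pattern" "substitution"
```

A specializing compiler could, in principle, unfold the isPrefixOf test against the known pattern and turn work into a string matcher dedicated to "pattern".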