
The JVM does something similar. There's a step in the process of loading .class files called "bytecode verification". It's basically a type checker for the stack language used in the bytecode. This was problematic in Java before version 7, because it made denial-of-service attacks possible via malicious bytecode payloads: the verifier could spend O(n^2) time checking that the bytecode is well-typed. As of Java 7, compilers are required to emit additional info alongside the bytecode instructions. This extra info is called a stack map table and allows the verifier to complete its verification in O(n) steps, IIRC. So a certain instantiation of PCC is already used in the wild.
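To make that concrete, here's a minimal Haskell sketch of the same idea on a toy stack language (the instruction set and map format are invented for illustration; real JVM verification covers far more). The point is that with declared stack types at branch targets, checking is a single linear pass instead of a dataflow fixpoint:

    module StackMapDemo where

    import qualified Data.Map as M

    -- Types the verifier tracks on the operand stack.
    data Ty = TInt | TRef deriving (Eq, Show)

    -- A toy stack bytecode (not real JVM opcodes).
    data Instr
      = PushInt     -- push an Int
      | Add         -- pop two Ints, push an Int
      | IfZero Int  -- pop an Int, branch to the given pc if zero
      | Goto Int    -- unconditional jump
      | Halt
      deriving Show

    -- The "stack map": declared stack type at every branch target,
    -- in the spirit of the JVM's StackMapTable attribute.
    type StackMap = M.Map Int [Ty]

    -- One linear pass. At a branch we *check* the target's declared
    -- stack type instead of propagating types and iterating to a
    -- fixpoint; that check is what makes verification O(n).
    verify :: StackMap -> [Instr] -> Either String ()
    verify smap prog = go (Just []) (zip [0 ..] prog)
      where
        checkTarget t st =
          case M.lookup t smap of
            Nothing -> Left ("no stack map entry for target " ++ show t)
            Just d | d == st   -> Right ()
                   | otherwise -> Left ("stack mismatch at target " ++ show t)

        go _ [] = Right ()
        go flow ((pc, instr) : rest) = do
          -- Reconcile the inferred stack with a declared entry, if any.
          st <- case (M.lookup pc smap, flow) of
            (Just d, Just s)
              | d == s    -> Right d
              | otherwise -> Left ("fall-through mismatch at " ++ show pc)
            (Just d, Nothing)  -> Right d   -- entry to otherwise dead code
            (Nothing, Just s)  -> Right s
            (Nothing, Nothing) -> Left ("unreachable pc without map entry: " ++ show pc)
          case instr of
            PushInt  -> go (Just (TInt : st)) rest
            Add      -> case st of
              TInt : TInt : s -> go (Just (TInt : s)) rest
              _               -> Left ("Add: ill-typed stack at " ++ show pc)
            IfZero t -> case st of
              TInt : s -> checkTarget t s >> go (Just s) rest
              _        -> Left ("IfZero: ill-typed stack at " ++ show pc)
            Goto t   -> checkTarget t st >> go Nothing rest
            Halt     -> go Nothing rest

    -- A small diamond-shaped program; targets 4 and 5 carry map entries.
    demo :: Either String ()
    demo = verify (M.fromList [(4, []), (5, [TInt])])
                  [PushInt, IfZero 4, PushInt, Goto 5, PushInt, Halt]

Note how the verifier never revisits an instruction; the declared entries play the role of the certificate the compiler ships alongside the code.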
On 07/10/2018 12:00, Barak A. Pearlmutter wrote:

I would guess one factor that prevented something like PCC from taking off is that proofs over binaries are quite challenging to express and verify, and ISA semantics tend to be really complicated.
The Cedilla folks had a pretty clever way of dealing with that issue. They had a very small proof checker on the embedded device, which used highly compressed information associated with the binary to guide a theorem prover: the information supplied all the choices the prover had to make.
This information was generated from a proof produced by a compiler, which compiled a high-level language (C, in their case) annotated with extra safety-proof-related information down to both machine code and a proof of that machine code's safety. So they sort of did what you're describing, but partially evaluated away the IR.
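As a loose illustration of the oracle idea Barak describes (this is not Cedilla's actual logic, proof format, or checker), the certificate shipped with the code can be just the sequence of choices, so the on-device checker stays tiny and deterministic. A minimal Haskell sketch:

    module OracleDemo where

    -- A tiny formula language. The only nondeterministic proof step is
    -- picking a disjunct, so a complete proof compresses to a bit per Or.
    data Form = Atom String | And Form Form | Or Form Form

    data Bit = L | R deriving Show

    -- The on-device checker: a small deterministic recursion whose
    -- every choice is dictated by the oracle stream shipped with the
    -- code. Returns the unconsumed suffix of the stream on success.
    check :: [String] -> [Bit] -> Form -> Maybe [Bit]
    check facts bits f = case f of
      Atom a  -> if a `elem` facts then Just bits else Nothing
      And p q -> check facts bits p >>= \bits' -> check facts bits' q
      Or p q  -> case bits of
        L : rest -> check facts rest p
        R : rest -> check facts rest q
        []       -> Nothing

    -- Proving (x \/ y) /\ z against facts {y, z} needs one bit: R.
    demo :: Maybe [Bit]
    demo = check ["y", "z"] [R] (And (Or (Atom "x") (Atom "y")) (Atom "z"))

The prover that discovers which bits to emit can be arbitrarily large and run at compile time; only the recursion above has to live on the device.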
-- Ionuț G. Stan | http://igstan.ro | http://bucharestfp.ro