Although this can be considered a definition of the language, it is by no means a formal definition of the static and dynamic semantics of the language in a form amenable to representation in systems like Coq, where metatheory might be checked.
Nice talk; thanks for the reference. To summarize: formalization is a lot of work, and it leads to standardization and a reluctance to change the language, whereas one of Haskell's strong points is that it is constantly evolving.
I see SPJ's point, but he makes another point in the same video: Haskell has a small internal core language to which everything must elaborate. This would seem to make formalization even more attractive. Formalize and prove safety for the core language once and for all, then specify the elaborations and check their properties modularly; that approach should hold up well even as the language evolves.
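To make the idea concrete, here is a minimal sketch of the "small core plus elaboration" architecture. This is a toy, not GHC's actual Core or its desugarer: the surface language and core datatypes, the constructors, and the encoding choices (booleans as integers, multi-binding `let` as nested applications) are all invented for illustration. The point is only that each surface convenience is rewritten into kernel constructs, so any safety proof need only cover `Core`.

```haskell
module Main where

-- Surface language: user-facing conveniences (hypothetical AST).
data Surface
  = SVar String
  | SLit Int
  | SLam String Surface
  | SApp Surface Surface
  | SIf Surface Surface Surface      -- sugar
  | SLet [(String, Surface)] Surface -- multi-binding let: sugar

-- Core language: the small kernel everything must elaborate into.
data Core
  = CVar String
  | CLit Int
  | CLam String Core
  | CApp Core Core
  | CCase Core Core Core  -- scrutinee, zero branch, nonzero branch
  deriving (Eq, Show)

-- Elaboration: each sugar form is rewritten into Core constructs,
-- so no separate metatheory is needed for the surface forms.
elab :: Surface -> Core
elab (SVar x)    = CVar x
elab (SLit n)    = CLit n
elab (SLam x b)  = CLam x (elab b)
elab (SApp f a)  = CApp (elab f) (elab a)
-- if/then/else becomes a case on an integer (0 is "false" here).
elab (SIf c t e) = CCase (elab c) (elab e) (elab t)
-- A multi-binding let becomes nested beta-redexes.
elab (SLet bs b) =
  foldr (\(x, rhs) acc -> CApp (CLam x acc) (elab rhs)) (elab b) bs

main :: IO ()
main = print (elab (SLet [("x", SLit 1)]
                         (SIf (SVar "x") (SLit 2) (SLit 3))))
```

Under this scheme, evolving the surface language (new sugar, new syntax) means adding `elab` equations and checking that each one produces well-formed `Core`, while the once-and-for-all core proof stays untouched.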