
At Mon, 19 Nov 2007 23:18:58 -0200, Felipe Lessa wrote:
> If there aren't any libraries available, are there any suggestions on how to create one?
I am not sure whether a better way already exists, but here is how I would do it if nothing better turns up. Creating a binding to the gettext library calls should be pretty straightforward; I have done it in OCaml before, and I think there are only two or three functions you need.
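For reference, here is the sort of thing I mean. This is an untested sketch; the Haskell-side names (bindTextDomain, textDomain, getText) are just ones I made up, but the three C entry points bindtextdomain, textdomain and gettext are the usual libintl ones:

    {-# LANGUAGE ForeignFunctionInterface #-}
    module Gettext (bindTextDomain, textDomain, getText) where

    import Foreign.C.String (CString, withCString, peekCString)

    -- The three libintl entry points; each returns a C string that we
    -- do not need to free.
    foreign import ccall "libintl.h bindtextdomain"
        c_bindtextdomain :: CString -> CString -> IO CString

    foreign import ccall "libintl.h textdomain"
        c_textdomain :: CString -> IO CString

    foreign import ccall "libintl.h gettext"
        c_gettext :: CString -> IO CString

    -- Tell gettext where the compiled .mo catalogues for a domain live.
    bindTextDomain :: String -> FilePath -> IO ()
    bindTextDomain domain dir =
        withCString domain $ \d ->
        withCString dir $ \p ->
        c_bindtextdomain d p >> return ()

    -- Select the current message domain.
    textDomain :: String -> IO ()
    textDomain domain =
        withCString domain $ \d ->
        c_textdomain d >> return ()

    -- Look up the translation of a message in the current domain; the
    -- original string comes back unchanged if no translation exists.
    getText :: String -> IO String
    getText msg =
        withCString msg $ \m ->
        c_gettext m >>= peekCString

With something like that, a program would call bindTextDomain and textDomain once at startup and wrap each user-visible string in getText.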
> Would it be too much work for someone without experience with gettext to write bindings and create an extractor like xgettext (or modify it)?
Instead of TH, it might be easier to use:

    http://www.haskell.org/ghc/docs/latest/html/libraries/haskell-src-1.0.1.1/La...

First you would define a function like:

    i18n :: String -> String
    i18n = id

which you will use to mark up strings that should be translated. It does not actually do anything, and will be optimized out by the compiler. Also, it means that your code will be portable to Hugs, Yhc, etc., which do not have TH.

You would then use Language.Haskell.Parser to parse your source files and look for applications of the i18n function (a rough sketch of the pattern matching is at the end of this message). That function should only ever be applied to string literals; if you find it applied to anything else, that should trigger an error. Otherwise you just collect all the strings that i18n is applied to and write them out to a .pot file.

You might also allow:

    i18n ("multiline" ++ "string")

provided that all the arguments of ++ are string literals. This is because, AFAIK, there is no other portable way to split a long string across multiple source lines.

j.

ps. I have never used Language.Haskell.Parser, so I might be imagining it does something that it does not.
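pps. For what it is worth, here is a rough, untested sketch of the pattern matching I have in mind, using the constructors from Language.Haskell.Syntax. The helper names (i18nString, literalString, potEntry) are made up for illustration, and the traversal that walks the HsModule returned by parseModule down to every expression is left out:

    module ExtractI18n where

    import Language.Haskell.Syntax

    -- If the expression is an application of the i18n marker, return
    -- its string; if i18n is applied to something that is not a string
    -- literal (or a ++ chain of literals), report an error; otherwise
    -- ignore the expression.
    i18nString :: HsExp -> Either String (Maybe String)
    i18nString (HsApp (HsVar (UnQual (HsIdent "i18n"))) arg) =
        case literalString arg of
          Just s  -> Right (Just s)
          Nothing -> Left "i18n applied to a non-literal argument"
    i18nString _ = Right Nothing

    -- Flatten a string literal, or a ++ chain of string literals, into
    -- a single String.  Anything else is rejected.
    literalString :: HsExp -> Maybe String
    literalString (HsLit (HsString s)) = Just s
    literalString (HsParen e)          = literalString e
    literalString (HsInfixApp l (HsQVarOp (UnQual (HsSymbol "++"))) r) =
        do sl <- literalString l
           sr <- literalString r
           return (sl ++ sr)
    literalString _ = Nothing

    -- Format one extracted string as a .pot entry.  Note that show
    -- only approximates PO escaping; a real extractor should escape
    -- the string properly.
    potEntry :: String -> String
    potEntry s = "msgid " ++ show s ++ "\nmsgstr \"\"\n"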