
So, per a suggestion from sjanssen, I was refactoring my XSelection.hs to optionally use the 'decode' from utf8-string. (When I originally wrote it, utf8-string wasn't even optionally available, so I had copied in the decode definition.) I finished editing, and everything looked dandy, but every time I compiled, or loaded it into GHCi (with :set -DUTF8), it failed to compile. And it fails in a way that really perplexes me:

    Could not find module `Codec.Binary':
      Use -v to see a list of the files searched for.

Line 34 reads:

    import Codec.Binary.UTF8.String (decode)

Note that the two lines disagree on what is being imported: the error names `Codec.Binary', but the source imports Codec.Binary.UTF8.String.

I tried reinstalling utf8-string and X11-xft, thinking perhaps that was the problem, but that is not it. I can load utf8-string modules fine in GHCi, and I can swap out that import line for 'import Codec.Binary.Anythingelse' without trouble, but the moment I use .UTF8.*, it fails. I've looked over my changes several times, and they look to me to be the same as the CPP usage in Fonts.hsc, for example; and if -DUTF8 isn't set, everything works fine.

I am a little stumped. I can't darcs send it, because I don't know whether the code is broken or it is just my system. Find the patch attached; for reference, the CPP block in question looks roughly like the sketch below my signature.

--
gwern
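(For concreteness, here is a minimal sketch of the shape of the change. The module name XSelectionSketch and the naive byte-per-Char fallback are stand-ins for illustration only; the attached patch has the real definitions, which copy utf8-string's full decoder.)

{-# LANGUAGE CPP #-}
module XSelectionSketch where

import Data.Char (chr)
import Data.Word (Word8)

#ifdef UTF8
-- With -DUTF8 set, take the decoder from the utf8-string package.
import Codec.Binary.UTF8.String (decode)
#else
-- Without the flag, fall back to a local decode. The real module
-- copies utf8-string's UTF-8 decoder; this one-byte-per-Char version
-- is only a placeholder to keep the sketch self-contained.
decode :: [Word8] -> String
decode = map (chr . fromIntegral)
#endif

Assuming this matches the patch, loading it in GHCi after :set -DUTF8 should show the same failure, while leaving the flag unset compiles fine.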