
Hello,

Is there a reason why we have to escape the character ' (apostrophe) when it is used in a character literal? For example, we have to write '\'' instead of '''. (With syntax highlighting, the second looks a lot better than the first.) It seems that in this case we do not need the escape, because a character literal contains exactly one character. If there is no good reason for having the escape, I think that we should remove this restriction in Haskell'.

-Iavor

Iavor Diatchki wrote:
Is there a reason why we have to escape the character ' (apostrophe) when used in a character literal? For example, we have to write '\'' instead of '''. [...]
If you really want, you could also allow an unescaped backslash, '\'. The rules for parsing character literals would become something like:

    case input of
      '''   -> a single quote
      '\''  -> a single quote
      '\'   -> a backslash
      '\..' -> an escape
      'a'   -> a normal character

I'm not saying this is a good idea, just that it is possible. :)

Twan
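Twan's case analysis could be sketched as a toy lexer in Haskell (a hypothetical sketch of the proposed rules, not any compiler's actual code; `lexCharLit` and `escapes` are made-up names, and the escape table is deliberately incomplete):

```haskell
-- Toy lexer for the proposed character-literal rules.
-- The input is the whole literal, including its surrounding quotes.
lexCharLit :: String -> Maybe Char
lexCharLit "'''"                 = Just '\''        -- '''  -> a single quote
lexCharLit "'\\''"               = Just '\''        -- '\'' -> a single quote
lexCharLit "'\\'"                = Just '\\'        -- '\'  -> a backslash
lexCharLit ['\'', '\\', e, '\''] = lookup e escapes -- '\n' etc. -> an escape
lexCharLit ['\'', c, '\'']       = Just c           -- 'a'  -> a normal character
lexCharLit _                     = Nothing          -- anything else is rejected

-- A few single-character escape codes, for illustration only.
escapes :: [(Char, Char)]
escapes = [('n', '\n'), ('t', '\t'), ('\\', '\\'), ('\'', '\'')]
```

Note that the first three clauses are exactly Twan's special cases; everything else falls through to the ordinary escape and single-character rules.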

Hi
Is there a reason why we have to escape the character ' (apostrophe) when used in a character literal? For example, we have to write '\'' instead of '''.
This is what Ada does. I think that's reason enough to keep things the way they are. It does actually make syntax highlighting more complex, and it introduces another special case. It's also no longer "the same" as strings, since """ doesn't mean "\"".

Thanks
Neil
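Neil's point about strings can be checked in current, standard Haskell (nothing hypothetical here except the name `dq`):

```haskell
-- In a string literal the double quote must always be escaped:
dq :: String
dq = "\""   -- the only way to write the one-character string containing "

-- """ is not a valid string literal: it parses as the empty string ""
-- followed by a stray quote. So under the ''' proposal, character and
-- string literals would quote their own delimiter differently.
```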

Hello,
It does actually make syntax highlighting more complex, and introduces another special case.
How is that another special case? If anything, it seems to be removing a special case because there is no need for an escape of the form \'. Do you have a concrete example of what it complicates?
It's also no longer "the same" as strings, since """ doesn't mean "\"".
Fair enough, but it is easy to explain why these are different. Can we explain why we need to escape the quote in character literals? So far we have: (i) we want to be different from Ada, and (ii) we want to do the same as C and Java.

A nice property of the small change that I proposed is that to switch between a singleton string and a character we just need to change the quotes. With the current notation we also have to add a backslash if the string happens to be "'", which seems redundant.

Anyway, this is not a big thing, but it seems like a wrinkle that could be fixed.

-Iavor
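Iavor's "just change the quotes" property can be seen in current Haskell, where the quote character is the one exception (ordinary report syntax; the names `a`, `as`, `q`, `qs` are only for illustration):

```haskell
a :: Char
a = 'a'      -- a character ...
as :: String
as = "a"     -- ... and its singleton string: only the quotes differ

q :: Char
q = '\''     -- the quote character needs a backslash in a char literal ...
qs :: String
qs = "'"     -- ... but not in the corresponding singleton string
```

In every case `[a] == as` holds; only the `q`/`qs` pair needs the extra backslash when converting.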
participants (4)
- Ashley Yakeley
- Iavor Diatchki
- Neil Mitchell
- Twan van Laarhoven