Re: RFC: A standardized interface between web servers and applications or frameworks (ala WSGI)

On Sun, Apr 13, 2008 at 6:32 PM, Chris Smith wrote:
Does old code that handled these headers stop working, just because it was looking in the "other" section, but now needs to check a field dedicated to that header?
Yes, but it would be very sad if we couldn't do common header parsing because of this. I'd suggest that all the headers given in RFC 2616 be parsed and nothing else.

That leaves the question of how we would handle the addition of any extra ones in the future. Firstly, packages could depend on a given version of this interface and we declare that the set of handled headers doesn't change within a major version. Better would be some static assertion that the interface doesn't handle some set of headers. Maybe there's a type trick to do this, but I can't think of one, so we might have to settle for a non-static check:

    checkUnparsedHeaders :: [String] -> IO ()

Which can be put in 'main' (or equivalent) and can call error if there's a mismatch.

AGL

--
Adam Langley  agl@imperialviolet.org  http://www.imperialviolet.org
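[Editor's note: a minimal sketch of how such a check might look, assuming a hypothetical parsedHeaders list naming the headers the interface consumes; none of these names come from an existing package.]

    import Data.Char (toLower)

    -- Hypothetical list of the header names the interface would parse into
    -- dedicated fields; everything else stays in the "other" section.
    -- (Illustrative only -- not taken from any existing package.)
    parsedHeaders :: [String]
    parsedHeaders = ["content-type", "content-length", "host", "accept"]

    -- One plausible reading of the proposed check: the application passes
    -- the headers it expects to remain unparsed, and we fail at startup if
    -- the interface actually consumes any of them.
    checkUnparsedHeaders :: [String] -> IO ()
    checkUnparsedHeaders wanted =
        case filter ((`elem` parsedHeaders) . map toLower) wanted of
          []  -> return ()
          bad -> error ("headers already parsed by the interface: " ++ unwords bad)

    main :: IO ()
    main = do
        checkUnparsedHeaders ["X-Experimental-Feature"]
        putStrLn "header expectations hold"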

On Mon, Apr 14, 2008 at 3:27 AM, Adam Langley wrote:
On Sun, Apr 13, 2008 at 6:32 PM, Chris Smith wrote:
Does old code that handled these headers stop working, just because it was looking in the "other" section, but now needs to check a field dedicated to that header?
Yes, but it would be very sad if we couldn't do common header parsing because of this.
I'd suggest that all the headers given in RFC 2616 be parsed and nothing else.
Both request and response accept any entity headers, and section 7.1 of RFC 2616 says that a valid entity header may be an extension header, which can be any kind of header.
That leaves the question of how we would handle the addition of any extra ones in the future. Firstly, packages could depend on a given version of this interface and we declare that the set of handled headers doesn't change within a major version.
Better would be some static assertion that the interface doesn't handle some set of headers. Maybe there's a type trick to do this, but I can't think of one, so we might have to settle for a non static:
checkUnparsedHeaders :: [String] -> IO ()
Which can be put in 'main' (or equivalent) and can call error if there's a mismatch.
Most of the time a header makes sense in some scenarios and not in others, so package-level checking is too coarse grained. IMHO it would be better to create a two-layered approach. The bottom layer handles the request as a bunch of strings and only checks for structural correctness (i.e. breaks the headers by line and such) without checking whether the headers themselves are correct. The top layer provides a bunch of parser combinators to validate, parse and sanitize the request, so a library can create its own contract:

    newtype Contract e a = Contract (HttpRequest -> e a)

    contract :: Contract Maybe MyRequest
    contract = do
        pragma <- parseHeader "Pragma" (\header -> ...)
        ...
        return $ MyRequest pragma ...

    main = do
        request <- readHttpRequest
        sanitized <- enforce contract request
        ...

Such an approach would be more flexible and extensible. Later, other packages could provide specialized combinators for other RFCs. HTTP is regularly extended, both in RFCs and by private parties experimenting before writing an RFC, and it would be bad if the primary Haskell library for HTTP didn't support this behavior. It's also important to notice that the HTTP spec defines things to be mostly orthogonal: most headers stand on their own and can be used in combination with many methods and other headers, and every once in a while someone finds a combination that makes sense and wasn't thought of before.
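[Editor's note: one self-contained way the sketch above could be fleshed out. To keep it short, the Contract newtype is simplified to a type synonym so the Maybe monad can be reused directly, and HttpRequest, parseHeader and enforce are invented stand-ins for the elided parts, not an existing API.]

    import Data.Char (toLower)
    import qualified Data.Map as Map

    -- Bottom layer: only structural parsing; header values stay raw strings.
    data HttpRequest = HttpRequest
        { reqPath    :: String
        , reqHeaders :: Map.Map String String   -- lower-cased name -> raw value
        } deriving Show

    -- Top layer: a contract turns a raw request into a typed value, or fails.
    type Contract e a = HttpRequest -> e a

    -- Run a contract against a request (trivial here, but mirrors the sketch).
    enforce :: Contract e a -> HttpRequest -> e a
    enforce = id

    -- Look up one header and run a caller-supplied parser over its raw value.
    parseHeader :: String -> (String -> Maybe a) -> Contract Maybe a
    parseHeader name parse req =
        Map.lookup (map toLower name) (reqHeaders req) >>= parse

    data MyRequest = MyRequest { pragma :: String } deriving Show

    -- An application-defined contract built from the combinators.
    contract :: Contract Maybe MyRequest
    contract req = do
        p <- parseHeader "Pragma" Just req
        return (MyRequest p)

    main :: IO ()
    main = do
        let request = HttpRequest "/" (Map.fromList [("pragma", "no-cache")])
        print (enforce contract request)   -- Just (MyRequest {pragma = "no-cache"})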
Best regards, Daniel Yokomizo.

On Mon, Apr 14, 2008 at 4:54 AM, Daniel Yokomizo wrote:
Both request and response accept any entity headers and 7.1 (of RFC 2616) says that a valid entity header is an extension header, which can be any kind of header.
I wasn't suggesting that other headers be dropped, just that they remain as strings.
IMHO it would be better to create a two layered approach. The bottom layer handles the request as a bunch of strings, just checks for structural correctness (i.e. break the headers by line and such) without checking if the headers are correct. The top layer provides a bunch of parser combinators to validate, parse and sanitize the request so a library can create its own contract:
Ok, I think I'm convinced by this argument. I'd hope that a standard set of header parsers be defined, so that an application which only cares about RFC 2616 headers can call a single function to parse them all, but I no longer advocate that the base interface use parsed forms of headers.

Also, parsing URLs seems to be pretty uncontroversial (maybe parsing key/value pairs from the path, maybe not).

AGL

--
Adam Langley  agl@imperialviolet.org  http://www.imperialviolet.org
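[Editor's note: a rough illustration of the query-string half of that, splitting key/value pairs out of a raw string. Percent-decoding is deliberately left out, and the function names are made up for the example.]

    -- Split a query string such as "a=1&b=2" into key/value pairs.
    -- NB: no percent-decoding; a real implementation would unescape both sides.
    queryPairs :: String -> [(String, String)]
    queryPairs = map pair . splitOn '&'
      where
        pair s = let (k, v) = break (== '=') s in (k, drop 1 v)
        splitOn c s = case break (== c) s of
            (x, [])       -> [x]
            (x, _ : rest) -> x : splitOn c rest

    main :: IO ()
    main = print (queryPairs "name=agl&lang=haskell")
    -- [("name","agl"),("lang","haskell")]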
participants (2): Adam Langley, Daniel Yokomizo