H2 + Cookie header => split header

Hello there! It seems that HAProxy splits Cookie headers.

Example:

Input headers:
cookie: cookie_a=content_a ; cookie_b=content_b

Output headers:
cookie: cookie_a=content_a
cookie: cookie_b=content_b

Am I right?

I saw that with tcpdump + wireshark, but I am not 100% sure that HAProxy received only one Cookie header, because I don’t know (yet) how to read HTTPS content with wireshark.

If HAProxy splits the Cookie header, it would be great if this were optional, because some applications don’t understand multiple Cookie headers.

EDIT:

My backend is an HTTP/1.1 backend and I saw this: https://http2.github.io/http2-spec/#CompressCookie
I quote: "If there are multiple Cookie header fields after decompression, these MUST be concatenated into a single octet string using the two-octet delimiter of 0x3B, 0x20 (the ASCII string "; ") before being passed into a non-HTTP/2 context, such as an HTTP/1.1 connection, or a generic HTTP server application."
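
In other words, for the example from my first post, the proxy would be expected to rebuild something like this before passing the request to an HTTP/1.1 backend (illustrative):

Input headers (H2 side):
cookie: cookie_a=content_a
cookie: cookie_b=content_b

Output header (HTTP/1.1 side):
cookie: cookie_a=content_a; cookie_b=content_b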

EDIT2:

Maybe it’s not HAProxy, but my browser that is splitting the Cookie headers (Chromium & Firefox).

Yes; however, according to the specification you quoted above, it would be haproxy’s job to concatenate them into a single string when passing them to an HTTP/1.1 backend.

I guess we are not doing that, @willy?

Hi,

HAProxy is not supposed to do that; I don’t think you can even do this through configuration.
Furthermore, the character ‘;’ is not a header separator…

Could you share your configuration?

Baptiste

Confirmed, it’s not able to do this; you will have to keep searching :slight_smile:

However, what is possible and even likely is that the browser indeed sends them separately as two independent HTTP/2 headers to favor compression. And we don’t merge them back together here either. That’s something the mux should take care of. Adding this to the todo list.

I outlined and checked every MUST/MUST NOT in the spec to track what is done/left to be done, but somehow I seem to have missed this one.

If you’re interested, you can use H2C in “wiretap” mode. It will listen on one port and forward to another, showing the H2 contents in the middle. It’s quite convenient to see who does what.

Thank you all for your responses.
I can’t use H2C because my backend does not support h2 (it’s not an emergency, I will wait). Thank you for adding this to the todo list.

sorry, I meant h2c between the browser and haproxy :slight_smile:

So I checked how to do this and it’s too complicated for 1.8, as it requires changing the HPACK decoder to produce a first list of headers and then performing a second pass to deduplicate the Cookie header fields and emit the HTTP/1 headers. Given that we intend to go with a native internal format for 1.9, this will be done as part of that work. The current situation should be totally harmless for servers, given that some products can already add cookies by adding headers, and some people already do it with haproxy. It could make a difference if a very high number of cookies is emitted and it reaches the server’s limit on the number of headers, however.

Looking for a way around this in 1.8, we have written a LUA script that combines Cookie headers before sending to the backends. Our rough proof of concept is working, but is there any reason why we shouldn’t do it? Additionally, we are only combining the Cookie headers, as these were the ones causing issues; would there be any benefits/issues to combining other header types?

Also, I am not sure if this is a bug or not, but the negative operators don’t seem to be working when sending a request to a LUA script.

In the example below, both http-request conditions are ignored and all requests are sent to the LUA script. If I remove the ! or change the unless to if, only the requests that match the ACL are sent, as expected. Changing lua.fix-headers to deny makes everything work as expected, so I know my syntax is correct; it’s just when using LUA.

# ACLs FOR STATIC CONTENT
acl static_content path_end .jpg .gif .png .css .js .htm .html .map .woff .svg .woff2

http-request lua.fix-headers if !static_content
http-request lua.fix-headers unless static_content

Version Info

[root@haproxy-18 ~]# haproxy -vv
HA-Proxy version 1.8-rc3-34650d5 2017/11/11
Copyright 2000-2017 Willy Tarreau <willy@haproxy.org>

Build options :
  TARGET  = linux2628
  CPU     = x86_64
  CC      = gcc
  CFLAGS  = -g -fno-strict-aliasing -Wdeclaration-after-statement -fwrapv -Wno-unused-label
  OPTIONS = USE_ZLIB=1 USE_OPENSSL=1 USE_LUA=1 USE_PCRE=1

Default settings :
  maxconn = 2000, bufsize = 16384, maxrewrite = 1024, maxpollevents = 200

Built with OpenSSL version : OpenSSL 1.0.2k-fips  26 Jan 2017
Running on OpenSSL version : OpenSSL 1.0.2k-fips  26 Jan 2017
OpenSSL library supports TLS extensions : yes
OpenSSL library supports SNI : yes
OpenSSL library supports : SSLv3 TLSv1.0 TLSv1.1 TLSv1.2
Built with Lua version : Lua 5.3.4
Built with transparent proxy support using: IP_TRANSPARENT IPV6_TRANSPARENT IP_FREEBIND
Built with network namespace support.
Built with zlib version : 1.2.7
Running on zlib version : 1.2.7
Compression algorithms supported : identity("identity"), deflate("deflate"), raw-deflate("deflate"), gzip("gzip")
Encrypted password support via crypt(3): yes
Built with PCRE version : 8.32 2012-11-30
Running on PCRE version : 8.32 2012-11-30
PCRE library supports JIT : no (USE_PCRE_JIT not set)
Built with multi-threading support.

Available polling systems :
      epoll : pref=300,  test result OK
       poll : pref=200,  test result OK
     select : pref=150,  test result OK
Total: 3 (3 usable), will use epoll.

Available filters :
	[SPOE] spoe
	[COMP] compression
	[TRACE] trace

I was thinking about a fix with LUA as well. I don’t think it has a huge performance impact; maybe use the Concat class [1] instead of native Lua concatenation, but other than that, I think we should be good.

Not sure about the negative condition; we will have to look into it. This only affects Cookie headers, and I would recommend not touching any other headers.

I would also condition the LUA script on the number of Cookie headers, adding an ACL requirement like:
req.fhdr_cnt(cookie) gt 1

Can you share the script you have currently?

[1] https://www.arpalert.org/src/haproxy-lua-api/1.8dev/index.html#concat-class
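
For illustration, a rough sketch of what that could look like with the Concat class (untested; the action name is hypothetical, and it assumes the 0-indexed header value tables that req_get_headers() returns):

core.register_action("fix-headers-concat", { "http-req" }, function(txn)
    local cookies = txn.http:req_get_headers()["cookie"]
    if cookies == nil then return end
    -- Concat appends into a single buffer instead of allocating a new
    -- Lua string for every intermediate concatenation
    local c = core.concat()
    for i = 0, #cookies do
        if i > 0 then c:add("; ") end
        c:add(cookies[i])
    end
    txn.http:req_set_header("cookie", c:dump())
end)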

Script below. If we use the req.fhdr_cnt(cookie) gt 1 ACL, we should be able to get rid of our check that a cookie header exists and make it a two-line script. Being new to LUA, please point out any mistakes or missed best practices.

core.register_action("fix-headers",{ "http-req" }, function(transaction)
    --[[
        transaction is of class TXN.
        TXN contains a property 'http' which is an instance
        of the HAProxy HTTP class.
        We iterate through the headers to see if any 'cookie' header exists.
        If found, we re-set the cookie header array to a single cookie
        containing ALL cookies concatenated with ";".
        NB: req_get_headers() returns 0-indexed value tables, so the
        default table.concat range (1..#t) would miss the first cookie
        unless the start and end indices are specified explicitly.
    ]]

    local hdr = transaction.http:req_get_headers()
    local set_cookie_header = false  -- local, so state never leaks between requests

    for key, _ in pairs(hdr) do
        if key == "cookie" then
            set_cookie_header = true
        end
    end

    if set_cookie_header then
        --[[ Overwrite cookie headers with a single header containing all cookies ]]
        transaction.http:req_set_header("cookie", table.concat(hdr["cookie"], ';', 0, #hdr["cookie"]))
    end
end)

This works great, thank you!

A few things:

  • my ACL above was wrong in that I wrote “cookies” instead of “cookie”; I fixed it in that post already
  • I can confirm that with my ACL above, your script can be trimmed down to basically two lines, as you already said (see the sketch below)
  • I do think we can just use the “req.fhdr_cnt(cookie) gt 1” condition; LUA should be fast enough even if static files run through this as well
  • to be perfectly RFC compliant, I added a space after the “;” (“two-octet delimiter of 0x3B, 0x20”)
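
For reference, a minimal sketch of that trimmed-down version (the name fix-http2-cookies is hypothetical at this point; it relies on the ACL above to guarantee at least one Cookie header, and on the 0-indexed value tables):

core.register_action("fix-http2-cookies", { "http-req" }, function(txn)
    local cookies = txn.http:req_get_headers()["cookie"]
    -- values are indexed from 0, so give table.concat explicit start/end indices
    txn.http:req_set_header("cookie", table.concat(cookies, "; ", 0, #cookies))
end)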

I think your script will be very helpful. Do you want to post it in a new thread in the LUA section, maybe renamed to something more specific like fix-http2-cookies (and include instructions on how to load it)?

I wrote a short standalone reproducer in PHP that sets 4 cookies to the current timestamp and then asks the browser to reload the page after 2 seconds, unless the cookies it sees are incomplete/wrong. This shows the problem right away (in Chrome and Firefox):

The script runs at:
http://abrowserhasnocookie.ltri.eu/

code:
http://abrowserhasnocookie.ltri.eu/source.txt

Haproxy backend to reproduce this:

backend bk_testbk
 http-request set-header Host abrowserhasnocookie.ltri.eu
 server www abrowserhasnocookie.ltri.eu:80

Use HTTP2 to trigger the problem, fix it with Adrian’s LUA script:

global
 lua-load /home/lukas/fix-headers.lua
frontend https
 http-request lua.fix-headers if { req.fhdr_cnt(cookie) gt 1 }

@willy some backend applications may support this, but I doubt that most of them do; otherwise we would not already have 2 reports here in the -rc phase. I think the breakage because of this is quite extensive. Also see the simple PHP reproducer above.

I’m not saying we have to do this in 1.8, but we should warn users about this when we release 1.8 (release notes and documentation). Adrian’s LUA script looks to me like a very good workaround and can be used in 1.8 to keep using HTTP2 while we don’t concatenate the cookie headers yet. Actually, this script is a poster child for LUA usage :smile:

Hi Lukas,

Indeed, for me this failure to parse multiple cookie headers is really new, but I totally trust the reporters (as usual). I was mostly interested in knowing in what situation this was encountered, but you already provided one. Yes, I do think we’ll have to fix it, but I’d rather see if we can find a reasonable way to do it without harming the H2 transcoder too much, as it needs to remain as stateless as possible due to the fact that it affects the HPACK dictionary and that we’re not allowed to pause, fail, or retry there. We could also consider doing it in H1 for now, and move that to the future “native” version-agnostic HTTP representation later. H2 is still experimental, so I’d also prefer to release as-is for now and mention this limitation in the doc as you suggest.

Thanks!

Thanks for the feedback. We tested with backends running Apache 2.2, Apache 2.4, and nginx, and all displayed the same problem, so I assume it will be a fairly widespread issue.

I created the topic in the LUA section: Fix HTTP2 Cookies

While not required, in our environment I have also set HAProxy to delete the cookies on static files before they are checked, to stop those requests being sent to LUA. I have no idea whether it is better performance-wise, but I figure the more I can do in HAProxy the better. On our websites it drops the number of requests that LUA needs to process from 32 down to 5.

# ACLs FOR STATIC CONTENT
acl static_content path_end .jpg .gif .png .css .js .htm .html .map .woff .svg .woff2
http-request del-header cookie if static_content

http-request lua.fix-http2-cookies if { req.fhdr_cnt(cookie) gt 1 }

Hi guys,

so I finally fixed it in the code. I became convinced that the amount of trouble was not worth saving one extra day of work on the parser, and as a result the code is significantly cleaner, as HPACK doesn’t know about HTTP/1 anymore.

It’s available in the latest git with this commit:

2fb986c (“BUG/MEDIUM: h2: always reassemble the Cookie request header field”)

Lukas, your test case was very useful and helpful. Now my browser correctly says:

Expecting cookie value: 1511295387 (from GET parameter)
OK: HAPTESTa matched expection (1511295387)
OK: HAPTESTb matched expection (1511295387)
OK: HAPTESTc matched expection (1511295387)
OK: HAPTESTd matched expection (1511295387)

Guys, thank you so much for your valuable feedback on this. Please stress-test it, but I don’t really see what could go wrong now. Just keep in mind that, given that each cookie may be sent in a different header, the number of cookies over H2 is subject to the max header count (100 by default). So I doubled this limit during early H2 decoding to be able to store more cookies, hoping that 99 headers plus 101 cookies can be folded into 100 headers. I think that should be more than enough.
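
In case anyone does hit that limit with an extreme number of cookies, the global tune.http.maxhdr setting can raise the per-request header count; for example (the value here is arbitrary):

global
 tune.http.maxhdr 200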

Thank you for the fix! Constantly amazed at the work done by yourself and other contributors.

I can confirm in 1.8-rc4-5438183 the issue is resolved for our use case.

That’s the power of a strong community :slight_smile: It allows the work to be parallelized very well and each of us can focus on the things we bring the best value to! And by the way, thank you for testing!

Great, thanks! I can confirm it does work for me too.

The next H2 issue report is already on the mailing list though :wink:

Indeed, the problem seems to be fixed in 1.9.0dev0.

Thank you very much!