Error 520 - identified the cause but impossible to fix!

Hello everyone!

I’ve noticed that Cloudflare returns “Error 520 - Web server is returning an unknown error” when my Cookie header’s size is around 4 KB. That’s strange, since the official limit is 16 KB. The other headers are standard, minuscule ones, so they don’t contribute much to the overall header size.

It’s weird that Cloudflare doesn’t even notify you about it; I stumbled upon this purely by chance. And the server’s error logs are empty, because, obviously, there are no errors from the server’s perspective.

But more importantly: the website is now unreachable for some users! I’ve deployed code that reduces the cookies’ size, but users whose cookies have already exceeded the limit can’t reach the website to have their cookies reorganized! They basically have to clear their cookies manually, which most people won’t think of when they see the error screen!

So a lot of questions here:

  1. First and foremost: what to do with the users who are effectively banned from the website?
  2. Why does this happen even though the headers’ size is nowhere near 16 KB?
  3. Is there a way to log this behavior somehow to be aware of such situations?

Investigate whether your origin web server silently rejects requests when the cookie is too big.
I believe nginx’s default max header size is 8 KB, so if you’re using it on your server, you might want to double-check that limit and raise it if needed.
Alternatively, deploy a Cloudflare Worker that intercepts requests to your origin web server, and use it to trim or rewrite the user’s cookies to get them below 4 KB.
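A minimal sketch of such a Worker, assuming a 4 KB budget and a “drop the largest cookie first” policy (both are illustrative choices, not Cloudflare guidance):

```javascript
// Sketch: trim an oversized Cookie header before it reaches the origin.
// The 4096-byte budget and the drop-largest-first policy are assumptions
// for illustration only.

function trimCookieHeader(cookieHeader, maxBytes) {
  const pairs = cookieHeader.split(/;\s*/).filter(Boolean);
  // Repeatedly drop the largest name=value pair until the header fits.
  while (pairs.join("; ").length > maxBytes && pairs.length > 0) {
    pairs.sort((a, b) => b.length - a.length);
    pairs.shift();
  }
  return pairs.join("; ");
}

// In an actual Worker this object would be the module's default export.
const worker = {
  async fetch(request) {
    const cookie = request.headers.get("Cookie");
    if (cookie && cookie.length > 4096) {
      const headers = new Headers(request.headers);
      headers.set("Cookie", trimCookieHeader(cookie, 4096));
      request = new Request(request, { headers });
    }
    return fetch(request); // forward to the origin
  },
};
```

Note this only protects the origin for that one request; the oversized cookies still live in the user’s browser, so you’d also want to send `Set-Cookie` deletions for the dropped names.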

The only thing I can think of at the moment is that your web server is silently dropping requests when the cookie size is too big. Try deploying a different web server on an unused subdomain and check whether that one also rejects requests when the user has large cookies.
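One quick way to run that check (Node 18+; the URL, cookie name, and ~9 KB size below are placeholders you’d swap for your own test subdomain):

```javascript
// Probe a server with a deliberately oversized Cookie header and report
// the HTTP status code. Hypothetical helper for manual testing.
const bigCookie = "probe=" + "A".repeat(9000); // ~9 KB, above nginx's 8 KB default

async function probeStatus(url, cookie) {
  const res = await fetch(url, { headers: { Cookie: cookie } });
  return res.status; // 400 or 414 would suggest the server rejects large headers
}

// Usage (point at your own origin or test subdomain):
// probeStatus("https://test.example.com/", bigCookie).then(console.log);
```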

Cloudflare does have Origin Service Level Monitoring, which will notify you when an increase in server errors is observed, but unfortunately that is currently limited to Enterprise customers.
I’m not aware of another method to be automatically notified of situations like this.


I’d second this - I’ve had cookies over 8 KB go through Cloudflare just fine; they only caused issues when NGINX rejected them due to its default limit of 8192 bytes.

Thanks, guys!
This was indeed the issue! For future reference, these are the NGINX settings that solved it:

```nginx
http2_max_header_size 32k;
http2_max_field_size 16k;
```

The default limit for a single header field is actually 4 KB in NGINX (for HTTP/2), so raising it explicitly helped. It is indeed weird that no errors were written to the error log.


It should be 8 KB by default - weird that yours was 4 KB.

It took me a few days to debug when I ran into it, since we have Varnish in front of NGINX, which also needed its limit raised.


Hm, I guess it has something to do with HTTP/2: I can’t include a link to the docs (new account), but the relevant directive is http2_max_field_size, which has a default value of 4 KB.

The docs say those directives are obsolete, but they did the trick nonetheless. They weren’t explicitly set before, so the server was operating with the defaults for them, and “large_client_header_buffers” wasn’t taken into account for some reason.
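For anyone on a newer nginx (1.19.7+), where those http2_* directives were removed, HTTP/2 header limits fall under large_client_header_buffers instead - a sketch, with the buffer count and size being assumed values to adjust for your own traffic:

```nginx
http {
    # <count> <size>: up to 4 buffers of 16 KB each; <size> also caps
    # the length of any single request header line, including Cookie.
    large_client_header_buffers 4 16k;
}
```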

So yeah, this was indeed a pain in the butt :slight_smile:


Cloudflare has been experimenting with HTTP/2 to origin, and I believe it’s now on by default for Free/Pro/Business plans - so that would probably explain why this started happening (if that’s the case).

Yeah, it seems to be exactly the case.
