We're seeing a pretty odd issue I've never run into before.
Clicking some of the links to our website from a search engine (Google, Bing) returns a 503 error, but going to the same URL directly loads the site correctly.
In Google Webmaster Tools, no error was flagged the last time the page was crawled.
Example.
Here’s the url to the article: https://www.fiercebiotech.com/biotech/fda-conflicted-lilly-and-incyte-s-refiled-baricitinib
When you load the above in your browser it loads ok.
The article is the top link when you search for it by keyword:
https://www.google.com/search?ei=50_IXNHgGeKW_QbV3K3oCw&q=baricitinib+Lilly+fiercebiotech&oq=baricitinib+Lilly+fiercebiotech&gs_l=psy-ab.3...6777.9354..9535...0.0..0.123.1147.11j2......0....1..gws-wiz.......0i71j35i39j0i22i30j35i304i39j33i299j33i160.nz4U40lQFUo
When you click on the link through the search engine you get a 503 response.
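Since the only obvious difference between the two requests is how they're initiated, one thing worth checking is whether the `Referer` header the browser sends when following a search result is what triggers the 503. A rough sketch with curl (the Google referer value here is just an assumption about what the browser sends):

```shell
URL='https://www.fiercebiotech.com/biotech/fda-conflicted-lilly-and-incyte-s-refiled-baricitinib'

# Direct visit, no Referer -- this is the case that works in the browser:
curl -s -o /dev/null -w 'no referer:     %{http_code}\n' "$URL"

# Same URL with a Google Referer, roughly what the browser sends when
# you click through from a search results page:
curl -s -o /dev/null -w 'google referer: %{http_code}\n' \
  -H 'Referer: https://www.google.com/' "$URL"
```

If the second command reproduces the 503 while the first returns 200, that would point at something in the stack (a Varnish VCL rule, a Cloudflare firewall/page rule, or the application) keying off the referer.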
These are the response headers when we get a 503:
accept-ranges: bytes
age: 0
cf-ray: 4cfae6061c27d36a-LAX
content-type: text/html
date: Tue, 30 Apr 2019 16:30:08 GMT
expect-ct: max-age=604800, report-uri="…removed due to new user limit…"
last-modified: Thu, 20 Dec 2018 01:11:29 GMT
server: Cloudflare
status: 503
via: varnish
x-cache: MISS
Things we’ve tried:
- putting the domain in development mode
- clearing the application, varnish and Cloudflare caches multiple times (in that order)
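Since the response passes through both Cloudflare and Varnish (`server: Cloudflare`, `via: varnish`), it may also help to test each layer separately to see which one is producing the 503. A sketch, assuming you know your origin's IP address (`ORIGIN_IP` below is a placeholder, not a real value):

```shell
URL='https://www.fiercebiotech.com/biotech/fda-conflicted-lilly-and-incyte-s-refiled-baricitinib'

# Normal path, through Cloudflare:
curl -s -o /dev/null -w 'via cloudflare: %{http_code}\n' \
  -H 'Referer: https://www.google.com/' "$URL"

# Skip Cloudflare and talk straight to the origin/Varnish layer.
# --resolve pins the hostname to a specific IP so TLS/SNI still work.
# Replace ORIGIN_IP with the actual origin address from your DNS/infra.
curl -s -o /dev/null -w 'via origin:     %{http_code}\n' \
  --resolve www.fiercebiotech.com:443:ORIGIN_IP \
  -H 'Referer: https://www.google.com/' "$URL"
```

If the origin returns 200 while the Cloudflare path returns 503, the problem is at the edge; if both return 503, it's Varnish or the application itself.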
Anybody run into this kind of an issue before? Any ideas?