Over-caching pages in Chrome, and bypassing the cache via Page Rules doesn’t work

Here are the pages I’m having trouble with; I’m currently running the entire FQDN through CF with no other trouble:

https://rossirovetti.com/price/15/50/product.html
https://rossirovetti.com/price/50/70/product.html
https://rossirovetti.com/price/70/90/product.html
https://rossirovetti.com/price/90/110/product.html
https://rossirovetti.com/price/110/150/product.html
https://rossirovetti.com/price/150/200/product.html
https://rossirovetti.com/price/200/10000/product.html

…these pages are dynamic, not real .html files; I rewrite them that way through Nginx for SEO purposes. The problem I’m having, though, is that in Chrome, at least, after the second or third time moving between the different pages, the same result (cached somewhere) keeps showing up. For example, the first visit to the $50-$70 page may correctly show products in that range, but the second or third time I load it, I get the results for another page (say, the $70-$90 one).
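For context, the rewrite is along these lines (a rough sketch only; the backend script name and query parameters are placeholders, not the exact config):

```nginx
# Rough sketch of the URL rewrite (server {} level). The backend script
# name and query parameters are placeholders, not the real config.
rewrite ^/price/(\d+)/(\d+)/product\.html$ /product.php?min=$1&max=$2 last;
```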

I can’t reproduce the same behaviour in MS Edge, but I haven’t tried a wide range of browsers to pin down where it does or doesn’t happen. Since about 90% of our traffic uses Chrome, that’s the browser I’m primarily concerned with.

I’m running CF with all of the caching and speed features available on the Pro plan, including Railgun. I’ve tried setting up Page Rules so that https://rossirovetti.com/price/* turns everything off, including bypassing the cache, disabling Railgun, etc., but the behaviour persists. I didn’t have this problem before moving to CF; prior to that, the origin server did its thing just fine.

I’m assuming there’s something in the headers being sent to the client that’s making the browser reuse the same content. A hard refresh (Ctrl-F5) works just fine, so that leads me to think there’s some expiration header I need to force to something different. I’ve tried different header combinations in Nginx, hoping CF would respect them and pass them through, but nothing has worked so far. Apart from instructing users to hard-refresh their browsers while browsing our site (which isn’t really a fix, and certainly not an elegant one), I’m not sure how to go about solving this.
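The sort of combination I’ve been experimenting with looks roughly like this (a sketch only — it assumes the dynamic pages are served by a PHP-FPM backend after the rewrite, and the socket path and location layout are placeholders):

```nginx
# Sketch only: send explicit "don't cache" headers from the location that
# actually generates the /price/ responses after the rewrite.
# The PHP-FPM socket path is a placeholder.
location ~ \.php$ {
    include fastcgi_params;
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    fastcgi_pass unix:/run/php/php-fpm.sock;

    # Ask browsers (and Cloudflare) to revalidate instead of reusing
    # a stored copy of these dynamic pages.
    add_header Cache-Control "private, no-store, max-age=0" always;
    expires off;
}
```

(The headers have to be emitted by the block that ultimately serves the response, since after an internal rewrite it’s the final location’s config that applies, not the /price/ location’s.)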

Could I get some help from the CF Homeys out there who may have seen this? I’m pulling my hair out on this one and can’t for the life of me figure out how to fix it.

Thank you kindly,
Jason

Interesting. You did say you tried bypassing Railgun in your Page Rules, but I’m curious: how are you getting Railgun on a Pro plan? That’s a Business/Enterprise feature, unless you’re going through a third-party Cloudflare integration.

Can you post the full Response headers from one of these improperly cached requests?
