Hi,
I have configured the following Cache Rule in Cloudflare:
Field: URI
Operator: equals
Value: */robots.txt
Cache eligibility: Eligible for cache
However, the cache status comes back as "skip" and the response is bypassed rather than cached.
Do you know why this happens?
Also, please let me know if there is a way to make it eligible for cache.
Laudian
*/robots.txt is not a valid URI. You are probably looking for Uri Path equals /robots.txt.

However, robots.txt is supposed to be cached by default. Did you maybe create any rules that would disable caching for your robots.txt?
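
If you want to double-check what is actually coming back, here is a minimal sketch (assuming Python with the requests library, and example.com standing in for your own domain) that fetches robots.txt twice and prints the cf-cache-status and cache-control response headers:

```python
import requests

# Hypothetical URL - replace example.com with your own domain.
URL = "https://example.com/robots.txt"

# Fetch twice: the first request may be a MISS, the second should be a HIT
# if the response is eligible for caching on Cloudflare's edge.
for attempt in (1, 2):
    resp = requests.get(URL)
    print(f"Request {attempt}: HTTP {resp.status_code}")
    print("  cf-cache-status:", resp.headers.get("cf-cache-status"))
    print("  cache-control:  ", resp.headers.get("cache-control"))
```

On a cacheable response you would typically see MISS on the first request and HIT on the second; anything else points at headers or rules that prevent caching.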
Thank you for that quick and detailed response!
Even when I disable all of my Cloudflare Cache Rules, the result stays the same.
I'm using a virtual robots.txt; could that have anything to do with this problem?
Laudian
Can you share a link to your robots.txt in preformatted text (Ctrl+E)?
Laudian
Your origin is setting Cache-Control: no-cache, no-store, must-revalidate. That means it is instructing Cloudflare not to cache your robots.txt.

Either disable those headers on your origin, or override them in the Cache Rule by setting Edge TTL to "Ignore cache-control header and use this TTL".
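
If you go the origin route, the idea is simply to send a cacheable Cache-Control header on the robots.txt response instead of no-cache/no-store. Here is a minimal, hypothetical sketch (a toy Python standard-library server, not your actual stack, with placeholder robots.txt content and port) showing a dynamically generated robots.txt served with a cache-friendly header:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Placeholder robots.txt body - a virtual robots.txt would build this at request time.
ROBOTS_TXT = b"User-agent: *\nDisallow:\n"

class RobotsHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/robots.txt":
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            # Cache-friendly header instead of no-cache/no-store/must-revalidate,
            # so Cloudflare is allowed to keep the response at the edge.
            self.send_header("Cache-Control", "public, max-age=86400")
            self.send_header("Content-Length", str(len(ROBOTS_TXT)))
            self.end_headers()
            self.wfile.write(ROBOTS_TXT)
        else:
            self.send_error(404)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), RobotsHandler).serve_forever()
```

The Edge TTL override in the Cache Rule achieves the same result without touching the origin, since it tells Cloudflare to ignore the origin's cache-control header entirely.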
Thank you!
I will try the approach you suggested.
system
Closed
This topic was automatically closed 2 days after the last reply. New replies are no longer allowed.