Prevent caching of dynamic robots.txt

I have a dynamic robots.txt generated by PHP that I cannot prevent Cloudflare from caching. Here is the scenario: I have a rewrite rule in .htaccess that silently rewrites (not redirects) robots.txt to robots.php. The MIME type is set correctly in the PHP header: `header('Content-Type: text/plain');`.
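For context, the rewrite is roughly along these lines (a sketch only; the exact rule and flags in my .htaccess may differ):

```apache
# Serve /robots.txt from robots.php without changing the URL
RewriteEngine On
RewriteRule ^robots\.txt$ robots.php [L]
```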
Without spending another page rule, shouldn't I be able to prevent caching by manipulating the PHP headers? Here is what I have tried:
```php
header("Cache-Control: no-store, max-age=0, must-revalidate");
header("Cache-Control: no-cache, must-revalidate");
header("Cache-Control: no-cache, must-revalidate, max-age=0");
header("Expires: " . date('D, d M Y H:i:s T', strtotime('-1 second')));
header("Pragma: no-cache");
```

and various combinations of the above.
I have “Cache Everything” enabled.
I have tried adding the following to the .htaccess rules:
```apache
# This has to match robots.php, because that is what the server "sees";
# matching robots.txt does not work (confirmed with a test
# Header set Service-Worker-Allowed "/" directive).
<Files "robots.php">
    # Tried various combinations of the directives below
    Header set Cache-Control "no-cache, no-store, must-revalidate, max-age=0"
    Header set Pragma "no-cache"
    Header set Expires 0
    ExpiresActive Off
</Files>
```

I have two page rules: one to cache everything for `*example.com/*`, and one to bypass cache, apps, and performance for `*.example.com/*.php*`.
Any suggestions?
Thanks!
-Brian

Hi there,

Page rules will override your headers. In my experience, if you set “Cache Everything” and “Edge Cache TTL”, Cloudflare will ignore the Cache-Control header.

If you want to control caching with headers, you have to turn on “Origin Cache Control” and remove the “Edge Cache TTL” setting from the rule.
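For example, with “Origin Cache Control” on, Cloudflare should respect headers like these coming from robots.php (a rough sketch; the exact directives and robots rules are just placeholders):

```php
<?php
// robots.php — sketch of origin headers Cloudflare should honour once
// "Origin Cache Control" is enabled and no Edge Cache TTL overrides it.
// (Example directives and example robots rules only; adjust to your site.)
header('Content-Type: text/plain');
header('Cache-Control: no-store, no-cache, must-revalidate, max-age=0');

echo "User-agent: *\n";
echo "Disallow: /example-private-path/\n";
```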


Thanks, mobilon. I’ll try that.
-Brian

Yup. That worked. Thanks for saving my sanity, mobilon!
-Brian

