Serve ads.txt from the origin server and not Cloudflare

Hi

I need to serve the file /ads.txt from my origin server, not from Cloudflare, for all of these URL combinations:

https://www.domain/ads.txt
https://domain/ads.txt
http://www.domain/ads.txt
http://domain/ads.txt

How can I set this up on Cloudflare?

Thanks,
Yakov

If you want to use Cloudflare for the other files on the site, then it can’t be done. It’s all or nothing since the domain/subdomain will point to either Cloudflare or your origin.

(You could redirect ads.txt to an unproxied subdomain, but I guess that will break things for the crawler).
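
Just to illustrate that alternative (a rough sketch only; “files.yourdomain.com” is a hypothetical unproxied subdomain you would have to create and point at your origin, and as said it may not play well with the ads.txt crawler):

```
Redirect Rule (Rules → Redirect Rules):
  When incoming requests match:
    http.request.uri.path eq "/ads.txt"
  Then (URL redirect):
    Type: Dynamic
    Expression: concat("https://files.yourdomain.com", http.request.uri.path)
    Status code: 302
```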

Thank you, Sjr, for your answer.

If this is the request from Google -

“they need to ask their CDN service provider to return the new version of ads.txt to the Google crawler, which will help you to fix this issue”

So you are saying that the only way is to switch the proxy status to “DNS only” and lose the benefits of Cloudflare?

Did I understand correctly?

If you just want Google to see the latest version, ensure ads.txt is never cached. You can do that here:
https://dash.cloudflare.com/?to=/:account/:zone/caching/cache-rules

Is this correct?

I would suggest changing *.appointment to *appointment if you want the rule to also apply to the apex domain (with the leading ., the pattern will only match subdomains).

You can check by loading the page in your browser and looking at the Cf-Cache-Status response header.
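
If you prefer checking outside the browser, here is a minimal Python sketch (using the third-party requests library; swap in your own URL) that prints the relevant response headers:

```python
import requests  # third-party: pip install requests

# Fetch ads.txt and inspect Cloudflare's caching headers.
url = "https://www.appointment-centers.org/ads.txt"
resp = requests.get(url)

print("Status:", resp.status_code)
# DYNAMIC or BYPASS means Cloudflare did not serve the file from its cache;
# HIT would mean a cached copy was returned.
print("Cf-Cache-Status:", resp.headers.get("Cf-Cache-Status"))
print("Cache-Control:", resp.headers.get("Cache-Control"))
```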

Note: Cache Rules would be a better place to set this, as Page Rules will be deprecated at some point.

Then is this one better?

That will work, but maybe be more specific and use URI equals /ads.txt.
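
For reference, a Cache Rule along these lines should do it (field names as they appear in the dashboard at the time of writing):

```
Rule name: Bypass cache for ads.txt

When incoming requests match:
  Field: URI   Operator: equals   Value: /ads.txt
  (custom expression form: http.request.uri eq "/ads.txt")

Then:
  Cache eligibility: Bypass cache
```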

Hi Sjr

I have set the cache rule

I opened the file in my browser:
https://www.appointment-centers.org/ads.txt

and the Cf-Cache-Status header shows DYNAMIC.

Is this the result we are looking for to solve the Google problem?

The Cf-Cache-Status of “DYNAMIC” means it’s not being cached. Assuming that indeed was the problem, it should be fixed.

Thank you guys for your help!

What advice are you seeking that hasn’t already been covered in this topic?

As noted above, you can’t do that. But you can use Cache Rules to ensure the file is never cached; see the solution above. (I don’t think .txt files, apart from robots.txt, are cached by default anyway.)
