Purpose of a CDN (Content Delivery Network)

I may be posting in the wrong section, but I could not find anywhere else to ask for support than this forum.

What is the purpose of a CDN (Content Delivery Network) if the origin still gets slammed with requests, even with only 1K unique visitors over a 24-hour period? I thought it would pull an original copy of a file once, deliver it to the PoPs around the world for faster connections, and pull a fresh copy only after updates. But it seems to be pulling all the time, while bots have a good time looking at my website. AWStats reports 1.4M "not viewed" hits against only 178K "viewed" hits this month (May 2024). I would have expected those numbers to drop behind a CDN, but no: they are still the same, even while "hidden" behind one (Cloudflare).

The robots.txt (https://kimsmovies.com/robots.txt) tells all bots to stay away, yet they seem to be the only visitors that like the website, since they keep returning. The denial was meant to cut down on requests from robots, because the server seems to be under heavy load from their visits. Quite funny.

How am I supposed to perceive a CDN if the sales pitch seems wrong? Or have I completely misunderstood the purpose of a CDN?

Just asking… keep up the good work though.

Best wishes,
Kim Jacobsen

It doesn’t. The amount of storage required to cache every file served through Cloudflare at every PoP, for as long as every customer wanted, would be huge. Instead, each PoP independently fetches a copy from your origin when a file is requested at that PoP. How long it stays cached depends on how frequently it is requested, up to the TTL you set, but it may be shorter: files that are not accessed often are evicted to make room for more popular ones.

If you want guaranteed cache storage, you can use Cache Reserve…

You should also note the default cache behaviour (by default Cloudflare only caches responses with static file extensions such as images, CSS and JS, not HTML), as that’s probably why you aren’t getting the cache hits you think you should have…
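A quick way to see what the edge is actually doing is to inspect the `cf-cache-status` response header. In practice you would pipe `curl -sI <your-url>` into the filter below; the sample headers here are illustrative stand-ins, not captured from the real site:

```shell
# What `curl -sI https://kimsmovies.com/` might return; these sample
# headers are made up for illustration, not fetched from the real site.
headers='HTTP/2 200
content-type: text/html
cf-cache-status: DYNAMIC
server: cloudflare'

# Pull out Cloudflare's cache verdict: HIT = served from the PoP cache,
# MISS = fetched from the origin (and possibly stored), DYNAMIC = the
# default rules skip caching entirely (typical for HTML pages).
verdict=$(printf '%s\n' "$headers" | awk -F': ' 'tolower($1)=="cf-cache-status" {print $2}')
echo "cache verdict: $verdict"
```

If your HTML pages come back `DYNAMIC`, the edge is passing every page view straight to the origin, which would explain the AWStats numbers.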

You can change that behaviour to your preferred caching requirements by using Cache Rules…
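As a sketch, a Cache Rule that makes your HTML eligible for caching might look roughly like this in the dashboard (the match expression and the 4-hour TTL are illustrative, not a copy of any real configuration):

```text
# Hypothetical Cache Rule (set up in the Cloudflare dashboard):
#   When incoming requests match:
(http.request.uri.path eq "/") or (http.request.uri.path ends_with ".html")
#   Then:
#     Cache eligibility: Eligible for cache
#     Edge TTL: 4 hours
```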

Only good bots obey robots.txt. Bad bots will do whatever they like.
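For reference, a deny-all robots.txt is just the two lines below (illustrative; not a copy of the actual file at https://kimsmovies.com/robots.txt), and it is purely advisory. Actually keeping misbehaving bots out needs something enforced at the edge, such as Cloudflare WAF rules or Bot Fight Mode:

```text
User-agent: *
Disallow: /
```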


I do use Tiered Cache topology and Cache Reserve. I’ve set the cache expiry to 4 hours. Will see if it helps.

This topic was automatically closed 2 days after the last reply. New replies are no longer allowed.