Is there any programmatic facility to prefetch a given (HTTP) resource in your network edge cache/CDN?
To elaborate: I’m evaluating leveraging CF’s caching to absorb what is effectively “live” stream multicasting, except at very low bandwidth (napkin math says about 200 B/s at most, uncompressed). To interop with CF, the plan is to chunk the stream into fragments that balance size/overhead against latency, say a minute’s worth each, which the clients fetch using a pull model so that CF’s caching can engage. The problem with this approach is that, with no way to fault a new chunk into the cache preemptively, there will be a very high cache miss rate as readers all rush for the next chunk, the first request possibly not even having retired while all the others are still in flight between CF and the origin server.
I’m aware I can mitigate this to an extent by staggering the client requests in time and/or emulating a request myself to preload the cache, but neither approach is particularly elegant.
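For concreteness, the client-side pull loop (with the staggering) would look roughly like the sketch below; the /chunks/<n>.bin path, the one-minute fragment duration, and the jitter window are all illustrative choices on my side, not anything CF-specific:

```ts
// Rough client-side pull loop for the scheme described above.
const CHUNK_SECONDS = 60;                   // assumed fragment duration
const BASE = "https://stream.example.com";  // placeholder zone

async function follow(startSeq: number): Promise<void> {
  let seq = startSeq;
  for (;;) {
    const res = await fetch(`${BASE}/chunks/${seq}.bin`);
    if (res.ok) {
      // ~200 B/s * 60 s ≈ 12 KB per fragment, per the napkin math above
      handleChunk(new Uint8Array(await res.arrayBuffer()));
      seq += 1;
    }
    // Wait roughly one chunk interval plus a little jitter -- this is the
    // "staggering" mitigation, spreading the readers' requests out in time.
    const jitterMs = Math.random() * 5_000;
    await new Promise((resolve) => setTimeout(resolve, CHUNK_SECONDS * 1_000 + jitterMs));
  }
}

function handleChunk(chunk: Uint8Array): void {
  // application-specific consumption of the fragment
  console.log(`received ${chunk.length} bytes`);
}
```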
On other plans, you can build the prefetch logic into a Worker and have it make a synthetic subrequest for the next numbered fragment.
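Something along these lines, as an untested sketch; the /chunks/<n>.bin path and the sequential numbering are assumptions about your fragment layout:

```ts
export interface Env {}

export default {
  async fetch(request: Request, _env: Env, ctx: ExecutionContext): Promise<Response> {
    const url = new URL(request.url);
    const match = url.pathname.match(/^\/chunks\/(\d+)\.bin$/);
    if (!match) {
      return fetch(request); // not a fragment request: pass through untouched
    }

    const current = parseInt(match[1], 10);

    // Serve the requested fragment, asking Cloudflare to cache it at the edge.
    // Cache lifetime is left to the origin's Cache-Control / zone settings.
    const response = await fetch(request, { cf: { cacheEverything: true } });

    // Fire-and-forget a synthetic subrequest for the next fragment so it is
    // (ideally) already sitting in cache by the time clients roll over to it.
    const nextUrl = new URL(`/chunks/${current + 1}.bin`, url.origin);
    ctx.waitUntil(
      fetch(nextUrl.toString(), { cf: { cacheEverything: true } })
        .then((res) => res.arrayBuffer()) // drain the body so the cache write completes
        .catch(() => undefined)           // the next fragment may simply not exist yet
    );

    return response;
  },
};
```

As I understand it, subrequests to the same zone don’t re-invoke the Worker, so this shouldn’t recurse, but double-check that against your route configuration.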
Just be aware that streaming is generally against the Cloudflare terms of service unless you are using certain paid add-ons or have another agreement in place. But at 200 B/s, I’m not sure that qualifies as streaming media.
As this is a mere hobby project, the Enterprise plan is quite out of the question.
A cursory read of the docs suggests a Worker may indeed be a viable solution, although ideally there would be a facility to prefetch via, say, the HTTP-based interface CF exposes.
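Absent such a facility, one workaround I’m considering is a Cron-triggered Worker that warms the cache once a minute, roughly as sketched below; the time-derived sequence number and the URL are assumptions specific to my chunking scheme, and I haven’t verified the behaviour:

```ts
// Cron trigger "* * * * *" (once a minute) to pre-warm the fragment that is
// about to be requested. Paths and numbering are illustrative.
export interface Env {}

export default {
  async scheduled(_controller: ScheduledController, _env: Env, ctx: ExecutionContext): Promise<void> {
    const seq = Math.floor(Date.now() / 60_000); // one fragment per minute, numbered by wall clock
    const url = `https://stream.example.com/chunks/${seq}.bin`;

    ctx.waitUntil(
      fetch(url, { cf: { cacheEverything: true } })
        .then((res) => res.arrayBuffer()) // read the body so the cache write completes
        .catch(() => undefined)           // a miss just means the chunk isn't ready yet
    );
  },
};
```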
Yes, I reasoned that since this traffic would be basically noise, it wouldn’t be a problem policy-wise.
An additional question: what is the granularity of the resource cache? I would prefer to round the sizes of the HTTP resources to some native multiple of the CF cache granule so as to avoid internal fragmentation/waste.