I am an embedded C developer who is new to web development.
I have been asked to build a catalogue site for bespoke art, and I have conflicting requirements: I need routes like
domain.tld/artists/:work to be
- viewable by only signed-in users
- crawlable by search-engine spiders so we can benefit from SEO
- not crawlable by malicious/abusive bots, to keep server and bandwidth costs to a minimum
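To make the requirements concrete, this is the decision table I have in mind, as a tiny sketch (the function and field names are mine, not any existing API; the "verified search bot" signal would have to come from something like a reverse-DNS check or a CDN's bot detection, not from this code):

```javascript
// Maps what we know about a request to one action.
// All three input flags are hypothetical signals I would have to
// compute elsewhere (session cookie, crawler verification, bot scoring).
function routeDecision({ signedIn, verifiedSearchBot, suspectedBadBot }) {
  if (verifiedSearchBot) return "serve";  // requirement 2: real crawlers get content for SEO
  if (suspectedBadBot)   return "block";  // requirement 3: keep bandwidth costs down
  if (signedIn)          return "serve";  // requirement 1: signed-in users
  return "redirect-to-login";             // anonymous humans must sign in first
}
```

The ordering matters: crawlers are allowed in before the sign-in check, otherwise requirement 1 and requirement 2 contradict each other.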
I reckon this is doable (please correct me if I am wrong) by putting a proxy server in front of the app that runs something like botd.js (by FingerprintJS). But since it is better to write less code, I am wondering how I can integrate
- Cloudflare's "Rate Limiting" and "Advanced Network Rate Limiting" offerings
into my work. Are all of these offerings baked into Cloudflare Workers? Is there a tutorial on leveraging them with a mostly-built backend?
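In case it helps frame the question, here is roughly the Worker shape I am imagining. This is a sketch under assumptions, not a working implementation: my understanding is that `request.cf.botManagement` (with `verifiedBot` and a 1-99 `score`) is only populated on plans that include Cloudflare Bot Management, and `hasValidSession` is a hypothetical placeholder for whatever session-cookie check my backend would do.

```javascript
// Decide what to do with a request to /artists/:work, given the
// Cloudflare-provided `cf` object and our own session check.
// ASSUMPTION: cf.botManagement.{verifiedBot, score} comes from
// Cloudflare Bot Management; on plans without it, this falls through
// to the session check.
function decide(cf, hasSession) {
  const bm = (cf && cf.botManagement) || {};
  if (bm.verifiedBot) return 200;                                // Googlebot etc.: serve for SEO
  if (typeof bm.score === "number" && bm.score < 30) return 403; // likely automated traffic
  if (hasSession) return 200;                                    // signed-in human
  return 302;                                                    // anonymous human -> login page
}

// In a real Worker module this object would be `export default`-ed.
const worker = {
  async fetch(request) {
    const hasSession = false; // placeholder for hasValidSession(request)
    const status = decide(request.cf, hasSession);
    if (status === 302) {
      const login = new URL("/login", request.url).toString();
      return new Response(null, { status: 302, headers: { Location: login } });
    }
    if (status === 403) return new Response("Forbidden", { status: 403 });
    return new Response("catalogue page for signed-in users and search bots", { status: 200 });
  },
};
```

Is this roughly how people wire rate limiting and bot filtering together in front of an existing backend, or is the idiomatic approach to configure these as dashboard rules rather than Worker code?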