I monitor 20 sites with Uptime Robot, all on the same dedicated server, and all of them are monitored correctly except for the two sites on Cloudflare.
Within the last 30 days I've suddenly started getting constant alerts that my two Cloudflare sites are down, even though they are up and running just fine. Uptime Robot reports a 403 Forbidden result on every one of those alerts.
I added Uptime Robot's IP address ranges to the firewall rules, but nothing has changed.
I know my server isn't blocking the requests, because Uptime Robot has no trouble reaching the other 18 sites, which are not behind Cloudflare.
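A quick way to double-check where that 403 is coming from (the URL here is a placeholder, and the User-Agent is only my approximation of what Uptime Robot sends): if the 403 response carries a `cf-ray` header and `server: cloudflare`, it's being generated at Cloudflare's edge rather than by my server.

```python
import requests

# Placeholder URL and an approximation of Uptime Robot's User-Agent string;
# substitute the real monitored URL (and the exact UA from your logs, if known).
URL = "https://example.com/"
UA = "Mozilla/5.0 (compatible; UptimeRobot/2.0; http://www.uptimerobot.com/)"

resp = requests.get(URL, headers={"User-Agent": UA}, timeout=30)

print(resp.status_code)
# A cf-ray header together with "server: cloudflare" means the response
# (including a 403) was produced at Cloudflare's edge, not by the origin server.
print(resp.headers.get("cf-ray"))
print(resp.headers.get("server"))
```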
I use Uptime Robot for my Cloudflare sites and I'm not getting 403s for them. There should be a record of those 403s in a log somewhere, either in the Firewall Events log here at Cloudflare or on your server itself.
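If you'd rather pull those events than click through the dashboard, the GraphQL Analytics API exposes the same data through the firewallEventsAdaptive dataset. A rough sketch, assuming an API token with Analytics read access; the zone tag, token, and date are placeholders, and the field names are from the schema as I remember it, so check them against the GraphQL docs:

```python
import requests

# Placeholders -- substitute your own zone tag and an API token with
# Analytics read permission on the zone.
ZONE_TAG = "your-zone-tag"
API_TOKEN = "your-api-token"

# Pull recent firewall events; adjust the datetime filter to cover the alerts.
query = """
{
  viewer {
    zones(filter: { zoneTag: "%s" }) {
      firewallEventsAdaptive(
        filter: { datetime_gt: "2021-03-01T00:00:00Z" }
        limit: 20
        orderBy: [datetime_DESC]
      ) {
        datetime
        action
        source
        clientIP
        userAgent
        clientRequestPath
      }
    }
  }
}
""" % ZONE_TAG

resp = requests.post(
    "https://api.cloudflare.com/client/v4/graphql",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    json={"query": query},
)

# The "source" field names the Cloudflare feature that acted on the request
# (firewall rules, IP access rules, WAF, and so on); match clientIP/userAgent
# against Uptime Robot's monitors to find the blocked checks.
print(resp.json())
```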
That's a good start. Then something else is forbidding the connection. You don't happen to have Cloudflare Access set up for that domain, do you? You might also consider whitelisting Uptime Robot's IP ranges in Firewall Tools (IP Access Rules).
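If you want to script that instead of adding each range by hand, the zone-level IP Access Rules endpoint does the same thing as Firewall → Tools. A minimal sketch, assuming an API token with firewall edit permission; the zone ID, token, and CIDR are placeholders (use the ranges Uptime Robot publishes, and target "ip" instead of "ip_range" for individual addresses):

```python
import requests

# Placeholders -- use your own zone ID, an API token that can edit firewall
# settings, and the actual ranges published by Uptime Robot.
ZONE_ID = "your-zone-id"
API_TOKEN = "your-api-token"
UPTIME_ROBOT_RANGES = ["203.0.113.0/24"]  # documentation range, not a real monitor IP

for cidr in UPTIME_ROBOT_RANGES:
    resp = requests.post(
        f"https://api.cloudflare.com/client/v4/zones/{ZONE_ID}/firewall/access_rules/rules",
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        json={
            "mode": "whitelist",  # the API value for an "Allow" access rule
            "configuration": {"target": "ip_range", "value": cidr},
            "notes": "Allow Uptime Robot monitoring",
        },
    )
    print(cidr, resp.json().get("success"))
```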