Uptime Robot 403 Forbidden

I monitor 20 sites with Uptime Robot, all on the same dedicated server, and all of them are monitored correctly except for the two sites on Cloudflare.

Within the last 30 days I’ve suddenly started getting constant alerts that my two sites on Cloudflare are down, even though they are up and running just fine. Uptime Robot reports a 403 Forbidden result every time it sends one of these alerts.

I added Uptime Robot’s IP address ranges to the firewall rules, and nothing has changed.
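One quick sanity check is whether the ranges you added actually cover the IP the monitor connects from (visible in the Firewall Events log). A minimal sketch using Python’s `ipaddress` module — the CIDR ranges below are placeholders, not Uptime Robot’s real published list:

```python
import ipaddress

# Hypothetical allowlist: replace these placeholder CIDRs with the
# ranges Uptime Robot actually publishes for its monitoring nodes.
allowed_ranges = [
    ipaddress.ip_network("203.0.113.0/24"),
    ipaddress.ip_network("198.51.100.0/24"),
]

def is_allowed(ip: str) -> bool:
    """Return True if the given IP falls inside any allowed range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in allowed_ranges)

# Example: check an IP copied from a Firewall Events entry
print(is_allowed("203.0.113.45"))  # → True (inside the first placeholder range)
print(is_allowed("192.0.2.10"))    # → False (outside both ranges)
```

If an IP from a blocked request comes back `False`, the allow rule simply doesn’t cover that monitoring node.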

I know my server isn’t blocking the requests because the other 18 sites that are not on Cloudflare do not report any issues and Uptime Robot can access those sites just fine.

Any ideas what could be causing the issue here?

I use Uptime Robot for my Cloudflare sites and I’m not getting 403s for them. You should be able to find the blocked requests in a log somewhere. It may be in the Firewall Events log here at Cloudflare, or in your server’s own access logs.

I can see in the Firewall Events log that the firewall rule I added is being hit, with the action “Allow”.

That’s a good start. Then something else is forbidding the connection. You don’t happen to have Cloudflare Access set up for that domain, do you? You might also consider whitelisting that IP address range in Firewall Tools.
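If you prefer doing the allowlisting via the API rather than the dashboard, it can be done by creating an IP Access Rule. A minimal sketch of the request payload, assuming the zone-level access-rules endpoint (`POST /client/v4/zones/{zone_id}/firewall/access_rules/rules`); the CIDR below is a placeholder, not a real Uptime Robot range:

```python
import json

# Sketch of an IP Access Rule payload for Cloudflare's API.
# Placeholder values throughout -- substitute your own range and note.
payload = {
    "mode": "whitelist",            # allow matching traffic through
    "configuration": {
        "target": "ip_range",
        "value": "203.0.113.0/24",  # placeholder, not Uptime Robot's list
    },
    "notes": "Allow Uptime Robot monitoring",
}

print(json.dumps(payload, indent=2))
```

You would POST this JSON with your zone ID and API token; repeat it once per published monitoring range.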

I do not have Cloudflare Access set up.

Try another monitoring service. I’ve never used Uptime Robot, but I’ve been using WebSitePulse for years now and haven’t had any problems with Cloudflare.

This topic was automatically closed after 30 days. New replies are no longer allowed.