Is using “I’m Under Attack Mode” permanently a good idea?

So here is the current scenario.

My website has climbed in the SERPs and is now on page 1 for many competitive terms.

As a result, it has attracted a lot of bad attention from competitors who will do anything to stay on top.

Recently, I believe I have been facing a lot of DDoS attempts, and keeping the security level at “I’m Under Attack Mode” seems to be the only solution for me.

So, is it a good idea to keep my website permanently on “I’m Under Attack Mode”?

Since there is a 5-second delay, I am not sure whether it will affect my rankings, as search engines give ranking preference to high-speed websites.

Will it affect my conversion rate?

I am not sure what to do in this situation and would like to hear some suggestions.

Thank You

As a user, I get pretty tired of sites that use Under Attack Mode for everything, including their main page.

Considering that this (DDoS attacks) is a common situation for many popular sites, now would be a good time to more closely analyze the attack tactics.

What signs of DDoS are you seeing?


The major sign of DDoS I am seeing is that my I/O usage is always maxed out at 1 Mbps, which prevents the website from loading.

I am more concerned about SEO effects than bounce rates, because I can afford to lose some users at the cost of my security.

My website traffic depends on SEO alone, so I don’t know whether keeping the website in “I’m Under Attack” mode will cause any negative SEO effects.

With I’m Under Attack mode enabled for an extended period of time, you could start seeing crawling anomalies in your Google Search Console, followed by the inevitable de-indexing of pages.

Googlebot and other search engines are whitelisted by default, according to Cloudflare:

We’ve also designed the new checks to not block search engine crawlers, your existing whitelists, and other pre-vetted traffic. As a result, enabling I’m Under Attack Mode will not negatively impact your SEO or known legitimate visitors.

But it seems that search engines will at times attempt to visit your site incognito, without Googlebot or any other easily guessable user agent, and the result will be crawl issues. These incognito visits are like auditing, and are supposedly meant to make sure that Googlebot is not being fed a “good” site, while human visitors are getting something else.

What I’d do instead is examine origin server logs for patterns that could match the malicious behavior, and try to create Firewall Rules and Access Policies accordingly.
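To give one way of hunting for those patterns, here is a minimal sketch that summarizes the busiest client IPs and most-requested paths from an access log. It assumes a standard combined-format log; the log path in the example comment is a placeholder, so adjust it for your host.

```shell
# top_talkers: print the 10 busiest client IPs and the 10 most requested
# paths from a combined-format access log passed as the first argument.
top_talkers() {
  log="$1"
  echo "== Top client IPs =="
  # Field 1 of the combined log format is the client IP.
  awk '{print $1}' "$log" | sort | uniq -c | sort -rn | head -10
  echo "== Top request paths =="
  # Field 7 is the request path inside the quoted request line.
  awk '{print $7}' "$log" | sort | uniq -c | sort -rn | head -10
}

# Example (path is a placeholder):
# top_talkers /var/log/nginx/access.log
```

IPs or paths that dominate the counts are good candidates for a Firewall Rule or Access Policy.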

Also, high CPU usage may be the result of poor coding. Sometimes all it takes is a plugin update that has a bug in it.


So do I. It’s very annoying; tbh, 5 seconds is a lot of time nowadays :slight_smile:


Tremendously useful when needed, not appreciated for cosmetic purposes.


Any study to back up these Googlebot incognito visits? Anyway, I have removed resource-consuming third-party scripts from my main hosting space and moved them to Amazon’s cloud, and I have researched WordPress xmlrpc.php and wp-corn.php attacks and disabled them both. Let’s see how this affects things.

I don’t use xmlrpc, so I block that with Cloudflare Firewall Rules.
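For reference, a rule like this sketch blocks it at the edge (expressed in Cloudflare’s firewall rule expression language, with the action set to Block):

```
http.request.uri.path eq "/xmlrpc.php"
```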

Oh…you mean wp-cron? That’s not an attack vector I’ve seen on my sites. But Firewall Rules might work for that one as well.

I also use Wordfence, which does a great job at protecting my site for free.

Sorry, I don’t. I’ve read this more than once on Google’s webmaster forum, but can’t find exactly where.

Also, if you don’t use xmlrpc.php, I suggest you create an Access Policy for the path /xmlrpc.php, which will prevent access instead of letting requests reach your server just to get a 404 or some other error message, as serving a 404 also uses up CPU/bandwidth resources.

As for wp-cron.php, you should only disable it if you create a “real cron job”, which is quite easy to set up and should reduce the number of requests. Many plugins depend on wp-cron.php to function properly.
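The usual approach is to set `define( 'DISABLE_WP_CRON', true );` in wp-config.php and then add a real cron entry along these lines (a sketch: the domain and the 5-minute interval are placeholders, pick what suits your site):

```
# crontab -e on the server: fire WordPress's scheduled tasks
# every 5 minutes instead of on every page view.
*/5 * * * * curl -s "https://example.com/wp-cron.php?doing_wp_cron" >/dev/null 2>&1
```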

I didn’t think wp-cron is called via the URL…unless you’re manually calling it from crontab. When I do that, I add the origin IP address to the hosts file so it hits it locally instead of through Cloudflare.
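To sketch that hosts-file trick (the IP and hostname below are placeholders for your origin IP and domain):

```
# /etc/hosts on the server that runs the cron job: resolve your own
# hostname to the origin IP so the wp-cron request skips Cloudflare.
203.0.113.10    example.com
```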

wp-cron.php is not an attack vector, but it is very resource-consuming.

Give it a read here

I had to disable it because my I/O usage was getting flooded, and it could be one of the reasons.

You might not notice these issues with good hosting, but I have to deal with offshore hosts due to the niche of my website, and getting good hardware at affordable prices is hard there, lol.

Wordfence is already on my website.

Thanks for your input, m8, I really appreciate it.

BTW you seem very knowledgeable and experienced in this.

What hosting do you suggest?
Any offshore DMCA ignore host you can recommend?

Sorry, but I have no idea what a DMCA ignore host is.

I use SiteGround and am very happy with it. They recently added staging to their second-least-expensive hosting plan, and daily website backups kept for 30 days on all plans. I’ve used Hostinger in the past; it was OK, but it offered no server logs, so I gave it up.

Providers that ignore DMCA requests for removal of copyright infringing content.


This topic was automatically closed after 31 days. New replies are no longer allowed.