How to block hackers trying to access PHP files

Hello all,

I have in my logs hundreds of attempts to access various .php files. They don’t try to browse normal pages but go straight for the php files. Is there any way to block these requests?
My gut says I could create a rate-limiting rule allowing 5 to 10 requests for *.php URLs, over both HTTP and HTTPS.

Could someone correct me please, or direct me to the right solution?



I default all PHP to blocked, then whitelist only a few. I do this in web.config (Windows server) but this can be converted to .htaccess easily enough.

<!--# BAD URLS -->
<rule name="Disable PHP" stopProcessing="true">
	<match url=".*" ignoreCase="true" />
	<conditions logicalGrouping="MatchAll">
		<!-- disable php -->
		<add input="{REQUEST_FILENAME}" pattern="\.php$" negate="false" ignoreCase="true" />
		<!-- allow php in the blog -->
		<add input="{REQUEST_FILENAME}" pattern="blog/" negate="true" ignoreCase="true" />
		<!-- allow php in tinymce -->
		<add input="{REQUEST_FILENAME}" pattern="external_plugins/" negate="true" ignoreCase="true" />
	</conditions>
	<action type="CustomResponse" statusCode="410" statusReason="Gone" statusDescription="?" />
</rule>
  1. Would that help you out?
  2. Do your php files exist or is this about bots fishing?
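Since I mentioned the web.config rule converts to .htaccess easily, here is a rough sketch of the equivalent, assuming mod_rewrite is enabled; the directory names are from my setup, so adjust them to yours:

```apacheconf
# Block every .php request except the blog and the tinymce plugin
# directory, answering 410 Gone ([G] flag) instead of running PHP.
RewriteEngine On
RewriteCond %{REQUEST_URI} \.php$ [NC]
RewriteCond %{REQUEST_URI} !^/blog/ [NC]
RewriteCond %{REQUEST_URI} !^/external_plugins/ [NC]
RewriteRule .* - [G,L]
```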

The php files don’t exist, indeed it’s just bots fishing.
Some have 30 requests or so, some have hundreds of requests.

Then blocking like I did will work quite well. Because of stopProcessing="true", the web server spits back the error before the request ever reaches the application server or a 404 handler.

You could also create a rate-limiting rule for the entire site: if a single response with a specific status code is returned (say 499), you block the user for a day.
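As a rough illustration of that idea, a Cloudflare rate-limiting rule might look something like the sketch below. This is only a sketch: the field names are from the Cloudflare Rules language, the 499 status and thresholds are examples, and counting on response status is not available on every plan.

```
Matching expression:  http.request.uri.path ends_with ".php"
Counting expression:  http.response.code eq 499
Threshold:            1 request per 10 seconds
Action:               Block for 86400 seconds (one day)
```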


Interesting, @cscharff. How do I trigger that status code, though?

Never mind, @cscharff, I have the Pro plan, so that limits my options for “limiting rules”.

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.