Access denied | singlemothers(dot)us used Cloudflare to restrict access



How can I prevent bots from accessing the page? I enabled the free SSL in Cloudflare, but afterwards a lot of pages got de-indexed. I'm really not smart about this stuff; please help me understand what's going on.


HTTP/1.1 403 Forbidden
Date: Thu, 05 Jul 2018 05:10:39 GMT
Content-Type: text/html; charset=UTF-8
Transfer-Encoding: chunked
Connection: keep-alive
Set-Cookie: __cfduid=d14acc3e785ab87bc1a8c40478f9a99631530767439; expires=Fri, 05-Jul-19 05:10:39 GMT; path=/; domain=.singlemothers(dot)us; HttpOnly; Secure
Cache-Control: max-age=10
Expires: Thu, 05 Jul 2018 05:10:49 GMT
X-Frame-Options: SAMEORIGIN
Expect-CT: max-age=604800, report-uri="
Server: cloudflare
CF-RAY: 43575390fc0a6415-FRA
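For reference, the status code and Cloudflare Ray ID can be pulled out of a raw header dump like the one above with a few lines of Python (a sketch; the `raw` string below is an abbreviated copy of the response):

```python
# Parse a raw HTTP response header dump (abbreviated from the post above)
# and extract the status code plus the Cloudflare-specific headers.
raw = """HTTP/1.1 403 Forbidden
Date: Thu, 05 Jul 2018 05:10:39 GMT
Server: cloudflare
CF-RAY: 43575390fc0a6415-FRA"""

lines = raw.splitlines()
status_code = int(lines[0].split()[1])  # second token of the status line
headers = dict(line.split(": ", 1) for line in lines[1:])

print(status_code)        # 403
print(headers["CF-RAY"])  # 43575390fc0a6415-FRA
```

A 403 with a `Server: cloudflare` header and a `CF-RAY` ID is what tells you the block came from Cloudflare's edge rather than from your origin server.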


Please enable cookies.

Error 1010
Ray ID: 43575390fc0a6415 • 2018-07-05 05:10:39 UTC

Access denied

What happened?
The owner of this website (singlemothers(dot)us) has banned your access based on your browser's signature (43575390fc0a6415-ua100).

Cloudflare Ray ID: 43575390fc0a6415 • Your IP: • Performance & security by Cloudflare


SSL has nothing to do with blocking bots. The errors you list are just blocking you from accessing the site.

Bot visits are a normal part of running websites. They crawl and poke around, even if you don’t want them to.

The only easy way to block bots is to set your Security Level (in Cloudflare's Firewall settings) to “Under Attack.” This briefly pauses each visit while Cloudflare determines whether or not the visitor is a bot. This is an inconvenience to your human visitors and is inadvisable.


Could this be a possible way to recover the de-indexed pages? Thanks a lot.

I’m also thinking about specifying which paths to allow and disallow in robots.txt.
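As a sketch, a robots.txt along those lines might look like the following; the directory names are hypothetical placeholders, not paths taken from the site:

```
# Allow all crawlers everywhere except a couple of hypothetical directories.
User-agent: *
Disallow: /private/
Disallow: /search/
```

Note that robots.txt only asks crawlers to stay out; well-behaved bots like Googlebot honor it, but it does not block anything at the network level the way Cloudflare's firewall does.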


Under Attack mode would apply either to your entire site, or to specific URL matches via a Page Rule.

You should certainly use robots.txt to discourage search engines from hitting pages and directories you don’t want scanned.
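Whether a given set of robots.txt rules actually allows or blocks a URL can be checked offline with Python's urllib.robotparser. A small sketch, using hypothetical rules and URLs rather than anything from the site in question:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules: block one private directory for all crawlers.
rules = """User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# First matching rule wins: /private/ pages are blocked, everything else allowed.
print(parser.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))         # True
```

Running a check like this before deploying a robots.txt change is a cheap way to avoid accidentally disallowing pages you want indexed, which can itself cause de-indexing.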


By any chance, can you please help me with what I need to do to fix this type of error? I'm still struggling to understand what caused the pages to be de-indexed. I was thinking it was caused by Cloudflare, but I really have no idea how to fix it. Very much appreciated.

This is the screenshot of the error:


This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.