My blog was subjected to a DDoS attack, so I now use Cloudflare. However, Bloglovin used to scrape my blog so readers could read all their blogs in one place, and since moving to Cloudflare, Bloglovin has been unable to scrape any of my posts. I searched for and obtained what I thought were the IP addresses of the Bloglovin servers and set up a firewall rule to allow them through unchallenged. However, things still don't appear to be working as expected.
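For reference, the rule I set up has roughly this shape (the addresses below are placeholders from the documentation ranges, not the real Bloglovin IPs, since I'm no longer sure I have the right ones):

Expression: (ip.src in {192.0.2.10 192.0.2.11})
Action: Allow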
Does anyone here have any experience creating firewall rules for Bloglovin, or any suggestions on how I can fix this issue?
Sandro, I used a firewall rule; I created one for known bots too. At present I'm a non-paying user and the access settings require payment. Is that available for evaluation before committing to paying, just in case it doesn't work either?
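For the known-bots rule, I used Cloudflare's verified-bot flag, something like the following (this assumes Bloglovin is on Cloudflare's verified bot list, which I haven't been able to confirm):

Expression: (cf.client.bot)
Action: Allow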
Sorry for the long delay in responding. I've now added access rules for both the IPv4 and IPv6 addresses. It looks like Bloglovin uses Cloudflare as its CDN too. I'll give it a while to see if the settings work.
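For anyone following along, the combined rule now looks roughly like this (again with placeholder addresses; an ip.src set can mix IPv4 and IPv6 entries):

Expression: (ip.src in {192.0.2.10 2001:db8::10})
Action: Allow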
No luck here. The IP addresses must not be correct, since the whitelisting still isn't allowing the pages of my blog to be viewed on https://www.bloglovin.com. Does anyone have Cloudflare and Bloglovin working nicely with each other?
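One workaround I may try, since I can't pin down the IPs: match on the crawler's user agent instead. This assumes the Bloglovin crawler actually identifies itself with "Bloglovin" in its User-Agent header, which I haven't verified, and user agents can of course be spoofed:

Expression: (http.user_agent contains "Bloglovin")
Action: Allow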
Thecraftyowl.co.uk was subjected to a DDoS attack a few months ago. I employed the services of Cloudflare (as suggested by my ISP), and that seems to have allowed them to reactivate the site. However, when I tried switching off "I'm Under Attack" mode about a month after the attack started, the attack was still going on, and my ISP switched the site off until I had re-enabled Cloudflare's "I'm Under Attack" mode. They subsequently switched the site back on, and I haven't deactivated attack mode since.
I have some bots (Bloglovin.com) that skim the site and consolidate my posts. This service no longer works because the bot can't get past Cloudflare's attack mode. I also use PayPal for some online payments, and the post-back from there has also stopped working.
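One thing I've been sketching out is a rule that skips the attack-mode challenge for the PayPal post-back endpoint. The path below is hypothetical (it would be whatever notify URL the site hands to PayPal), and I'm assuming the post-back in question is PayPal's IPN callback:

Expression: (http.request.uri.path eq "/paypal-ipn.php")
Action: Skip, with Security Level (which, as I understand it, includes I'm Under Attack mode) among the features skipped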
This is causing me substantial issues and I am hoping you can help.
Is there any way (through your logs) of determining where the DDoS is coming from and whether it can be stopped? Are they attacking a particular page that might be altered to stop the attack? Are they using the IP address or the domain name? If it's the IP address, would switching to a different host/server IP help? I have SSL certs, but the non-secure pages are still visible. Is it possible that is the issue?
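On that last point, one thing I plan to try is forcing everything onto HTTPS. Cloudflare's "Always Use HTTPS" toggle should do this; expressed as an explicit rule it would be roughly the following sketch (assuming the zone is proxied through Cloudflare and a redirect action is available on my plan):

Expression: (not ssl)
Action: Redirect to the HTTPS version of the requested URL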
If you can think of anything, then please let me know. I am close to moving the site to WordPress.com in its entirety and maintaining only a secure blog.
I hope this message finds you well. I am reaching out with a pressing concern regarding my website, which has been experiencing scraping issues leading to unauthorized use of its content. Unfortunately, the situation escalated to the point where parts of the site were blocked, and we had to delete significant content as a result.
I came across a relevant blog post on the community forum about stopping scraping (Bloglovin has stopped scraping my blog). However, since that thread was closed after 30 days, I couldn't add my question there.
Here are my specific queries:
Preventing Content Scraping:
We have implemented Cloudflare, but it seems we are still vulnerable to scraping. What additional measures or configurations can I adopt within Cloudflare to better protect against content scraping? (A sketch of the kind of rule I have in mind is below this list.)
Recovery from Blocks:
Due to scraping issues, we had to take drastic measures, resulting in the deletion of substantial content. How can we recover from this situation and restore the functionality of the blocked sections?
Enhanced Security Recommendations:
Are there specific security features within Cloudflare that I might not be utilizing effectively to prevent scraping attempts? Any recommendations to enhance the overall security posture of our website would be greatly appreciated; the sketch after this list shows the direction we are currently considering.
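To make the first and third queries concrete, this is the kind of rule we are considering: challenge clients that are not verified bots but are crawling the blog pages. The path prefix is hypothetical for our site, and pairing it with a rate limiting rule on the same path (for example, challenging any client that exceeds a set number of requests per minute) would be the complementary piece:

Expression: (http.request.uri.path contains "/blog/" and not cf.client.bot)
Action: Managed Challenge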
I understand the complexities involved in mitigating scraping attempts, and I am eager to implement proactive measures to safeguard our content and ensure a seamless user experience.