Universal SSL possible data corruption?

Hello folks

I suspect that my website is experiencing problems with SSL connections, possibly due to some kind of misconfiguration on Cloudflare’s side. I’ve been losing a lot of traffic for a year now, hits from common automated attacks have drastically decreased as well, and there are a lot of insecure requests. From my side, however, I’ve never had any SSL problem.

I’m currently using the Full (strict) configuration, with a Universal edge certificate issued by DigiCert with 1-year validity (why so long?). In the docs I also read that DigiCert will be deprecated soon.

Whenever I disable and re-enable Universal SSL I always get the same certificate type: DigiCert with 1-year validity, plus a backup certificate that expires after 3 months. Other websites using the same Cloudflare IPs as mine are getting Google Trust Services certificates, though.

In the past I had also purchased a Certificate Manager subscription with Let’s Encrypt, but after 3 months the traffic continued to decrease. My suspicion is that the certificate is not issued and/or is corrupted in “sni cloudflaressl com” for my domain across several colos.

Now, since I can’t contact support about this issue, even though Cloudflare is my registrar, I was wondering whether anyone from support can deeply check the website’s SSL configuration, or whether there’s a way to completely reset the “sni/cloudflaressl/com” config, either manually or with an API request?
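For what it’s worth, the Universal SSL toggle is exposed through the zone-level API, so a full disable/re-enable cycle can be scripted. A minimal sketch, assuming API-token authentication; `ZONE_ID` and `TOKEN` are placeholders for your own values:

```python
import json
import urllib.request

ZONE_ID = "your-zone-id"      # placeholder
TOKEN = "your-api-token"      # placeholder

def universal_ssl_request(enabled):
    """Build the PATCH request that toggles Universal SSL for a zone."""
    url = (f"https://api.cloudflare.com/client/v4/zones/"
           f"{ZONE_ID}/ssl/universal/settings")
    return urllib.request.Request(
        url,
        data=json.dumps({"enabled": enabled}).encode(),
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Content-Type": "application/json"},
        method="PATCH",
    )

# To actually send them (requires a valid token and zone ID):
# urllib.request.urlopen(universal_ssl_request(False))  # disable
# urllib.request.urlopen(universal_ssl_request(True))   # re-enable
```

Note that disabling Universal SSL removes the edge certificate, so the site will serve SSL errors until a new one is provisioned after re-enabling.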

Maybe there’s a method to validate my suspicion :frowning:


Below article might help a bit:

I doubt you had lost web traffic due to this :thinking:


Thank you

My doubt was not about the type of certificate issued, but about the Certificate Provisioning entries incorrectly added to the SSL verification records by Universal SSL for all colos.

Since the domain uses an A record for both the root and www without a CNAME setup, I could try moving www to a CNAME, as seen in the post below, but the CNAME setup needs more data in the DNS tab, and I don’t know what to point it at.

The site also uses HSTS and DNSSEC, so can I try disabling them?

Now human traffic is practically null, while other websites on the same VPS are working fine with Cloudflare, so the problem is domain-related :expressionless:

Is that suspicion alone based on what you see as a loss of traffic?

Any evidence of such problems (or the misconfiguration)? If so, please share it, so we can dig deeper with you. :slight_smile:

That said, is there absolutely no way that the user(s) you had simply moved on, wanted to do something else, and in that way “left” your site, so that the loss of traffic could, for example, be due to your users’ loss of interest? :thinking:

Stuff like that wouldn’t make a difference, as your website appears to be working just fine.

I’m :+1:'ing on this.


My suspicion comes from the metrics: comparing insecure vs. secure connections, almost 1/3 of the traffic is insecure (therefore the certificate is not issued, not requested, or not working at all), even though port 80 is externally closed.
[Screenshot: ssl_cloudflare]

Another website on the same server, by contrast, has only minimal insecure connections compared to regular traffic.
[Screenshot: ssl_cloudflare2]

The website has been on Cloudflare for almost 10 years now, and the domain is 20 years old, so I think I have some experience with this. :expressionless:

So I wanted to underline that not only has human traffic decreased, but so have common automated attacks (e.g. against WordPress, Joomla, cPanel, etc.), and the biggest attacks now come almost exclusively from known hosting networks: Google, Microsoft, Amazon, etc.

If the domain has SSL connection problems of some kind, then I think it also loses its authority, especially since I see that the website is no longer reached from various locations (Africa, India, Southeast Asia, South America, Oceania); it’s mostly US and UK traffic now, including regular bots.

Also, not all errors are monitored at the CF edge; for example, Bad Gateway errors on proxies (NGINX white pages) aren’t reported in the stats.

I think the only way to validate my doubt is to remove the site from Cloudflare, but due to the firewall rules I can’t do that, so I’d just move it to another server with its own valid certificate once I leave Cloudflare :frowning:

I would probably go as far as to say that you cannot base it alone on that:

There are certificates issued for the domain mentioned, valid certificates that work perfectly.

So the first part (not issued) as well as the last part (totally not working) fall apart here.

Not requested is however a different question.

If I request your site as http://, that will come up under “None (not secure)”. If my device leans towards the http:// version, and your server then decides to redirect it to https:// afterwards, I will still be counted as a request under “None (not secure)”.

If I am running a legacy system from the Stone Age, and it does not trust your certificate, that would mean I could end up under “None (not secure)”, but I will NEVER get to TLS v1.2 or TLS v1.3.

If you have disabled e.g. TLS 1.0 and TLS 1.1, you’re similarly excluding secure connections from some of those legacy Stone Age systems, meaning they would end up under “None (not secure)” and likewise NEVER reach TLS v1.2 or TLS v1.3.
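A minimal sketch of what “disabling TLS 1.0/1.1” means on the server side, using Python’s stdlib `ssl` module (this is an illustration of the mechanism, not Cloudflare’s configuration): a context with a raised minimum version refuses the handshake entirely, so a legacy client never negotiates any TLS version at all.

```python
import ssl

# Server-side TLS context that refuses anything older than TLS 1.2.
# A client stuck on TLS 1.0/1.1 fails the handshake, so it can only
# ever appear in the analytics as "None (not secure)".
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

print(ctx.minimum_version >= ssl.TLSVersion.TLSv1_2)  # True
```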

Should you try to eliminate those legacy connections, or attempt to still be supporting them?

While I can give my recommendations about which directions I would take there, I cannot make the final decision for you.

You can decide whether or not your server (behind Cloudflare) has port 80 externally closed.

Cloudflare has it “externally open”, when you’re having Proxied (:orange:) records.

An HTTP/port 80 request to your site, on Proxied (:orange:) records, will therefore work just fine, and the connections to Cloudflare’s HTTP proxies on port 80 are the ones ending up under “None (not secure)”.
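The counting rules above can be condensed into a toy model (my own simplification, not Cloudflare code) of how a request lands in the SSL analytics buckets:

```python
def ssl_bucket(scheme, tls_version=None):
    """Classify a request the way the analytics buckets described above do."""
    # A plain http:// request hits Cloudflare's proxy on port 80 first,
    # so it counts as "None (not secure)" even if the server then
    # redirects it to https://.
    if scheme == "http":
        return "None (not secure)"
    # An https:// attempt that never completes a handshake (untrusted
    # certificate, or a TLS version the edge has disabled) negotiates
    # no TLS version at all.
    if tls_version is None:
        return "None (not secure)"
    return tls_version

print(ssl_bucket("http"))              # None (not secure)
print(ssl_bucket("https"))             # None (not secure) - handshake failed
print(ssl_bucket("https", "TLSv1.3"))  # TLSv1.3
```

The point is that a large “None (not secure)” share can be explained entirely by http:// first hits and legacy clients, with no broken certificate involved.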

For this one: if I’ve said something that you understood as questioning your experience or expertise, or as talking down to you, or anything similar, that has → NOT ← been my intention.

I am simply trying to do my best, to help you.

If either the human visitors or the bots are staying in the Stone Age as mentioned above, that could indeed cause problems.

Cloudflare has always, on the Free plan, been limited to “modern” browsers.

I’ve tested from multiple locations across the world, and there, your site appears to work just fine.

That said, I am under the impression that if there actually are errors within Cloudflare, it is in Cloudflare’s interest to get them fixed.

However, you have a lot of assumptions, guesses, and “I think”s around your claims, which makes it hard (if not impossible) to dig deeper into the case, as your site actually appears to work just fine.

In other words: how exactly do we reproduce the situation you appear to be seeing?

Can you share the domain of that other website?

That way, we could at least look into how and what differs between the one website and the other.


Thanks for the suggestions.

But in the past I had also activated TLSv1 legacy connections, and it didn’t change the situation even with a Pro account, so I decided to cancel the subscription, since it’s useless with my amount of traffic. I then bought the ACM subscription, to support a lower SSL level, but after 3 months nothing had changed.

I would like to mention that old browsers don’t pass challenge pages, since CF blocks access by displaying a system-deprecation message; so even with legacy connections activated, old browsers don’t get through at all, although this traffic is still “counted” in CF stats.

Having changed many options in the dashboard, I’m pretty sure I can’t solve the situation in a short time, and as I said before, the only way to verify my doubts is to remove CF and move the website. There are also several posts in the community from users complaining about the same situation, even losing domain authority, and others who have abandoned the service permanently.

I’m convinced that not even CF’s engineers know the system deeply, and several websites may suffer from issues that are impossible to trace, except perhaps with external software such as Wireshark to collect packet captures for traffic analysis and see what happens; but that’s too difficult to implement in the real world.

Also, mine is a very simple website: no databases, no CMS software, just static HTML pages. The server is fully tested and works well with the latest technologies (HTTP/2 and TLSv1.3), and internally no SSL errors are shown or reported by either OpenSSL or curl, using the CF Origin certificate in Full (strict).
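That internal check can also be done from outside the server, handshaking straight to the origin IP while bypassing Cloudflare. A sketch, with `ORIGIN_IP` and `HOSTNAME` as placeholders for your own setup; if the origin uses a Cloudflare Origin Certificate (which is not publicly trusted), you’d load the Origin CA root instead of relying on the default trust store:

```python
import socket
import ssl

ORIGIN_IP = "203.0.113.10"   # placeholder
HOSTNAME = "example.com"     # placeholder

def make_verify_context(origin_ca=None):
    """Strict-verification TLS context; pass the path to the Cloudflare
    Origin CA root if the origin serves an Origin Certificate."""
    ctx = ssl.create_default_context()
    if origin_ca:
        ctx.load_verify_locations(origin_ca)
    return ctx

def check_origin(ip=ORIGIN_IP, hostname=HOSTNAME, origin_ca=None):
    """Handshake directly to the origin IP with the proper SNI and return
    the negotiated protocol; raises ssl.SSLError on certificate problems."""
    ctx = make_verify_context(origin_ca)
    with socket.create_connection((ip, 443), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
            return tls.version()  # e.g. "TLSv1.3"

# check_origin()  # requires network access to the origin
```

If this succeeds from several external networks while the proxied site misbehaves, that would at least separate an origin problem from an edge problem.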

This topic was automatically closed 15 days after the last reply. New replies are no longer allowed.