You will see the characters are not corrupted. The only difference is that in the second one I specified the meta charset in the head, BUT until the day I reported this problem it was never necessary to use a meta charset. All the browsers would simply guess the charset correctly.
The bug I reported previously, and that you already fixed, was related to the charset being specified in the HTTP header (not in the head of the HTML file). Now the same bug is happening, but with the meta charset tag in the head.
Hi @contato, your HTML is relying on a browser-level heuristic, which is not guaranteed to work in every browser. I checked the zone you mentioned in the previous topic (https://www.sitepor500.com.br/) and it works correctly.
I would recommend one of the following:
add a charset to the Content-Type header or a meta charset tag - you can also use a Transform Rule to add the desired charset to the Content-Type header (a minimal example follows this list)
disable HTML modification features such as minification, Server-Side Excludes, and Email Obfuscation
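Here is a minimal sketch of the first option from the origin's side (not Cloudflare-specific; the file name and port are just placeholders, and the page is assumed to be saved as UTF-8). Declaring the charset explicitly, either in the Content-Type header as below or with `<meta charset="utf-8">` in the head, means no browser or proxy ever has to guess:

```python
# Minimal sketch: an origin that states its charset explicitly instead of
# relying on browser heuristics. "index.html" and port 8080 are placeholders.
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Read the page as raw bytes; the file is assumed to be UTF-8 encoded.
        with open("index.html", "rb") as f:
            body = f.read()
        self.send_response(200)
        # The explicit charset in the Content-Type header is what removes the guesswork.
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

HTTPServer(("", 8080), Handler).serve_forever()
```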
Chrome is currently using a heuristic algorithm to guess that your encoding is windows-1252. There is no guarantee this is correct, and it can change in any Chrome release.
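For what it's worth, that kind of guessing is easy to reproduce outside the browser. A small sketch, assuming the third-party chardet package (not what Chrome actually uses, just an illustration that heuristic detection is a statistical guess):

```python
# Illustration only: heuristic encoding detection returns a guess plus a confidence,
# and the result depends entirely on the sample bytes it is given.
import chardet

# Portuguese text with accents, encoded as Windows-1252, with no declaration anywhere.
raw = "Configuração de páginas e acentuação".encode("cp1252")

print(chardet.detect(raw))
# Often something like {'encoding': 'Windows-1252', 'confidence': 0.7, ...} or ISO-8859-1,
# but nothing guarantees the guess, which is why an explicit charset is safer.
```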
Below is exactly the same file, but served by a direct hit on my server (there is a certificate error because the certificate is expired; just add an exception so you can open the link):
REMOVED THE LINK TO HIDE MY SERVER'S IP
You can clearly see that Cloudflare is messing with something it was not touching until a few days ago. Cloudflare is simply breaking compatibility for millions of websites - it does not matter whether users should rely on browser heuristics; what matters is that Cloudflare is breaking things that were working.
If he follows the recommendation and "lets you know how it goes", that still doesn't explain why the behavior changed depending on whether or not the website is being proxied.
If it were purely a change in browser behavior the proxy status wouldnât have an impact.
@contato the team is heads down building half-baked features for Developer Week. Perhaps when they are done releasing new features that don't work, they can go back and focus on existing ones which don't work.
While I understand that what you're suggesting here will likely resolve it for this user, and would be a better long-term solution, it still sounds like there was some kind of breaking change or regression here. It doesn't matter what the user can do to fix it - if something used to work with CF enabled and now doesn't, that's on Cloudflare.
If this is indeed a regression or breaking change, an incident should be raised and this investigated further and/or fixed or rolled back so that it doesn't impact customer sites.
@mabba @albert It's crazy how the Cloudflare support team doesn't give a damn about backwards compatibility. Clearly, 100% clearly, Cloudflare broke something.
Browsers have a very sophisticated algorithm to guess the encoding, and most of the time they guess it correctly. Despite the effort browser devs put into creating a reliable encoding-detection algorithm, CF simply decides to break it and doesn't give the browser any chance to guess the encoding correctly, since you already broke it… and it's nice of you to blame me, a nice touch.
CF is losing track of what it once represented. That's what always happens when a company gets big. Let's wait for the big competitors to get into this market. It's well known that Microsoft and Alphabet are implementing free CDN solutions in the near future over "privacy concerns"… so that's that.
Thank you for blaming me for this encoding issue you created, and thank you for devaluing the effort browsers make to guess the correct encoding… it's nice of you to simply break the characters and not give the browsers any chance to render the encoding correctly.
Both links I provided are proof that Cloudflare is the one to blame: they serve exactly the same file, one through Cloudflare and the other not. Anyone can easily reproduce it (a rough check is sketched below).
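For anyone who wants to check, a rough reproduction sketch (assuming Python with the requests package; the URLs below are placeholders, not my real links, since I removed the direct link above):

```python
# Rough check: fetch the same page through Cloudflare and directly from the origin,
# then compare the Content-Type header and the start of the body. Both URLs are placeholders.
import requests

PROXIED_URL = "https://example.com/page.html"   # placeholder: the Cloudflare-proxied hostname
ORIGIN_URL = "https://203.0.113.10/page.html"   # placeholder: direct hit on the origin server

proxied = requests.get(PROXIED_URL)
origin = requests.get(ORIGIN_URL, verify=False)  # the origin certificate is expired, per the post above

for name, resp in (("proxied", proxied), ("origin", origin)):
    print(name, resp.headers.get("Content-Type"), resp.content[:80])

# If the body bytes match but the rendered characters differ, the difference must come
# from the response headers or from HTML rewriting on the proxied path.
```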