I’m setting up my own Git server (hosted with Gitea) and it’s been going great so far. I’ve got it online and can finally access it from networks other than my own (LOL). There’s just one issue. (And I really do mean just one, since literally everything else about Cloudflare has been perfect.)
The issue is that when I’m cloning a big repository, such as a mirror of GCC, it fails. Details:
- I am not using classic port forwarding. I’m using Cloudflare Tunnel (cloudflared, formerly Argo Tunnel) to get my website online. This is for security reasons: I’ve been through a 6-day DDoS attack on my home network, and it really wasn’t pretty.
- My server computer is a Raspberry Pi 4B with 8 GB of RAM and four ARM Cortex-A72 cores clocked at 1.8 GHz. It stores all my data on a super-fast SanDisk micro-SD card and runs Ubuntu Server 22.04 LTS for arm64. (The storage medium is not the issue here; I verified that with the iotop and top tools across several tests while the problem was happening.)
- When I attempt to clone said big repository, Gitea accepts the client’s connection and THEN starts a long Git operation that packs all the data to be delivered to the client into a single pack file. Git runs flat out on just one CPU core while the micro-SD card keeps up just fine, so single-core performance is the bottleneck here. (That packing step cannot be multi-threaded.)
- Since said repository is so big, Git takes a REALLY long time to pack everything. (The clone worked when I connected to the server over my local network, since my router doesn’t impose any timeout.) Cloudflare then assumes the server has stopped responding (it hasn’t; the web UI is still responsive), because Gitea accepts the connection but sends nothing back until the pack file is ready. About sixty seconds of anxiety later, the client’s Git executable exits, reporting that the Cloudflare proxy returned HTTP 524. The clone fails outright, not even partially, since the pack file never even started downloading to the client’s machine. (A rough timing sketch of what the proxy sees follows this list.)
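Here’s a rough Python sketch, just to illustrate what I mean (it isn’t part of my setup, and it assumes the third-party requests package): it measures how long a given URL takes to return its response headers and its first body byte, which is roughly the window the proxy is timing. During a real clone the slow request is the POST to .../git-upload-pack that Git issues after fetching the ref list, so this doesn’t reproduce a full clone; it just shows what “no response until the pack is ready” looks like from the proxy’s side.

```python
# ttfb.py -- rough diagnostic sketch, not part of Gitea or Cloudflare.
# Usage: python3 ttfb.py <url>
# Cloudflare's 524 means the origin accepted the connection but never returned
# an HTTP response within the proxy's window, so the interesting number is how
# long the origin takes to start answering.
import sys
import time

import requests  # third-party: pip install requests

url = sys.argv[1]

start = time.monotonic()
# stream=True makes requests return as soon as the headers arrive instead of
# waiting for the whole body.
with requests.get(url, stream=True, timeout=600) as resp:
    headers_after = time.monotonic() - start
    # Pull a single body byte; this returns as soon as the origin sends one.
    next(resp.iter_content(chunk_size=1), b"")
    first_byte_after = time.monotonic() - start
    print(f"status {resp.status_code}: headers after {headers_after:.1f} s, "
          f"first body byte after {first_byte_after:.1f} s")
```

Running it against the clone URL on my LAN just shows a long wait, while running it against the proxied hostname should get cut off by Cloudflare and come back as status 524, which is exactly what my Git client is hitting.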
I have done some research, and apparently this doesn’t happen on the Enterprise plan. I currently can’t afford any plan other than Free, and I am not a company LOL. So, would it be possible for Cloudflare to make an exception for my (sub)domain, or to lift the timeout limitation for everyone? Technically speaking, keeping these so-called “timed-out” connections alive on the (super-fast) proxy servers shouldn’t hurt, since there’s no data flowing through them yet. (Or I’m misunderstanding and that’s exactly how DDoS attacks work. Please clarify.)
I’m also new to all this server-hosting stuff, but I have touched it before and understand what TCP/IP is and how the Internet works (DNS, the difference between HTTP and HTTPS, etc.). I’m not sure whether this is a Cloudflare Tunnel issue or a Cloudflare proxy issue, but I’m pretty sure it’s the latter. Correct me if I’m wrong, and please throw in a possible solution while you’re at it.
My website is https://gitea.hdg57.eu.org. The root domain, https://hdg57.eu.org, is not active yet.