First live stream testing - lots of buffering?

I just created a free Cloudflare account and added the $5 Stream subscription to do some testing - we are looking for alternatives to Wowza as a backup.

When doing a 6 Mbit 1080p livestream, I see a lot of buffering and eventually very high delay in the players (measured with THEOplayer's Player Statistics; same results on my Cloudflare dashboard with the Cloudflare player). I ingest from Germany over a 100 Mbit uplink, and streaming to Wowza (Frankfurt) and our own servers (Munich) works perfectly.
As far as I understand, I cannot tweak any settings for the transcoders. It looks like the 2-second HLS segments are just not delivered fast enough through the CDN.

I also find it annoying not to know the m3u8 playlist link beforehand, since it changes every time I start a stream. Do I really have to keep a script running on one of our servers that polls the Cloudflare API to figure out the playlist link once the stream starts? So there is no way to have a simple player page using THEOplayer or Video.js somewhere that starts playing once the stream is active?
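For anyone stuck with this workaround, a polling script can be fairly small. The sketch below is a minimal example, assuming the Stream API's live-input videos endpoint (`/accounts/{account_id}/stream/live_inputs/{live_input_id}/videos`), the `status.state` field value `live-inprogress`, and the `videodelivery.net` playback URL pattern - all of these should be verified against the current Cloudflare Stream docs:

```python
import json
import time
import urllib.request

# NOTE: the endpoint path, response field names, and the playback URL
# pattern below are assumptions based on the Cloudflare Stream API docs;
# verify them before relying on this.
API_BASE = "https://api.cloudflare.com/client/v4"

def extract_live_playlist(payload):
    """Scan a /videos response for an in-progress broadcast and
    return its HLS playlist URL, or None if nothing is live."""
    for video in payload.get("result", []):
        state = (video.get("status") or {}).get("state")
        if state == "live-inprogress":
            return f"https://videodelivery.net/{video['uid']}/manifest/video.m3u8"
    return None

def poll_for_playlist(account_id, live_input_id, api_token, interval=5.0):
    """Poll the Stream API until the live input starts broadcasting,
    then return the playlist URL to hand to the player page."""
    url = f"{API_BASE}/accounts/{account_id}/stream/live_inputs/{live_input_id}/videos"
    req = urllib.request.Request(
        url, headers={"Authorization": f"Bearer {api_token}"}
    )
    while True:
        with urllib.request.urlopen(req) as resp:
            playlist = extract_live_playlist(json.load(resp))
        if playlist:
            return playlist
        time.sleep(interval)
```

The player page could then fetch the discovered URL from your server and hand it to THEOplayer or Video.js, but it is still an extra moving part compared with a stable URL.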

Any suggestions or anything I can do about that?

I’m surprised @michael hasn’t tried this out already. Maybe this will be his big weekend activity.

The issues described are very similar to the ones I encountered running short-term tests putting audio-only streams behind CF.

I'd be interested to see the responses and other people's experience with the Stream Live project.

I did some tire kicking last night, but mostly to see what I could do with the default player's VAST tags. I'm not really set up to do live streaming from my laptop. (Amazingly, I don't happen to have an enterprise-grade video encoder and a load of SDI sources in my home office right now.)



The buffering described here was what I was seeing too. I tested 4 Mbps, 6 Mbps, and 8 Mbps streams. Ingesting with OBS went fine, but watching the stream was a buffering mess, even at low quality in the video player. I then had others confirm that they saw buffering too.

I also find this extremely limiting.

We are working to fix the buffering issue.

And agreed, being able to have a non-changing m3u8 URL would be useful. At the moment, the upside of creating different video IDs (and m3u8 URLs) is that the recorded video can be watched with the same video ID after the event. The trade-off is that you don't have a non-changing URL for watching the live feed. We are looking at ways to address this.


Echoing what Zaid said above:

  1. Buffering is going away. Agree this is unacceptable. We're working on it and will post updates here.
  2. There will be a way to get URLs beforehand that point to the latest live video. You will have the option to choose: a URL that converts to VOD (and stays the same as the live one), or a URL for the latest live session.

Is there any ETA on a fix for this? I’ve tested again with my gigabit internet connection and it’s still a problem. It doesn’t feel production ready. I see so much potential here!


From my experience with streaming platforms, the buffering happens not because there's an issue with CF, but because the player keeps switching the bitrate of the live stream like crazy! :-) Either the player needs to switch between bitrates smoothly and automatically, or the viewer has to pick one manually depending on the internet speed available on their device.
I tried two different players with our live stream, the commercial JW Player and THEOplayer, and both have the same issue. But when I appended `?clientBandwidthHint=1.8` to the end of the HLS URL, playback worked with no issues at all - though you are then locked to a single bitrate.
For us that's fine, but we hit a problem when we used this technique in our apps: the apps won't play the URL because it ends with something other than .m3u8. As I said, though, the link works with both JW Player and THEOplayer without issue.
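For reference, appending that hint can be done robustly even when the URL already carries query parameters. A minimal Python helper (the `clientBandwidthHint` parameter name comes from the post above; the example URL in the docstring is illustrative, not an official playback URL):

```python
from urllib.parse import urlencode, urlparse, urlunparse, parse_qsl

def with_bandwidth_hint(hls_url, mbps):
    """Append Stream's clientBandwidthHint parameter (value in Mbps)
    to an HLS URL, preserving any query parameters already present.

    e.g. with_bandwidth_hint("https://example.com/manifest/video.m3u8", 1.8)
    """
    parts = urlparse(hls_url)
    query = parse_qsl(parts.query)           # keep existing parameters
    query.append(("clientBandwidthHint", str(mbps)))
    return urlunparse(parts._replace(query=urlencode(query)))
```

Note that the resulting URL no longer ends in `.m3u8`, which is exactly the problem described above for apps that sniff the file extension instead of the content type.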
Also, people streaming 24/7, especially through apps, need a static HLS URL; otherwise there's no point offering live streaming for real productions!


Hey! I’ve got some updates for this thread. We’ve identified a number of issues that are causing buffering and have started fixing them one by one over the 10 days this product has been available.

The biggest improvement we’ve made so far is reducing segment loading latency from 2-3 seconds to about 50-100 milliseconds. In addition, we’ve identified and fixed an issue where buffering could occur when the input frame rate is not 30 frames per second.

More to come here, and we will continue to post updates, including a static link that points to the live broadcast regardless of the recording.