Hello, I recently set up a Cloudflare live stream and was testing playback from the SRT playback URL (the srtPlayback attribute in the live stream API) while ingesting media to the input over either RTMPS or SRT. In both cases the input stream contains H.264 video and AAC audio. I can play the stream back with both audio and video over the RTMP playback URL, but when I play it back using the SRT playback address, the video works and there is no audio. Is that expected?
This is a sample GStreamer pipeline used to broadcast to my Cloudflare stream:
gst-launch-1.0 \
  mpegtsmux name=mux ! queue ! srtsink 'uri=<my srt url>' \
  videotestsrc ! video/x-raw,height=360,width=640 ! videoconvert ! x264enc tune=zerolatency ! video/x-h264,profile=high ! queue ! mux. \
  audiotestsrc ! audioconvert ! fdkaacenc ! queue ! mux.
And this is the playback command I used:
gst-play-1.0 '<my srt playback url>'
I have also tested similar broadcasts using FFmpeg and OBS; in every case the SRT playback stream has no audio while the RTMPS playback stream carries audio as expected.
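For reference, an FFmpeg broadcast roughly equivalent to the GStreamer pipeline above (synthetic test sources, H.264 video and AAC audio muxed into MPEG-TS over SRT) would look like this sketch; it assumes an FFmpeg build with SRT support, and the URL placeholder stands in for the real ingest address as before:

ffmpeg -re \
  -f lavfi -i testsrc=size=640x360:rate=30 \
  -f lavfi -i sine=frequency=440 \
  -c:v libx264 -tune zerolatency -profile:v high \
  -c:a aac \
  -f mpegts '<my srt url>'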