What amazes me is the lack of concern on latency for web streaming.
We use the internet (and IP in general) to stream video. At high bitrates (200 Mbit/s+) we aim for sub-100ms end to end; for compressed services we're happy with 500ms, maybe up to a second if it's something like Sydney to London over the internet.
I was in a control room a couple of weeks ago watching some football. There were two displays: one was the feed from the stadium, the other was the feed from the web streaming service.
There were cheers and then groans from the live end of the room. Nearly a minute later, on the web feed, someone started running up the field to score. Of course I knew at that point that it wouldn't be a goal, as not only did the people watching the live feed tell me, but Twitter was abuzz.
One minute of end-to-end delivery latency is shocking for this type of program. Heck, 10 seconds is bad enough.
There are two different kinds of latency at play here:
1. network latency, measured in milliseconds, which affects stream quality and stability
2. the delay (lag) between real-time capture and what the end user is seeing, which is usually measured in seconds. A stream needs to be ingested, transcoded, and sent from distribution servers to edge servers in each target region, with each step adding to the delay.
Minimizing the lag is very hard, because stripping out all the buffers (to reduce the delay) makes the stream very sensitive to network conditions (which reduces quality). With most commercial CDN providers you will get 5 to 10 seconds. It can be reduced to 2 to 3 seconds if you know what you're doing.
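To make that buffer/latency trade-off concrete, here's a rough back-of-the-envelope sketch. The helper function and every stage delay in it are my own illustrative assumptions, not anyone's measured figures:

```typescript
// Rough glass-to-glass latency budget for a segmented HTTP live stream.
// Every number below is an illustrative assumption, not a measurement.
interface StageDelays {
  captureAndEncodeMs: number;    // camera, encoder look-ahead, GOP buffering
  ingestMs: number;              // contribution link to the origin
  transcodeMs: number;           // ABR ladder transcode + packaging
  originToEdgeMs: number;        // CDN distribution to the edge
  segmentDurationS: number;      // length of each HLS/DASH segment
  segmentsBuffered: number;      // segments the player holds before playback starts
}

function glassToGlassSeconds(d: StageDelays): number {
  const serverSideS =
    (d.captureAndEncodeMs + d.ingestMs + d.transcodeMs + d.originToEdgeMs) / 1000;
  // The player buffer usually dominates: it waits for N whole segments.
  return serverSideS + d.segmentDurationS * d.segmentsBuffered;
}

// A typical commercial CDN setup: 2 s segments, 3 of them buffered.
console.log(glassToGlassSeconds({
  captureAndEncodeMs: 600, ingestMs: 200, transcodeMs: 500,
  originToEdgeMs: 200, segmentDurationS: 2, segmentsBuffered: 3,
})); // ~7.5 s

// A tuned setup: 1 s segments, only 2 buffered (much less tolerant of jitter).
console.log(glassToGlassSeconds({
  captureAndEncodeMs: 300, ingestMs: 150, transcodeMs: 250,
  originToEdgeMs: 100, segmentDurationS: 1, segmentsBuffered: 2,
})); // ~2.8 s
```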
Edit: in case anyone is interested - in the second scenario, where we achieved the 2 to 3 second broadcast lag vs real time, the stream source (ingestion) was in the US and the viewers were in mainland China. Network latency was over 600 ms. It wasn't easy!
60 seconds in the case of iPlayer for the FA Cup, and 1m20s in the case of the BBC News channel right now. HLS tends to be packetised in something like 15-second chunks at the end of the process.
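Which is roughly what the segment arithmetic predicts: the HLS spec recommends that live players start playback about three target durations behind the live edge, so with long segments the client buffer alone eats most of a minute. A quick sketch (the upstream figure is my own guess):

```typescript
// Why 15-second HLS segments imply close to a minute of delay.
const segmentDurationS = 15;
const targetDurationsBehindLive = 3;  // HLS spec recommendation for live playback
const playerBufferS = segmentDurationS * targetDurationsBehindLive;  // 45 s

const upstreamS = 12;  // encode + package + CDN propagation (illustrative guess)
console.log(`${playerBufferS + upstreamS} s glass-to-glass`);  // ~57 s
```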
I know why there's a delay; I'm just amazed that people aren't concerned about it. The BBC used to offer multicast sources of live TV, which is a far more sensible solution: far more bandwidth-efficient, and it allows end-to-end latency in the satellite range (or even less).
Wowza did a talk at Demuxed last year about how to do "3 second latency end to end at scale", which I found amusing given that TV people have been doing sub-millisecond latency at scale for nearly 100 years. So at least some people in the industry recognize the problem (which mainly matters for sports events).
Hey, I was also at Demuxed last year! I help maintain Hls.js; it should have some form of LHLS support soon(tm). There's no standard yet, so actual adoption will be tough (current implementations use non-standard EXT-X tags to signal LHLS), but pretty much everyone does the same thing: early signaling of segments, chunked transfers, and, on the client side, progressive demuxing. HTTP-based solutions typically achieve the aforementioned 3 seconds, but the people talking about LHLS now (Akamai and Wowza) have their own protocols; it remains to be seen what the rest of us will get.
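For anyone curious what the player-side knobs look like, here's a minimal sketch of tuning hls.js to sit closer to the live edge. The option names (lowLatencyMode, liveSyncDurationCount, liveMaxLatencyDurationCount) are my reading of the config surface, not part of the LHLS work described above, and the stream URL is made up; check the docs for your version:

```typescript
import Hls from 'hls.js';

// Minimal sketch: configure hls.js for a live stream with a small buffer.
// Treat the option names as assumptions and verify against the hls.js docs.
const video = document.querySelector('video') as HTMLVideoElement;

if (Hls.isSupported()) {
  const hls = new Hls({
    lowLatencyMode: true,            // use partial-segment / chunked loading where available
    liveSyncDurationCount: 2,        // target ~2 segment durations behind the live edge
    liveMaxLatencyDurationCount: 4,  // jump forward if playback drifts further than this
    enableWorker: true,              // demux off the main thread
  });
  hls.loadSource('https://example.com/live/stream.m3u8'); // hypothetical URL
  hls.attachMedia(video);
}
```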
Twitch has recently implemented LHLS (it looks like a Periscope-style implementation) and I was seeing 1.2s glass-to-glass.
I hate that when I try to stream a game for my friends, they don't see what I did until 10-15s later. I'm trying to have discussions with them in real time on our voice chat server, and any input they give me is inherently dated. Mumble gives extremely low latency, so achieving this kind of thing over the internet isn't exactly impossible; the bandwidth requirements of video would probably make it a bit more iffy, but it should still be doable, with some occasional issues.
Maybe I just need to get away from the public streaming services which use HLS, switch to UDP streams and sub-1s buffer sizes.
Honestly, I think people are concerned, but everyone's experience to date with big streaming events has been that something invariably goes wrong: people can't connect, login issues, the event won't start, buffering, etc. (e.g. the McGregor-Mayweather PPV). If you can get a high-quality stream at all, perhaps the time shift is secondary!
I usually only deal with the distribution side as an end user, and personally I tend to watch about two live events a year (I'd watch New Year's, but that's clearly pointless, so that leaves Eurovision and maybe an election program), so I don't have much experience with that side.
It did amuse me when we were looking at latency for a program over a ropey bit of connectivity on which we were using ARQ. We were discussing whether we could push the latency up from 2 seconds to 6 seconds (it kept dropping out for 2 or 3 seconds at a time), since it's sport. Then we realised there was a good 30-40 seconds downstream before it even left for the CDN!
I still don't understand half of what Streampunk [1] are trying to do with their NMOS grain workflows, but they are talking about sub-frame HTTP units.
This is not an approach that supports line-synced timing and may not be appropriate for live sports action that requires extremely low latency.
However, for many current SDI workflows that can tolerate a small delay, this approach is sufficient.
With UHD you're talking about 20 MBytes for a single frame, with each "grain" (a subdivision of a frame) being on the order of a millisecond/megabyte.
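Quick sanity check on those numbers (a rough sketch; I'm assuming 10-bit 4:2:2 UHD at 50fps and a grain of roughly a twentieth of a frame, which may not be exactly what Streampunk use):

```typescript
// Back-of-the-envelope size/timing for uncompressed UHD video.
// Assumptions: 3840x2160, 10-bit 4:2:2 (~2.5 bytes per pixel), 50 fps.
const width = 3840;
const height = 2160;
const bytesPerPixel = 2.5;   // 10-bit 4:2:2 is about 20 bits per pixel
const fps = 50;

const bytesPerFrame = width * height * bytesPerPixel;  // ~20.7 MB per frame
const frameDurationMs = 1000 / fps;                    // 20 ms per frame

// If a "grain" is roughly a twentieth of a frame, each one is on the order
// of a megabyte covering about a millisecond of video.
const grainsPerFrame = 20;
console.log((bytesPerFrame / grainsPerFrame / 1e6).toFixed(2), 'MB per grain');
console.log((frameDurationMs / grainsPerFrame).toFixed(1), 'ms per grain');
```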
I think I prefer this approach to the SMPTE 2110 approach, to be honest, especially given the timing windows that 2110 requires (it doesn't lend itself well to a COTS virtualised environment when your packets have to be emitted at a specific microsecond).
In the US we've seen a half-dozen over-the-top TV providers launch in the past couple of years. They all seem to understand that live sports is their core feature, yet none of them tries to compete on latency. Two or more minutes regularly pass between reading tweets about a goal or touchdown (from broadcast or cable viewers) and seeing that play "live" on Sling / PSVue / DirecTV NOW / YouTube TV (I've tried them all). Second-screening live sports is impossible with the OTT apps, and it's very likely to drive me back to Comcast.
There's a real opportunity for a sports-oriented OTT company to compete on latency, a DVR that actually works, and expansive rights (I never have to guess if I have access to any sporting event).
For games like American football, there are only 11 minutes of actual action, but it takes 3+ hours to complete the game[0]. That leaves a lot of time to watch a second screen.
Second-screening, even in things like drama, is very popular.
If you're watching a sports game, I can see the appeal of a second stream (perhaps curated) with easy-to-access stats on that game, or a different angle from the one the director thinks you want, or whatever.
I don't see the appeal of a second stream for drama, but for things like sport, yes.
I stream a lot of sports from home. I would gladly trade delays for stream stability/quality. I frequently have to switch from a legal stream that I pay for to a more robust illegal stream.
It would be annoying if there were multiple devices nearby on different delays, but for me, in my living room, I don't care whether it's 30 seconds or 3 minutes. I've had things spoiled by Twitter feeds a few times, but it's not the end of the world.
"Live" TV broadcasts are also on a delay, so I guess it's all relative. I had a friend who lived near enough to an NFL stadium that you could hear when something big happened and the TV delay made it impossible to enjoy a home game there.
Very minor delay in the UK -- well under 2 seconds from pitch to TV on DTT in the case of the FA Cup. Sure, enough to cause some grief when you can hear the crowd, but not enough to get notifications on Twitter or whatever.
Interesting factoid: some press have agreed to intentional latency as a security feature around certain events and dignitaries. The theory being that a sniper or drone operator cannot use the "live TV" footage to know their exact position.
In some cases international viewers may see the "live" footage before local ones.
> The theory being that a sniper or drone operator cannot use the "live TV" footage to know their exact position.
How is this helpful in either of these cases? A sniper needs eyes on, a drone operator probably has live video from the drone. I don't see how a TV delay would have any effect.
For over 20 years now, the German TV channel RTL has been about 15 seconds behind all the other channels on live events like Formula 1. But I have no idea why that is.