Low Latency Sports Streams

Sunday, September 22, 2019 10:35:55 PM

Currently powering brands around the globe, and available either as a low-latency sports streaming service or as software for on-premise deployment. Don't hesitate to get in touch! Performance: to create an outstanding experience for your users, the quality of service needs to be consistent across the board. Speed, stability, compatibility: all are fundamental to today's products. DALS has you covered.


Video Latency in Live Streaming

This post is part of our series on video encoders. Many streaming newcomers are used to tools like Skype or FaceTime that allow them to collaborate with others in real time and feel a lot like talking on the phone or in person.

So why is streaming different? Streaming vs. conferencing: although this distinction may seem trivial, it becomes very important when the number of participants or viewers scales to a large number. Keeping the delay between participants low enough for collaboration requires tightly-coupled computing services, and tightly-coupled services do not scale to large numbers of participants.

Video Distribution Service (VDS): though a VDS can take many forms, it is essentially responsible for taking one or more incoming streams of video and audio from a broadcaster and presenting them to viewers.


This includes what is commonly referred to as a Content Delivery Network (CDN). Transcoding: the process of decoding an incoming media stream, changing one or more of its parameters (e.g., resolution or bitrate), and re-encoding it. Adaptive Bitrate Streaming (ABR): ensures that viewers on many kinds of devices with different capabilities and varying internet access can smoothly play a media stream.
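As a rough sketch of what ABR involves, a player chooses among several pre-encoded "renditions" of the stream based on its measured bandwidth. The rendition ladder and the `pick_rendition` helper below are illustrative assumptions, not taken from any particular product:

```python
# Illustrative ABR "ladder": the quality levels a VDS might transcode
# an incoming stream into. All values here are hypothetical examples.
RENDITIONS = [
    {"name": "1080p", "height": 1080, "bitrate_kbps": 6000},
    {"name": "720p",  "height": 720,  "bitrate_kbps": 3000},
    {"name": "480p",  "height": 480,  "bitrate_kbps": 1500},
    {"name": "360p",  "height": 360,  "bitrate_kbps": 800},
]

def pick_rendition(measured_kbps, renditions=RENDITIONS, headroom=0.8):
    """Pick the highest-bitrate rendition that fits within the viewer's
    measured bandwidth, leaving some headroom; fall back to the lowest."""
    for r in sorted(renditions, key=lambda r: r["bitrate_kbps"], reverse=True):
        if r["bitrate_kbps"] <= measured_kbps * headroom:
            return r
    return min(renditions, key=lambda r: r["bitrate_kbps"])

print(pick_rendition(5000)["name"])  # a 5 Mbps connection gets "720p"
```

Real players re-measure bandwidth continuously and switch renditions mid-stream; this sketch only shows a single selection.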

Does Latency Always Matter? If you assume that your viewers are not sitting in the seats at your live event, then latency may actually not be that important. However, sometimes latency is an issue.

For example, live attendees may be tweeting updates, or you may be providing live score and stat updates for a sporting event. If your latency is too long, viewers may read about something before they see and hear it happen, which is not ideal.

So, we should try to keep the latency as low as possible. If you are choosing a streaming technology and latency is a potential concern for you, the main decision you must make is a tradeoff between scalability and low latency. If low latency is more important, you should choose a technology that provides it, at the cost of potentially inhibiting future viewership growth.

If broad viewership is more important, you should choose a technology that supports broad scalability at the expense of higher latency.


What Causes Latency? Capture and processing: more advanced systems such as video mixers introduce additional latency for decoding, processing, re-encoding, and re-transmitting. Your video capture and processing requirements determine this value, which can range from extremely low (thousandths of a second) to values closer to the duration of a video frame.
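For intuition on the "duration of a video frame" bound, frame duration is simply the reciprocal of the frame rate. A tiny hypothetical helper (assuming a constant frame rate) makes the numbers concrete:

```python
def frame_duration_ms(fps):
    """Duration of a single video frame in milliseconds -- a rough upper
    bound on the per-frame latency of capture, as described above."""
    return 1000.0 / fps

print(frame_duration_ms(30))  # about 33.3 ms per frame at 30 fps
print(frame_duration_ms(60))  # about 16.7 ms per frame at 60 fps
```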

Encoding: changing encoding parameters can lower this value at the expense of encoded video quality. Minimum: about 1 millisecond. Maximum: about milliseconds. Transmission: the encoded video takes time to transmit over the Internet to a VDS.


This latency is affected by the encoded media bitrate (lower bitrate usually means lower latency), the latency and bandwidth of the internet connection, and the proximity over the Internet to the VDS.
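As a back-of-the-envelope sketch, the transmission delay for a chunk of encoded video can be approximated as its serialization time (size divided by uplink bandwidth) plus one-way propagation. The payload size, uplink speed, and RTT below are illustrative numbers, not measurements:

```python
def transmit_time_ms(payload_bytes, uplink_kbps, rtt_ms):
    """Rough time to deliver a chunk of encoded video to the VDS:
    serialization delay plus one-way propagation (approximated here
    as half the round-trip time)."""
    # 1 kbps = 1 bit per millisecond, so bits / kbps yields milliseconds.
    serialization_ms = payload_bytes * 8 / uplink_kbps
    return serialization_ms + rtt_ms / 2

# e.g. a 25 KB encoded frame over a 5 Mbps uplink with 40 ms RTT:
print(round(transmit_time_ms(25_000, 5_000, 40), 1))  # -> 60.0 ms
```

This illustrates why lower bitrates usually mean lower latency: the serialization term shrinks proportionally with payload size.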

Minimum: about milliseconds. Maximum: hundreds of milliseconds. Jitter Buffer: since the internet is a massively connected series of digital communication routes, the encoded video data may take one of many different routes to the VDS, and this route may change over time.

Because these routes take different amounts of time to traverse (and the data may be queued anywhere along the route), it may arrive at the VDS out of order.


A special software component called a jitter buffer re-orders the arriving data so that it can be properly decoded. When configuring the jitter buffer, one must choose a maximum time boundary inside of which data can be reordered.
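The reordering behavior described above can be sketched in a few lines. This is a minimal illustration only; a real jitter buffer also tracks per-packet arrival times against the configured window and copes with lost packets:

```python
import heapq

class JitterBuffer:
    """Minimal jitter-buffer sketch: packets may arrive out of order, so
    we hold them briefly and hand them to the decoder in sequence order."""

    def __init__(self, max_delay_ms=100):
        # The reordering window: larger values tolerate more reordering,
        # but add that much latency to the stream.
        self.max_delay_ms = max_delay_ms
        self._heap = []  # (sequence_number, payload) pairs

    def push(self, seq, payload):
        heapq.heappush(self._heap, (seq, payload))

    def drain(self):
        """Yield buffered packets in sequence order."""
        while self._heap:
            yield heapq.heappop(self._heap)

buf = JitterBuffer(max_delay_ms=100)
for seq, data in [(3, "c"), (1, "a"), (2, "b")]:  # arrived out of order
    buf.push(seq, data)

print([seq for seq, _ in buf.drain()])  # -> [1, 2, 3]
```

The `max_delay_ms` parameter is exactly the "maximum time boundary" discussed above: it is both the buffer's reordering tolerance and its latency cost.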

This time boundary determines the latency of the jitter buffer. Adaptive Bitrate: in order to provide a quality viewing experience across a range of devices, a good streaming provider should offer ABR. There are two general ways to accomplish this: either the encoder streams multiple quality levels to the VDS (which are directly relayed to viewers), or the encoder sends a single high-quality stream to the VDS, which then transcodes and transrates it to multiple levels.

Typically, the transcoding and transrating takes about as long as a "segment" of encoded video (more about segments later), but it can be faster at smaller resolutions and lower bitrates. Protocols: streaming delivery protocols generally fall into two classes, non-HTTP-based streaming protocols and HTTP-based segmented protocols. The two differ in their latency and their scalability.

Understanding these differences is integral to choosing a streaming solution.

Stream live sports with low latency

Non-HTTP-based protocols can potentially be very low-latency (as low as the network latency from the VDS to the viewer); however, their support for adaptive streaming is spotty at best, and scaling these protocols to large numbers of viewers becomes very difficult and expensive. HTTP-based protocols, by contrast, have built-in support for adaptive playback and broader native support on mobile devices.

HTTP-based protocols work by breaking up the continuous media stream into "segments" that are typically a few seconds long. These segments can then be served to viewers by a standard web server or content delivery network. HTTP-based protocols are generally better suited to most live streaming scenarios due to better feature support and scalability.

However, the disadvantage of these protocols is that the latency is at least as long as the segment length, and can be several times the segment length (for example, iOS devices buffer multiple segments before even beginning to play the video). Playback: the final stage of latency is determined by the capabilities of the viewing device.
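The "segment length times buffered segments" floor is easy to compute. The 6-second segment length and 3-segment player buffer below are illustrative assumptions, not measured values:

```python
def segment_latency_s(segment_s, buffered_segments):
    """Lower bound on the delay added by segmented HTTP delivery:
    the player waits for `buffered_segments` full segments of
    `segment_s` seconds each before playback can even start."""
    return segment_s * buffered_segments

# e.g. 6-second segments with a player that buffers 3 of them
# adds at least 18 seconds before playback begins:
print(segment_latency_s(6, 3))  # -> 18
```

This is why shortening the segment duration is one of the most direct levers for reducing latency in HTTP-based streaming.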

Minimum: about 33 milliseconds. Maximum: hundreds of milliseconds. Putting It Together: a streaming solution that uses non-HTTP-based protocols can achieve a lower latency; per our estimates above, latency will likely be on the order of a second or two. However, this solution will not scale well beyond roughly 50 simultaneous viewers.

A streaming solution that uses HTTP-based adaptive bitrate mechanisms will have a higher latency range, starting at a few seconds at best. Realistically, it will typically be in the 15-45 second range.
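One way to see how the stages discussed above add up to a latency in this range is a simple budget. Every number below is a hypothetical illustration chosen within the ranges given earlier, not a measurement:

```python
# Hypothetical end-to-end latency budget for an HTTP-based stream,
# summing the stages discussed above. All values are illustrative.
budget_ms = {
    "capture":       33,      # roughly one frame at 30 fps
    "encoding":      100,
    "transmission":  100,
    "jitter_buffer": 200,
    "segmenting":    18_000,  # e.g. 3 buffered 6-second segments
    "decode_play":   100,
}

total_s = sum(budget_ms.values()) / 1000
print(f"estimated end-to-end latency: {total_s:.1f} s")
```

Note how the segmenting term dominates everything else combined, which is why segment handling is where most low-latency streaming work is focused.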


Since this approach uses HTTP-based mechanisms that can leverage off-the-shelf CDNs, it can theoretically support a very large number of simultaneous viewers without difficulty. What are my next steps?

Some attributes of your total latency may be within your control. Your encoder settings, the jitter buffer, the transcoding and transrating profiles, and segment duration may be configurable.

At BoxCast, we take great pains to automate as many of these choices as possible to maximize the stream quality and ensure a delightful viewer experience. Preview is a BoxCast feature that lets you see exactly what will be broadcast to the world.

But instead of the normal delay it takes to prepare your video to be streamed, Preview shows you what your broadcast will look like to your viewers with only a few seconds of latency.


Copyright © 2019