EBU BroadThinking 2016 – The Year When Broadcasters Became Multidisciplinary Scientists // DAY 1 Report

EBU’s BroadThinking conference is always a good spot to prepare for a downstream NAB, IBC or HbbTV Symposium later in the year, as it allows you to spot the challenges, the limitations and the false promises of the never-ending video technology shift. When you arrive in Vegas straight after a BroadThinking like the 2016 edition, you’re obviously excited because there’s a lot of interesting new stuff emerging – but you’ve also been warned that only a few of these shiny new technologies will be deployed in the short term, for reasons of complexity, maturity or even relevance. Passing through BroadThinking’s X-ray, innovation gets slimmer, but also safer for early implementers, as the complexity is tamed a bit through knowledge sharing and guild discussions.

EBU BroadThinking 2016

But this year, we came to realize that producing a modern multiscreen video experience isn’t just about embracing the most mature emerging technologies, year after year, in a smooth, granular evolution. Nowadays it actually requires mastering a broad set of skills that are less and less native to broadcasters, all kinds of high-end activities like big data analysis, complex IT/network deployments, fine-grained ABR heuristics, panoramic video, user behavior analysis, security policy enforcement or cloud application development [and a lot more]. Yes, you’re not a broadcaster anymore, you’re an accomplished multidisciplinary scientist…


NPO's Platform

This was quite well summarized in the excellent keynote of NPO’s R&D Manager Egon Verharen. NPO is a public broadcaster in the Netherlands, where 80% of the population has access to connections above 100 Mbps and is thus hungry for OTT services. Over the years NPO has grown its internal platform to support multiscreen delivery to a lot of devices – including connected TVs and HbbTV – ranging from live services to on-demand on npo.nl and nos.nl and SVOD on NLZiet. The resulting architecture is complex and hard to evolve, given the number of delivery formats, with many challenges around platform growth sustainability, end-to-end video quality, content protection and statistics. So far NPO has been tackling the ever-bigger multiscreen challenge by doing everything internally, but with multiple DRMs to deal with, for example, the IT department is required to provide expertise well outside the usual broadcaster track. With the overall architecture getting too heavy and complex, NPO is exploring outsourcing options, as was once done for broadcast playout.

Egon shared how they overcame some of the challenges they faced, starting with live streaming to Android devices (“still a big mess”), solved in 2015 through MPEG-DASH produced by Unified Origin and played in an ExoPlayer-based app. They used the DVB-DASH profile as it allowed them to easily switch from live to VOD. The DASH implementation took them three months from experiment to production, with no changes needed afterwards. Egon’s strong conclusion on this topic was “Start using DASH, it’s easier than you think!”
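Switching between live and VOD under DASH largely comes down to the MPD@type attribute. As a minimal sketch (the manifests below are invented for illustration, not NPO’s actual ones), a client or validation script can tell the two apart like this:

```python
import xml.etree.ElementTree as ET

# Illustrative minimal MPDs -- real DVB-DASH manifests carry many more attributes.
LIVE_MPD = ('<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="dynamic" '
            'availabilityStartTime="2016-03-01T00:00:00Z" minimumUpdatePeriod="PT2S"/>')
VOD_MPD = ('<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="static" '
           'mediaPresentationDuration="PT30M"/>')

def is_live(mpd_xml: str) -> bool:
    """A DASH MPD signals live with type="dynamic"; VOD uses type="static" (the default)."""
    root = ET.fromstring(mpd_xml)
    return root.get("type", "static") == "dynamic"
```

Everything else (DVR window handling, manifest refresh) follows from that one flag, which is what makes the live/VOD switch comparatively painless.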


DASH client seamless switching

Well, not quite as easy as HLS after all, but that was a good introduction to the DASH section of the conference, starting with IRT’s Martin Schmalohr who walked us through the requirements of producing DASH for multiple HbbTV platforms, namely v1.5 with HbbTV-DASH and v2.0 with DVB-DASH. One major difference between these two and the rest of the DASH profiles used on desktop and mobile is the need for a unique initialization segment common to all representations in one adaptation set. Martin pointed out several potential problems with the mutualization of streams between various decoders: the broadcast resolutions and framerates of HbbTV 1.5 might be difficult to support on PCs, while HbbTV 2.0’s 1080p requires AVC Level 4.2, which might be difficult to support on many Android decoders. Seamless switching between resolutions, bitrates and codecs inside a DASH client was also flagged as challenging. Not easy either is the production of subtitles in EBU-TT-D for DVB-DASH (a subset of W3C TTML, segmented and packed into an MP4 container), as shown by IRT with several partners on http://subtitling.irt.de – but those subtitles can also be played by the dash.js and bitmovin desktop players, and do allow dynamic, accessibility-centric changes in size/color/opacity/position… In the Nordics, TV channels like NRK have heavily adopted EBU-TT-D as their standard subtitle format.
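The shared-initialization-segment constraint can be checked mechanically before publishing a manifest. Here is a hypothetical validation helper (the DASH element and attribute names are real, the sample AdaptationSets are invented): an initialization URL that substitutes $RepresentationID$ necessarily differs per representation and would break the HbbTV profiles.

```python
import xml.etree.ElementTree as ET

NS = {"mpd": "urn:mpeg:dash:schema:mpd:2011"}

def has_shared_init_segment(adaptation_set_xml: str) -> bool:
    """True when the AdaptationSet declares a SegmentTemplate whose initialization
    URL does not vary per representation (no $RepresentationID$/$Bandwidth$)."""
    aset = ET.fromstring(adaptation_set_xml)
    template = aset.find("mpd:SegmentTemplate", NS)
    if template is None:
        return False
    init = template.get("initialization", "")
    return "$RepresentationID$" not in init and "$Bandwidth$" not in init

# One init segment shared by every representation: HbbTV-friendly.
SHARED = """<AdaptationSet xmlns="urn:mpeg:dash:schema:mpd:2011" mimeType="video/mp4">
  <SegmentTemplate initialization="video/init.mp4" media="video/$RepresentationID$/$Number$.m4s"/>
  <Representation id="720p" bandwidth="3000000"/>
  <Representation id="1080p" bandwidth="6000000"/>
</AdaptationSet>"""

# Per-representation init segments: fine on desktop profiles, not here.
PER_REP = SHARED.replace('initialization="video/init.mp4"',
                         'initialization="video/$RepresentationID$/init.mp4"')
```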

A bunch of problems were also exposed by Martin regarding IRT’s experiments with DASH on HbbTV: 3 million German devices out of 10 million might expose an “HbbTV 1.2.1” User-Agent signature whereas they actually support HbbTV 1.5, 1 to 2 million devices in Germany could be DASH-disabled, and finally – a problem I also spotted in France back in 2015 – some terminals have serious problems with live manifests. On this point, I can testify that I have seen 2015 HbbTV sets playing live MPDs as if they were on-demand ones, starting at the first second of the DVR window instead of at the live edge. I’ve seen other TVs jumping 56 days in time after receiving a few 404 errors on segment requests, others starting a playback session well and then playing only the audio, or even freezing playback entirely while still loading segments normally. Debugging players by wiresharking their requests was the way to trace what was happening in the background, as the HbbTV player engines are a kind of black box in terms of logging. Not quite a reverse engineering exercise, but still a complex investigation…
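To make the “wrong join point” bug concrete, here is a simplified model of where a live DASH client should start playback. Parameter names mirror the MPD attributes timeShiftBufferDepth and suggestedPresentationDelay, but the model ignores segment alignment and clock drift, so treat it as a sketch:

```python
from datetime import datetime, timedelta, timezone

def join_points(now, time_shift_buffer, suggested_delay):
    """For a live MPD, compute where the DVR window starts and where the live
    edge sits. The faulty TVs described above joined at dvr_window_start
    (the oldest available second) instead of live_edge."""
    dvr_window_start = now - time_shift_buffer
    live_edge = now - suggested_delay
    return dvr_window_start, live_edge

# Example: a 1-hour DVR window with a 10-second presentation delay.
now = datetime(2016, 3, 14, 12, 0, 0, tzinfo=timezone.utc)
start, edge = join_points(now, timedelta(hours=1), timedelta(seconds=10))
```

A set that starts at `start` rather than `edge` is immediately almost an hour behind the broadcast, which is exactly the behavior observed in the field.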

Despite some problematic DASH implementations on platforms like HbbTV, the DASH standard keeps improving and filling in missing features. Thomas Stockhammer from Qualcomm gave us a good overview of what’s happening at DASH-IF to strengthen the standard, starting with a specific focus on live services recommendations (inband events, client-server synchronization, multi-period content, redundancy and failure signaling) aiming at improving end-to-end latency and providing the reliability level that broadcasters expect in order to use DASH as a 24×7 delivery medium. Ad insertion has been the second main focus for DASH-IF since IOP v3.0, especially server-side ad insertion with xlink. A substantial amount of work has also been done on the test vectors database, whose overhauled version went live in September on http://testassets.dashif.org, as well as on the dash.js player, with a major architecture refactoring in v2.0 that was expected for production-grade live services. Since BroadThinking, dash.js’s progress up to v2.4 has brought significant improvements with the introduction of the buffer-monitoring ABR algorithm BOLA (instead of network-monitoring ones) and the fast-switching mechanism that replaces already-buffered forward segments with higher-quality ones as soon as possible, thus shortening the bitrate ramp-up to its minimum duration. On the DASH-IF side, work continued heavily after BroadThinking, with the release of the ATSC DASH Profile (used in both unicast and multicast delivery scenarios), the preparation of the v4.0 IOPs, which introduce HEVC UHD in a baseline profile together with a companion PQ10 HDR profile, and the work on standardized QoE metrics for DASH and the companion reference architecture. Quite a solid 2016 agenda for the growing DASH-IF organization, but also a lot of DASH sub-topics for broadcasters to follow – which can be challenging.
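To give a flavor of what “buffer-monitoring” means in BOLA, here is a toy version of its decision rule: each bitrate gets a logarithmic utility, and the chosen representation maximizes a buffer-dependent score. The constants and units below are purely illustrative; dash.js’s real implementation also handles startup, segment abandonment and buffer targets.

```python
import math

def bola_choice(buffer_level, bitrates, gamma_p=5.0, V=0.93):
    """Toy BOLA decision: pick the bitrate maximizing
    (V * (utility + gamma_p) - buffer_level) / bitrate,
    where utility = ln(bitrate / min_bitrate). An empty buffer drives the
    choice down; a full one drives it up -- no throughput estimate needed."""
    smin = min(bitrates)
    best, best_score = bitrates[0], None
    for b in bitrates:
        utility = math.log(b / smin)
        score = (V * (utility + gamma_p) - buffer_level) / b
        if best_score is None or score > best_score:
            best, best_score = b, score
    return best
```

With an empty buffer the rule selects the lowest rung; with a comfortable buffer it climbs to the top, which is the whole point of trading throughput guesses for buffer observations.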


CMAF convergence

Stockhammer’s talk was also one of the first public presentations of MPEG’s Common Media Application Format (CMAF) effort. This initiative, launched in 2015 by Apple and Microsoft, aims at mutualizing media segments between HLS playlists and DASH manifests, and thus reaching the perfect rationalization of media workflows. It gained visibility by entering the 114th MPEG meeting’s agenda in February, and was the topic of several ad-hoc MPEG meetings throughout the year, with passionate discussions on how to combine both approaches in a single media format. The challenge has been solved, and a Draft International Standard was finalized at MPEG #116 a few days ago. CMAF is expected to reach International Standard status in Q3 2017, but the current DIS specification already allows engineers to implement safely. Apple has already released an early implementation under the feature label “Fragmented MPEG-4” in macOS 10.12, iOS 10 and tvOS 10.

CMAF is very close to DASH as we know it today, with a few more restrictions on the ISO-BMFF container, and leverages existing MPEG technologies such as AVC, HEVC (up to UHD) and AAC – which should provide a good interoperability basis for the years to come. Even the subtitles topic has been addressed, with the adoption of IMSC1 (of which EBU-TT-D is a subset) as the sole technical candidate. CMAF also brings an interesting feature to the mix with Low Latency CMAF Chunks: these chunks are subparts of the media segments, and can be used on the player side to start playback before the whole segment is fully loaded. Combined with very short segment durations, this has the potential to bring the average latency of our future streams down to a few seconds, compared to the 15-30 seconds we usually see today.
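A back-of-the-envelope model shows where the gain comes from: the publishing delay drops from one full segment to one chunk. The figures below are invented, and the model ignores encode, CDN and decode delays, so it is only a sketch of the effect:

```python
def glass_to_glass_latency(segment_seconds, buffered_segments, chunk_seconds=None):
    """Rough live-latency model: a segment (or, with CMAF chunks, just a chunk)
    must be complete before it can be published, and the player then keeps a
    safety buffer of whole segments. All values in seconds."""
    publish_delay = chunk_seconds if chunk_seconds is not None else segment_seconds
    return publish_delay + buffered_segments * segment_seconds

# Classic setup: 6 s segments, 3 segments buffered -> 24 s behind live.
classic = glass_to_glass_latency(6, 3)

# Chunked setup: 2 s segments cut into 0.2 s chunks, 1 segment buffered -> ~2.2 s.
chunked = glass_to_glass_latency(2, 1, chunk_seconds=0.2)
```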

Only the encryption part still prevents CMAF from being fully interoperable: Apple remains focused on the AES-CBC encryption scheme (mapped to the cbcs CMAF Presentation Profile) for FairPlay DRM, while the rest of the industry uses AES-CTR (CMAF’s cenc Presentation Profile) with the likes of Widevine, PlayReady and most other DRMs. So far, only in-the-clear content can benefit from the CMAF segment mutualization promise – which might still be beneficial to broadcasters when applied to the many OTT channels they distribute.
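In practice this means a packager targeting all major DRMs still has to produce two sets of encrypted segments. A hypothetical helper makes the consequence explicit; the DRM-to-scheme mapping below reflects the 2016 situation described above and is not a packager’s actual API:

```python
# Which CMAF encryption profile each DRM expected, as of 2016 (illustrative).
SCHEME_BY_DRM = {
    "FairPlay": "cbcs",    # AES-CBC sample encryption
    "Widevine": "cenc",    # AES-CTR
    "PlayReady": "cenc",   # AES-CTR
}

def required_profiles(drms):
    """Return the distinct encrypted segment sets a packager must produce
    to reach every DRM in the target list."""
    return sorted({SCHEME_BY_DRM[d] for d in drms})
```

Targeting Widevine and PlayReady alone needs a single `cenc` set; adding FairPlay doubles the encrypted storage and packaging work, which is exactly the interoperability gap the article points at.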


SAND channel signaling

Mary-Luc Champel from Technicolor presented on “Enhancing Quality with Server And Network assisted DASH (SAND)”. SAND is an MPEG effort aiming at introducing a standard messaging stack between DASH clients and network nodes, in order to exchange real-time performance information across the chain and ultimately maximize the efficiency of streaming sessions. While the principles and reference architecture of SAND had already been exposed before this BroadThinking edition, I think it was the first time we saw the details of the sand namespace, the alternative HTTP-header transport and the status messages explained. The second part of the presentation, with illustrated use cases for SAND – optimal bandwidth sharing through caching of segments on the home gateway, standard/premium categorization of the delivered content, and metrics reporting to improve DASH client choices – was quite fresh. The main open questions for SAND seem to be the overlap with the QoS metrics already defined in the core DASH spec, and the challenge of implementing SAND messaging everywhere in the chain, while many proprietary implementations of QoS mechanisms already exist on the CDN side, on the player side and across all kinds of intermediary network equipment. As it is not directly related to SDN, SAND will probably have a hard time finding its way into the ecosystem, at least in unmanaged network scenarios. Another topic to follow…
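As an illustration of what such messaging could look like, here is a sketch of a client-to-network status message announcing upcoming segment requests – the kind of hint a home gateway could use for prefetching in the caching use case above. The element names and namespace URN are approximations for illustration only; the authoritative message schema is in ISO/IEC 23009-5.

```python
import xml.etree.ElementTree as ET

def anticipated_requests_message(segment_urls):
    """Build an illustrative SAND-style status message telling a network
    element which segments the client expects to fetch next. Element names
    and the xmlns value approximate the spec and must be checked against
    the real ISO/IEC 23009-5 schema before any implementation."""
    root = ET.Element("SANDMessage", {"xmlns": "urn:mpeg:dash:sand:messages:2016"})
    msg = ET.SubElement(root, "AnticipatedRequests")
    for url in segment_urls:
        ET.SubElement(msg, "Request", {"sourceUrl": url})
    return ET.tostring(root, encoding="unicode")

message = anticipated_requests_message(["video/1080p/seg41.m4s", "video/1080p/seg42.m4s"])
```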

Recommendation & Accessibility

EBU recommendation platform

Out of the remaining presentations of BroadThinking’s first day, I guess one of the most useful was the one from EBU’s Michael Barroso, who pitched two collaborative developments done with many European broadcasters with the aim of providing cross-device recommendations to end-users. The first project is a cross-platform authentication protocol (standardized in April by ETSI as TS 103 407), and the second is a reference recommendation platform architecture, which relies on open-source solutions like Apache Spark & Flume, RabbitMQ or Docker, and a bunch of APIs developed and hosted on the ebu.io cloud and on AWS. The result is a complex platform able to craft several complementary kinds of recommendations – automated, editorial and social – which is basically a service worth hundreds of thousands of bucks if your only option is to buy a ready-to-go solution on the market. As a broadcaster, if you have teams experienced with cluster computing, you can follow this DIY approach and possibly save a lot of budget – with more operational risk indeed, but if recommendation is not an absolutely core service in your offer, then the tradeoff might be acceptable.
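The automated flavor of such recommendations can be prototyped in a few lines before scaling it out on a cluster. Here is a toy “viewers also watched” model based on session co-occurrence counting – invented data and logic for illustration, not the EBU platform’s actual pipeline:

```python
from collections import Counter
from itertools import combinations

def cooccurrence_recommender(sessions):
    """Count how often two programmes appear in the same viewing session and
    recommend the most frequent co-occurrences. A production platform runs
    the same idea at scale (e.g. on Spark); this is a single-machine sketch."""
    counts = {}
    for session in sessions:
        for a, b in combinations(set(session), 2):
            counts.setdefault(a, Counter())[b] += 1
            counts.setdefault(b, Counter())[a] += 1

    def recommend(item, n=3):
        return [other for other, _ in counts.get(item, Counter()).most_common(n)]
    return recommend

# Hypothetical viewing sessions.
recommend = cooccurrence_recommender([
    ["news", "sports"],
    ["news", "sports"],
    ["news", "documentary"],
])
```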

EBU-TT-D subtitling for legacy HbbTV

In the accessibility session, my favorite presentation was the one by Francesc Mas Peinado from CCMA – a public audiovisual organization of Catalunya involved in the HBB4all European project, working on live & VOD multi-audio DASH streaming and EBU-TT-D subtitles for VOD content. Among the various challenges solved during the second project, they came up with a nifty approach to make HbbTV 2.0’s EBU-TT-D subtitles backward compatible with “legacy” HbbTV 1.0 & 1.5 devices: they developed a JavaScript plugin that parses the EBU-TT-D file and transforms it into HTML & CSS content that is easy for older TV sets to render. However, they did hit some hardware limitations: some devices use imprecise clocks which make subtitle rendering times a bit random, and on others the CPU power is not sufficient to cope with the additional JavaScript load. For those devices where rendering is not accurate or even possible, the EBU-TT-D subtitles are simply not available – representing 15% of the devices that CCMA tested. On the 85% of compatible TVs, end-users can benefit from the font resizing, positioning and background color features brought by EBU-TT-D, which is a significant improvement in the subtitling world. The project has just been open-sourced and is now available to all broadcasters for further implementations, which is great. But it also flags a recurring problem with HbbTV: the need to build workarounds for older TV generations whenever a new technology is integrated into the latest spec version, as the TVs are not upgraded by the manufacturers (see the explanation here) and broadcasters face a fragmented device landscape in many countries.
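The core of CCMA’s trick – parse the subtitle document, then re-render it with the HTML/CSS primitives every HbbTV 1.x browser already has – can be sketched as follows. Their plugin is JavaScript running on the TV; this Python sketch only illustrates the transformation step, with timing logic, regions and styling references omitted, and the sample document invented:

```python
import xml.etree.ElementTree as ET

TT = "{http://www.w3.org/ns/ttml}"

def ttml_to_html(ttml_doc):
    """Extract the <p> cues of a TTML document (EBU-TT-D is a constrained
    TTML profile) and emit HTML spans that a legacy browser can show/hide
    based on the data-begin/data-end attributes."""
    root = ET.fromstring(ttml_doc)
    cues = []
    for p in root.iter(TT + "p"):
        text = "".join(p.itertext()).strip()
        cues.append('<span class="subtitle" data-begin="{}" data-end="{}">{}</span>'
                    .format(p.get("begin"), p.get("end"), text))
    return cues

SAMPLE = """<tt xmlns="http://www.w3.org/ns/ttml"><body><div>
  <p begin="00:00:01.000" end="00:00:03.000">Hola</p>
</div></body></tt>"""
```

The device-side clock then only has to toggle span visibility, which is also why imprecise clocks on some sets translate directly into sloppy cue timing.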

Stay tuned for the second part of this debrief, with more BroadThinking 2016 DAY 2 insights on HbbTV evolution (augmented with the main facts & figures from the HbbTV Symposium), VR and OTT player development!

About The Author: Nicolas Weil

Digital Media Solutions Architect, France, World. Hungry for: OTT architecture challenges, MPEG-DASH experiments, hybrid video services, scalable production/distribution platforms, video-centric innovations & Junglist vibes. Proud member and co-founder of OVFSquad, now transformed into Paris Video Tech! I'm working at AWS Elemental, but this blog reflects strictly personal views and isn't endorsed in any way by AWS Elemental.


