Geneva in late March doesn’t feel as wild as a Las Vegas boulevard, but there was still a fair amount of gaming excitement at the EBU headquarters, where several events followed one another: a DASH Interoperability Forum meeting, closely followed by the BroadThinking event, where industry actors and broadcasters come to show off their latest implementation advances and/or share the results of their real-life deployments and research studies. A good way of managing the transition between TV Connect and NAB…
Where Broadcast meets Broadband: that’s the promise of the EBU’s hybrid event, which navigates between industry competition, standardization efforts and broadcasters’ realpolitik – all wrapped in a warm and funny ambiance provided by the various speakers and the EBU team, gently led by Bram Tullemans, whom I’d like to thank personally here for the invitation to present (kudos also to Filka, Peter and Eoghan for the organization!). The 2013 edition was a major success because it allowed participants to get a rather good idea of the general trends of the industry, to go deep into technology when needed, and to discover cutting-edge tech demos in the lobby adjoining the conference room.
It’s virtually impossible to give a complete report of everything that was said or shown over two days by so many quality speakers (including distinguished OnlineVideoFrenchSquad members Lionel Bringuier [Elemental Technologies], Martin Boronski [M6 Web], Thierry Fautier [Harmonic] and Nicolas Weil [Challenge2Media] – Les 4 Mousquetaires /poke @sfaure), but I’ll nevertheless try to provide a selection of relevant information that will help you grasp the trends and prepare efficiently for the upcoming tradeshows…
So let’s start with a recap of the most interesting DAY 1 presentations!
Session “BROADBAND DELIVERY – THE BIG PICTURE”
The first half-day concentrated on the general aspects of the broadcast markets in Europe and the impact of evolving audience behaviors. The first key issue was isolated by Guy Bisson [IHS] in his “Broadcast vs. online delivery – How do the costs stack-up?” presentation: the OTT audience explosion is answered by unicast distribution, whose cost stops being competitive with broadcast as soon as simultaneous users exceed 8,000 in SD or 4,500 in HD. In theory, then, full OTT distribution might be the best option for distributing niche content – an assumption tempered by the fact that OTT costs always come on top of traditional broadcast costs, as an extra expense and an extra bundle of technical issues. But OTT distribution is no longer optional, considering viewers’ appetite for consumption on tablets and various OTT-only devices, so the challenge remains to cope with the generated burden while waiting for more optimized IP multicast distribution strategies.
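The economics behind that crossover are easy to sketch: broadcast costs are flat regardless of audience size, while unicast costs grow linearly with concurrent viewers. A minimal model, with purely illustrative (hypothetical) cost figures chosen so the break-even lands at the 8,000-viewer SD threshold quoted in the talk:

```python
def delivery_cost(viewers, broadcast_cost_per_hour, unicast_cost_per_viewer_hour):
    """Hourly delivery cost of each mode for a given concurrent audience.
    Broadcast is flat; unicast grows linearly with viewers."""
    return {
        "broadcast": broadcast_cost_per_hour,
        "unicast": viewers * unicast_cost_per_viewer_hour,
    }

def breakeven_viewers(broadcast_cost_per_hour, unicast_cost_per_viewer_hour):
    """Audience size above which unicast becomes the costlier option."""
    return broadcast_cost_per_hour / unicast_cost_per_viewer_hour

# Hypothetical figures, not from the presentation: 80 EUR per broadcast
# hour vs 0.01 EUR per unicast viewer-hour -> break-even at 8,000 viewers.
costs = delivery_cost(4000, broadcast_cost_per_hour=80.0,
                      unicast_cost_per_viewer_hour=0.01)
```

Below the break-even point unicast is the cheaper option, which is exactly why full OTT can make sense for niche content.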
During her presentation “The challenge of increasing online services while attempting to reduce the UK’s energy and traffic footprints“, Janet West [BBC] outlined the contradiction between increasing the reach of the UK’s online services and reducing its energy and traffic footprints. With electricity consumption exploding in direct proportion to consumer electronics usage, carbon emissions need to be restricted by optimizing datacenter power policies, because online distribution by nature uses far more energy than traditional DTT distribution. Janet presented a number of additional measures that can be taken beyond datacenter electricity consumption, such as “change from always on to always available”, “better manage peaks of usage and anticipate off-peak downloads” or “avoid subscriptions where the next program can only be downloaded after the first program has been watched”. Interestingly, this green concern remained a strong thread after her presentation, through many examples of possible optimizations in both OTT production and distribution workflows.
Session “ESSENTIAL TRENDS IN ONLINE DISTRIBUTION”
Right afterwards, I was the first speaker of the afternoon session – representing Challenge2Media, the French consulting company I was working for at that time – with a study about multiscreen services architecture intended to outline the major challenges met while building such services, show some industry trends that aim to solve them, and illustrate with some state-of-the-art examples. Starting from a general presentation of multiscreen platforms based on the fully-fledged Tvinci platform architecture, I divided the scope into five challenging zones (content preparation, DRM workflow, video CDN, frontend development and service platform) and ran through the challenge/solution tuples across this agenda in a multi-vendor approach. That’s what I describe below in shamelessly more detail than the other presentations, as I know this is a good spot for it…
As regards content preparation, the first focus of my presentation was on hybrid workflows combining ground and cloud processing in order to absorb production peaks, illustrated by Elemental’s latest advances in this unified processing model (which Lionel Bringuier detailed accurately right after my presentation – great NAB demos and product news in perspective…) – a major breakthrough for the many production chains that are totally rigid and non-scalable today, despite a strong need to produce ever more content in ever more heterogeneous device formats. As another option to handle flexibility needs, I presented the latest advances of the Windows Azure Media Services platform, which now allows building fully cloud-based workflows for both live and VOD production use cases, while guaranteeing very high security standards with a well-defined API-based pre-encryption procedure applied to source content prior to ingest on the platform (some hours later, Amazon released a Hardware Security Module service that could also dramatically change the security level available to cloud-based video processing). It was interesting to note that Microsoft’s ecosystem approach has started to provide efficient options for multi-DRM policies over various ABR packaging topologies and Common Encryption, and that obtaining MPAA certification for the Azure Media Services platform was a major milestone.
Still on content preparation, I presented Xstream’s MediaMaker Ingest module, which allows quickly building very versatile custom workflows – available this summer in both Amazon EC2 and on-premises deployment options. Finally, the last focus of this section went to Harmonic’s excellent techniques for virtual asset creation from a live ABR sliding recording window, used to create VOD programs with their ProMedia Origin Server.
On the DRM workflow chapter, I presented the DRM swapping techniques available on Azure Media Services to efficiently reuse existing PlayReady-encrypted fragments from Smooth Streaming to HLS, the latest matrix of products/services supporting (or soon supporting) multi-DRM packaging of Common Encryption files in an UltraViolet/interoperability production perspective, as well as the per-session encryption techniques supported by Seawell Networks’ Spectrum solution and the latest evolutions of Verimatrix’s DRM umbrella approach (ongoing Adobe Access and OMA DRM integrations).
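The key property that makes this kind of DRM swapping possible is that with Common Encryption the media samples stay encrypted with the same content key; only the container and the DRM signaling around them change. A hypothetical sketch of that idea (my own illustration, not any vendor’s actual API or file format):

```python
from dataclasses import dataclass

@dataclass
class EncryptedSample:
    payload: bytes  # AES-encrypted media data -- never decrypted here
    iv: bytes       # per-sample initialization vector

def repackage(samples, drm_signaling: bytes) -> bytes:
    """Wrap the same encrypted samples in a new (toy) container preceded
    by new DRM signaling. Payloads pass through untouched, so the
    repackager never needs the content key."""
    out = bytearray(drm_signaling)
    for s in samples:
        out += len(s.payload).to_bytes(4, "big") + s.iv + s.payload
    return bytes(out)

# The same encrypted samples, wrapped for two different DRM targets:
samples = [EncryptedSample(b"\x10\x22\x33", b"\x00" * 8)]
smooth = repackage(samples, b"PLAYREADY-HEADER")  # hypothetical signaling
hls    = repackage(samples, b"HLS-KEY-TAG")       # hypothetical signaling
```

Only the license delivery differs per target DRM; the encrypted bytes on disk can be shared, which is what makes Smooth-to-HLS reuse of PlayReady-encrypted fragments cheap.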
In a preemptive echo of day 2’s focus on CDNs, I then spent some time on multi-CDN approaches, with examples from the latest BroadPeak umbrellaCDN solution (currently in integration with the SmartJog, SFR and Limelight CDNs) and the very original approach of DENIVIP Media’s video load-balancer, which knows which video fragment is available on which edge cache, thus providing an exceptionally low response time for most end-users’ requests and an interesting way of optimizing CDN efficiency.
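The principle behind such a fragment-aware balancer can be sketched in a few lines: keep a map of which edge already holds a given fragment and prefer it, falling back to round-robin (and recording the placement) on a miss. This is my own hypothetical reconstruction of the idea, not DENIVIP’s implementation:

```python
from itertools import cycle

class FragmentAwareBalancer:
    """Route each fragment request to an edge cache that already holds it,
    falling back to round-robin dispatch (and recording the placement)
    when the fragment is not cached anywhere yet."""

    def __init__(self, edges):
        self.fallback = cycle(edges)  # round-robin iterator for cache misses
        self.placement = {}           # fragment URL -> edge known to cache it

    def route(self, fragment_url):
        edge = self.placement.get(fragment_url)
        if edge is None:                       # miss: pick next edge, remember it
            edge = next(self.fallback)
            self.placement[fragment_url] = edge
        return edge                            # hit: serve from the warm cache
```

Because repeat requests for a hot fragment always land on an edge with a warm cache, origin fetches and response times both drop sharply.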
My next focus was on Seawell Networks’ Spectrum solution for a wide range of advanced delivery use cases like device-targeted blackout management, user-targeted advertising, device-differentiated fragmentation modes, buffering reduction through shorter fragment lengths at program start-over, dynamic DASH ad insertion and, overall, session-based manifest manipulation. With individualized encryption/DRM application at the edges and great ABR repackaging options, this solution based on a Session Delivery Controller is really impressive in its flexibility to adapt to new scenarios where more intelligence is required for OTT delivery.
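Session-based manifest manipulation boils down to serving each client a personalized playlist: the segments on the CDN stay shared, but the manifest text is rewritten per session (token-stamped URLs, inserted ad segments, trimmed start, and so on). A toy HLS example with hypothetical names, assuming a simple query-parameter session token:

```python
def personalize_manifest(manifest: str, session_token: str) -> str:
    """Rewrite an HLS media playlist per session: segment URIs get a
    session query parameter while tag lines (#-prefixed) stay untouched."""
    out = []
    for line in manifest.splitlines():
        if line and not line.startswith("#"):
            line += f"?session={session_token}"
        out.append(line)
    return "\n".join(out)

# Minimal example playlist (illustrative only)
PLAYLIST = """#EXTM3U
#EXT-X-TARGETDURATION:4
#EXTINF:4.0,
seg1.ts
#EXTINF:4.0,
seg2.ts"""
```

Real Session Delivery Controllers go much further (ad splicing, blackouts, per-session DRM), but the mechanism is the same: one cached set of segments, one unique manifest per viewer.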
This trend of offloading more and more processing activities towards the edges of the network was further confirmed in my last section about the various repackaging options (post-transcode, on the origin server, or at the edge) by discussions with industry vendors agreeing on the limitations of centralized origin-server repackaging when content volumes are huge. I presented extensively Toshiba’s ExaEdge solution, which I had spotted during NAB2012 – the NPEngine behind it being a direct SSD-to-IP embedded hardware technology that offloads all network-related tasks from the CPUs, dedicating CPU power to value-add features like ABR repackaging, on-the-fly DRM application or advanced logging. With an evenly distributed bandwidth resource management queue, this type of equipment, when fully loaded with 4x10Gbps NICs, can sustain up to 64,000 concurrent sessions – a huge improvement in server efficiency and green processing.
A recent initiative from Toshiba consisted in prototyping an architecture integrating a reverse HTTP proxy with an intelligent HTTP streaming solution provided by Unified Streaming Platform, with extensive dynamic content repackaging and DRM capabilities. As with Seawell’s Spectrum, the solution can work from encrypted source files, which is a mandatory feature considering studios’ paranoia on this topic. The results of Toshiba’s tests are showcased in a whitepaper that I recommend downloading and reading, because this is a real breakthrough in OTT delivery optimization – even more so if you combine it with DENIVIP Media’s video load-balancer, which lets you reduce the necessary repackaging actions to the absolute minimum by fully exploiting the cached, already-processed fragments.
On the frontend development challenging zone, I approached the connected-TV fragmentation which is addressed by the Smart TV Alliance SDK around HTML5; some hours after my presentation, the BBC released its TV Application Layer as open source, which gracefully answers this need for shared development environments covering as many devices as possible. On the rising second-screen topic, I talked about Microsoft’s Xbox SmartGlass technology, which now appears as the most rational and solid approach to synchronizing content with video playback (done on Xbox only), although it’s still an emerging practice due to the cost of content production and the required user attention, which is not very compatible with premium content. Still, this technology is now deployed on more than 35 movies and 13 apps (HBO GO/ESPN/NBC News/NBA Game Time…), and popular shows like Game of Thrones season 3 will make heavy use of it to target Android and iOS devices.
In the fifth challenging zone, the service platform itself, I mainly outlined, regarding scalability issues, the trend of service providers like SyncTV or Xstream making extensive use of public cloud infrastructures like Amazon EC2 with appropriate auto-scaling mechanisms, or of platforms like Heroku. Another portion focused on the Tvinci platform’s special capabilities around multi-tenant distribution, multi-user subscriber profiles, flexible combination of business rules and A/B testing – all features which are crucial for new-generation premium OTT services. The last sub-topic envisioned UltraViolet locker integration inside the service platform, with emerging offers from castLabs, GSGI, Akamai and Solekai Systems. The market demand for such services is still limited outside the US, but it’s worth noting that the requirement is growing on the content distributors’ side and that all OTT platforms will have to be compliant sooner or later.
Finally, I closed my speech with architecture agility considerations – going through extensibility with specialized services like BeBanjo Movida and MediaMorph, facilitated integration through standard connectors for common services like payment gateways, the growing trend of service providers like Tvinci and Xstream offering SCRUM-based development teams and procedures to accelerate off-roadmap service customization, FIMS-based workflows for injecting long-term architecture standardization advantages, and finally the impressive service orchestration (BPM engines, services registry, SOA frameworks) and scalability/cloud governance toolset that Netflix keeps releasing as open source – a must-have set of solutions for ensuring your platform’s agility in the long run. My topic was quite a wide and dense one, but I think it was also a useful introduction for the upcoming speakers from Elemental Technologies and BroadPeak, who later presented their latest developments and vision on their respective markets.
In his “New screens, new delivery networks, new ecosystems…” presentation, Lionel Bringuier explained Elemental’s approach to managing unpredictable demand through the dynamic addition of CPU/GPU cloud transcoding capacity to an existing on-premises production infrastructure, in a unified, transparent and dynamically scaling hybrid workflow. 2013 should see all Elemental products become available on the Amazon EC2 infrastructure just as they are on the ground today, starting with the Elemental Live Cloud Edition getting out of beta during NAB. Their next major release, 2.1, has already been announced for May and will bring the much-awaited DASH ABR packaging and HEVC encoding capabilities.
David Price from Ericsson gave an interesting presentation on “HEVC and MPEG-DASH: Hybrid Solutions for Delivering Personalized Multi-Screen Services“, with a deployment forecast predicting mainstream status for LTE smartphones and Smart TVs in 2015, then tablet/HD reach extension, 4K HEVC contribution and 4K HEVC TV in 2016 – while 8K HEVC TV is not seen as realistic before 2020. David also recapped the current adoption status (Broadcom BCM 7445 chipset coming in 2014, Qualcomm Snapdragon and Samsung F8500 plasmas) while recalling the severe limitations still hindering widespread adoption of the technology, like the 10x computational complexity requirements on the encoding side. On the DASH topic, it was interesting to hear about YouTube experimenting with DASH MPDs for both AVC and WebM, and the potential drop of Flash as the reference streaming technology for this service. Ericsson is investing as a priority in developments combining HEVC with the eMBMS (evolved Multimedia Broadcast Multicast Service) mobile transport technology – to be demoed at NAB on the SVP 5500 video processor. Notably, a significant part of the conference audience raised questions about how well eMBMS matches 24/7 diffusion use cases and cellular backhaul network architecture and dimensioning as they exist today. Yet another subject for additional research work, it seems…
Thomas Stockhammer from Qualcomm then provided a very complete “DASH Developments: the essential update” presentation on the ISO status of MPEG-DASH, especially on the topics which DASH does not specify (like codecs, DRM systems, transport protocol details or comprehensive auxiliary metadata) – as well as on the interoperability steps taken by the DASH Industry Forum around the DASH264 specification: HD and HEVC video test cases are expected to be completed by the end of September this year. Among advanced topics of high interest, Thomas went into some ongoing core experiments (like DASH push events or low-latency live streaming) and a substantial section on the latest 3GPP efforts to bridge DASH with MBMS, eventually using hybrid modes with unicast – a definitely expert topic.
On the “Metadata – the missing link” topic, Alexander Adolf [Condition-ALPHA] mainly developed HTML5 microdata, data models and classification schemes, while Jean-Pierre Evain [EBU], in his “Metadata models for online distribution” presentation, exposed the BBC’s and EBU’s contributions to Schema.org and urged broadcasters to use their imagination to reduce their dependency on EPG providers, through experiments borrowing ideas from search engines and social media data architectures. Quite a proactive exhortation, but the situation seems rather locked on the metadata front for the moment… Finally, Jose Velazquez from Kaltura enlightened us with his “HTML5 – Unifying Video Delivery” presentation, synthesizing why HTML5 can be a good unifying video delivery medium – but also pointing out its limits (adaptive streaming, fullscreen, DRM, codec fragmentation…) and the fact that many things still have to be built directly by developers for the moment, outlining a not-so-mature status for HTML5 as a technical foundation for all use cases.
That was a really fascinating day, with all the latest updates on hot technology advances (DASH/HEVC/eMBMS/dynamic repackaging/hybrid workflows…) and standardization efforts, showing that even if industry actors are still trying to preserve their competitive advantage, they are clearly more open than ten years ago to collaborating and defining common norms with their industry colleagues. Nothing negative here for broadcasters, except that things take more time as they are discussed more widely than before, and that the technical result sometimes gets leveled down compared to their wildest expectations…
If you are still hungry for more CDN, Hybrid TV and DEMOS stories,
then you must check out part 2 of this report!