Future of Broadcast and Multimedia Analyzers
Ian Valentine, Video Products and Strategy Director at Tektronix Video, which is now part of Telestream.
Before considering broadcast and multimedia analyzers specifically, we need to take a step back and consider what is happening in the content creation and content delivery arenas as providers strive to deliver a great experience to consumers, on the consumers' terms, on a variety of end devices. Greater choice, an increasingly difficult battle to win and retain viewers, and an accelerating rate of deployment of new technologies are creating new challenges for broadcast and multimedia providers. As a consequence, test and measurement solutions need to evolve to address these new problems, so that a great, consistent experience can be created and delivered to customers as efficiently as possible.
On the content creation end of the process there is a continuous drive to originate content (live or episodic) at the highest resolution possible. This translates to 4K (or UHD) resolutions, the use of wide color gamut (WCG), and high dynamic range (HDR) technology. These technologies combine to create increasingly realistic images on the latest displays being used by consumers. However, for content creators they have brought a series of new problems. For example, how do you ensure that the content being created will look good on both HDR and SDR displays, that skin tones and colors are acceptable regardless of the display used, and that HDR highlights do not exceed the capabilities of the screen, or become so bright and cover such large areas of it that viewing becomes uncomfortable?
These challenges are driving changes in the production workflow and the need for a new range of tools to help creators get this right. Traditionally the waveform monitor has been the instrument of choice for this task, but waveform monitors themselves need to evolve and provide new tools and analysis capabilities to address these challenges. The standard tools used on these instruments in production include a waveform display to ensure that luminance levels are within specification, a vector display to check chroma levels and skin tones, and a picture display to provide a representation of the final product. However, the use of WCG and HDR makes these tools difficult to use in production and post production: traditional waveform displays will crop HDR highlights and crush black levels, vector displays make measuring a WCG space such as BT.2020 difficult because the trace shrinks to accommodate the wider color space, and the traditional picture display will appear washed out when fed HDR content. This means that new displays supporting a new workflow are required: waveform displays graded in light levels (nits) rather than mV or percentages, so that there is a direct correlation to the HDR image and camera aperture settings; a CIE chart to display and measure colors in multiple color spaces; and a false-color picture that shows which areas of the image are at HDR levels and where those highlights sit. The problem is exacerbated by the use of multiple HDR standards (e.g. HDR10, Dolby Vision, Hybrid Log-Gamma) and camera gamma curves (e.g. S-Log3, Log C), so the modern waveform monitor needs to be able to handle all of these quickly and efficiently.
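To make the idea of a nits-graded waveform concrete, the sketch below shows one way to map PQ (SMPTE ST 2084) code values to light levels and to classify pixels for a false-color style display. The PQ constants come from the standard; the zone thresholds are illustrative assumptions, not taken from any standard or any specific product.

```python
# Sketch: converting PQ (SMPTE ST 2084) code values to light levels in nits,
# as a waveform display graded in nits would do. Zone thresholds below are
# illustrative only.

def pq_to_nits(e: float) -> float:
    """Map a normalized PQ signal value (0.0-1.0) to luminance in cd/m^2 (nits)."""
    m1 = 2610 / 16384          # PQ EOTF constants from SMPTE ST 2084
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    p = e ** (1 / m2)
    y = (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)
    return 10000.0 * y         # PQ is defined up to 10,000 nits

def false_color_zone(nits: float) -> str:
    """Classify a pixel for a false-color style display (illustrative thresholds)."""
    if nits <= 100:
        return "SDR range"      # roughly the traditional 100-nit SDR ceiling
    if nits <= 1000:
        return "HDR highlight"
    return "extreme highlight"  # may exceed many consumer displays

if __name__ == "__main__":
    for code in (0.1, 0.5, 0.75, 0.9):
        nits = pq_to_nits(code)
        print(f"PQ {code:4.2f} -> {nits:8.1f} nits ({false_color_zone(nits)})")
```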
Having created the content, it needs to be delivered to the consumer. The biggest and most disruptive trend here is the use of adaptive bit rate (ABR) streaming technologies to deliver content to a wide variety of display types (phones, tablets, high-resolution televisions) with different capabilities. Unlike traditional broadcast or linear cable systems, where content is pushed from the broadcaster, streaming moves to a pull model in which content is pulled by individual devices, so the ability to scale like an IT environment is essential. Any monitor or analyzer in this environment needs to be able to scale in a similar way. It is also worth mentioning that these solutions are increasingly deployed as virtual or cloud solutions (both on premise and public cloud), so efficient mechanisms for deploying monitoring in this environment are essential, and they need to be easy to set up, configure and control.
ABR is used to deal with variations in the bandwidth available to the devices these displays are connected to. ABR relies on the content being chunked into (usually) two-second segments, encoded in multiple profiles at different bitrates, so that a continuous, acceptable viewing experience can be achieved without the end device stuttering or freezing as the available bandwidth varies. Where all of these segments and profiles are stored on the network is defined in a manifest file that is used to select and send the correct profile to the end device. This means that ABR monitors or analyzers need to be able to check the manifest file structure and verify that the specified content is actually where it is declared to be. If the content is there, the analyzer needs to ensure that the bit rate is correct for that profile and should also check that the actual quality of the content is acceptable. This can be difficult in most streaming systems because the packager used to chunk the content will also encrypt it for protection purposes, so modern ABR analyzers need to handle a variety of encryption and DRM systems to be able to provide quality of experience measurements. To avoid stuttering and jumps it is essential that all of the chunks in the various profiles are accurately aligned, so that there is a smooth transition as the display device moves between profiles. Today's analyzer not only needs to check that the various profiles are available, but also needs to ensure that all of the chunks are accurately aligned.
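As a rough illustration of the manifest-level checks described above, the following sketch walks an HLS master playlist, reads the declared bandwidth of each profile and confirms that each media playlist and its segments are actually reachable. The URL is hypothetical, and a real ABR analyzer would add decryption, segment-alignment and picture-quality measurements on top of this.

```python
# Sketch: manifest-level checks an ABR analyzer performs on an HLS master
# playlist -- list the declared profiles, then confirm each media playlist
# and its segments are reachable. MASTER_URL is hypothetical.
import re
import urllib.request
from urllib.parse import urljoin

MASTER_URL = "https://example.com/channel1/master.m3u8"  # hypothetical URL

def fetch_lines(url):
    """Download a playlist and return its lines."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read().decode("utf-8").splitlines()

def check_master(master_url):
    """Walk each ABR profile declared in the master playlist."""
    lines = fetch_lines(master_url)
    for i, line in enumerate(lines):
        if line.startswith("#EXT-X-STREAM-INF"):          # one tag per profile
            m = re.search(r"BANDWIDTH=(\d+)", line)
            declared_bps = int(m.group(1)) if m else 0
            playlist_url = urljoin(master_url, lines[i + 1].strip())
            check_profile(playlist_url, declared_bps)

def check_profile(playlist_url, declared_bps):
    """Confirm every segment listed in a media playlist is reachable."""
    segments = [l for l in fetch_lines(playlist_url) if l and not l.startswith("#")]
    missing = 0
    for seg in segments:
        req = urllib.request.Request(urljoin(playlist_url, seg), method="HEAD")
        try:
            urllib.request.urlopen(req, timeout=10)
        except Exception:
            missing += 1
    print(f"{playlist_url}: declared {declared_bps} bps, "
          f"{len(segments)} segments, {missing} missing")

if __name__ == "__main__":
    check_master(MASTER_URL)
```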
Advertising remains the major method for monetizing content, and today's technologies allow advertisers to provide targeted adverts that are more relevant to individual viewers. These adverts are dynamically inserted from a number of sources on the network, so it is vital that they are sourced from the right place and inserted into the content at the right time and place. This means that ABR analyzers need to ensure that markers such as SCTE-35 cues are inserted at the right points in the content (to mark insertion points for adverts) and that those adverts actually get inserted. Beyond this, these monitors are also required to ensure that delivered content meets regulatory requirements (loudness measurements and closed captioning) and to provide a whole range of quality measurements (MOS scores, frozen frames, audio mutes, etc.) for the delivered content. For most operators this needs to be done as a background task so that they are only informed when there are issues, i.e. monitoring by exception.
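A minimal sketch of the marker check, assuming HLS delivery: different packagers signal SCTE-35 in different ways (#EXT-X-CUE-OUT/#EXT-X-CUE-IN tags, or #EXT-X-DATERANGE tags carrying SCTE35-OUT/SCTE35-IN attributes), so this only scans for the common conventions and flags missing or unbalanced ad-break markers; it does not decode the binary SCTE-35 payload itself.

```python
# Sketch: checking that ad-break markers survive into a delivered HLS media
# playlist. Only the common tag conventions are checked; the binary SCTE-35
# payload is not decoded here.

def check_ad_markers(playlist_text: str) -> list:
    """Report missing or unbalanced ad-break markers in an HLS media playlist."""
    issues = []
    cue_outs = cue_ins = 0
    for line in playlist_text.splitlines():
        # Skip continuation tags so an in-progress break is not double counted.
        if line.startswith("#EXT-X-CUE-OUT-CONT"):
            continue
        if line.startswith("#EXT-X-CUE-OUT") or "SCTE35-OUT" in line:
            cue_outs += 1
        elif line.startswith("#EXT-X-CUE-IN") or "SCTE35-IN" in line:
            cue_ins += 1
    if cue_outs == 0:
        issues.append("no ad-break (cue out) markers found")
    if cue_outs != cue_ins:
        issues.append(f"unbalanced markers: {cue_outs} cue-out vs {cue_ins} cue-in")
    return issues

if __name__ == "__main__":
    with open("media_playlist.m3u8") as f:     # hypothetical local copy
        for msg in check_ad_markers(f.read()) or ["ad markers look consistent"]:
            print(msg)
```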
It is worth mentioning that the link between content creation and content delivery is often a file-based environment where content is stored until required for playout. Here, file-based analyzers can perform in-depth analysis of the stored content to ensure that any transcoding or audio/video processing has not negatively impacted it, that it will play correctly when delivered, and that it meets the required delivery specifications, for example the Netflix and DPP delivery specifications. File-based solutions can also provide vital MOS measurements and check alignment, HDR levels and metadata, along with a wide range of syntactic and semantic tests of the content.
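The sketch below gives a flavor of this kind of automated file-based check, using ffprobe (part of FFmpeg) to read stream properties and comparing them against a simplified, hypothetical delivery spec. Real specifications such as the Netflix and DPP requirements cover far more, including loudness, captions, HDR metadata and deeper syntactic and semantic checks.

```python
# Sketch: a simplified file-based QC pass using ffprobe. The spec values and
# the file name are hypothetical, illustrative stand-ins only.
import json
import subprocess

DELIVERY_SPEC = {               # illustrative values, not a real delivery spec
    "video_codec": "prores",
    "min_width": 3840,
    "min_height": 2160,
    "audio_codec": "pcm_s24le",
}

def probe(path: str) -> dict:
    """Run ffprobe and return its stream information as a dictionary."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-show_streams", "-of", "json", path],
        capture_output=True, check=True, text=True)
    return json.loads(out.stdout)

def verify(path: str) -> list:
    """Compare probed stream properties against the simplified spec."""
    issues = []
    for s in probe(path).get("streams", []):
        if s.get("codec_type") == "video":
            if s.get("codec_name") != DELIVERY_SPEC["video_codec"]:
                issues.append(f"video codec {s.get('codec_name')} not allowed")
            if (s.get("width", 0) < DELIVERY_SPEC["min_width"]
                    or s.get("height", 0) < DELIVERY_SPEC["min_height"]):
                issues.append(f"resolution {s.get('width')}x{s.get('height')} below spec")
        elif s.get("codec_type") == "audio":
            if s.get("codec_name") != DELIVERY_SPEC["audio_codec"]:
                issues.append(f"audio codec {s.get('codec_name')} not allowed")
    return issues

if __name__ == "__main__":
    print(verify("master_deliverable.mov") or "all checks passed")  # hypothetical file
```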
Telestream Expertise!
Telestream has a wide range of products to address measurement and analysis for content creation, content delivery and file based content. Key products are:
PRISM – Media monitoring and analysis
Aurora – File Based Analysis
Vidchecker – File Based Analysis
Surveyor ABR – ABR Monitor
OptiQ Monitor – Channel and Monitoring cloud framework.