Right now there’s a bit of a format war going on, one reminiscent of the days when Sony’s Blu-ray went head-to-head with Toshiba’s HD DVD to become the standard for HD discs. The conflict this time is between two video specifications that provide High Dynamic Range (HDR) imagery, which some argue is an even more influential improvement to video than 4K resolution.
HDR enhances video in a few ways, using metadata carried within a sub-stream. Most importantly, the specifications deepen color depth from 8 bits per channel to 10 bits (HDR10) or 12 bits (Dolby Vision). That greater bit depth, combined with a wider color gamut, lets video show more detail in bright and dark areas and produces an overall more realistic, vibrant image (although depending on a TV’s settings, HDR can actually look hyperrealistic instead).
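To put those bit depths in perspective, here’s a quick back-of-the-envelope sketch (my own illustration, not something taken from either spec) of how many gradations each depth allows per color channel, and how many total colors that works out to across red, green, and blue:

```python
# Compare per-channel shades and total RGB colors at each bit depth.
for name, bits in [("SDR (8-bit)", 8), ("HDR10 (10-bit)", 10), ("Dolby Vision (12-bit)", 12)]:
    shades = 2 ** bits        # gradations per color channel
    colors = shades ** 3      # combinations across R, G, and B channels
    print(f"{name}: {shades:,} shades per channel, {colors:,} possible colors")
```

The jump from 8 to 10 bits takes you from about 16.7 million colors to over a billion, which is why 10-bit HDR video shows far less visible banding in smooth gradients such as skies.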
The question is whether one format will prevail over the other or the two will continue to coexist. A recent study from ABI Research suggests that Dolby Vision may become the standard for digital and on-demand content, while HDR10 may be used more for live events and broadcasts. The reason, ABI Research analyst Khin Sandar Win suggests, is that “Dolby Vision currently supports higher light output levels than HDR10 and is better suited to adjust to different manufacturers’ displays.” Content producers such as HBO, Paramount, Sony Pictures, and Universal are also pushing Dolby Vision.
However, Dolby Vision is a proprietary technology that requires licensing Dolby’s IP and passing a certification process. Not all manufacturers want to be tied down this way. On the flip side, HDR10 is an open standard supported by many TV makers, including Samsung and LG (although LG sells sets that support both formats). HDR10 is also mandatory in the Ultra HD Blu-ray specification, while Dolby Vision is optional.
Which format, if any, will dominate HDR TVs a year from now remains to be seen. But it’s hard to imagine either HDR10 or Dolby Vision simply disappearing while this new video standard is just taking off.