This illustration attempts to represent the difference between a 4K standard dynamic range (SDR) image and an HDR image. Image: Samsung

In 1934 Marconi-EMI demonstrated the monochrome 405-line electronic television system that was eventually adopted by the BBC and became the basis of public television in the UK.

Since then, television has developed at an ever-accelerating pace and has been the main driver of computer display technology. Today we have digital colour televisions with 4K resolution, with 8K on the horizon. With image resolution now arguably as high as it ever needs to go, attention has shifted to other aspects of image quality such as colour range (gamut), refresh rates and contrast range, known as High Dynamic Range (HDR).

HDR for displays should not be confused with HDR photography[1]. Photographers have been working with HDR images for years, but in that case multiple exposures are used to capture a high dynamic range scene, which is then compressed (tone-mapped) by HDR editing software into a standard dynamic range image. Many of the most recent smartphones have an option to do this trick automatically.
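To make the photographic workflow concrete, here is a minimal sketch of merging bracketed exposures and tone-mapping the result back to SDR, using OpenCV's exposure-merging and tone-mapping classes. The file names and exposure times are illustrative assumptions, not values from the article.

```python
# Sketch of the multi-exposure HDR photography workflow: merge bracketed
# shots into an HDR radiance map, then tone-map back to SDR for display.
import cv2
import numpy as np

# Load several bracketed exposures of the same scene (hypothetical files).
exposures = [cv2.imread(f) for f in ("dark.jpg", "mid.jpg", "bright.jpg")]
times = np.array([1/250, 1/60, 1/15], dtype=np.float32)  # shutter times in seconds

# Merge the 8-bit exposures into a single floating-point HDR radiance map.
merge = cv2.createMergeDebevec()
hdr = merge.process(exposures, times)

# Tone-map (compress) the HDR image down to standard dynamic range.
tonemap = cv2.createTonemapReinhard(gamma=2.2)
sdr = tonemap.process(hdr)

# Convert to 8-bit and save the standard dynamic range result.
cv2.imwrite("result_sdr.jpg", np.clip(sdr * 255, 0, 255).astype(np.uint8))
```

This is essentially what a smartphone's automatic HDR mode does behind the scenes, just with its own merging and tone-mapping algorithms.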

What's so important about High Dynamic Range?

HDR can potentially deliver a bigger impact on image quality than higher resolutions. HDR extends the range of displayed contrast much closer to reality -- far beyond the range we're used to for TV and computer displays. HDR images have deeper blacks, brighter highlights and more detail in both shadows and highlights. When coupled with improvements in colour gamut, the result is much more realistic images.
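As a rough illustration of what "extending the range of displayed contrast" means, dynamic range is often expressed in photographic stops, where each stop is a doubling of luminance. The luminance figures below are typical published values (roughly 100 nits peak for SDR grading and around 1,000 nits peak for many HDR displays), used here as assumptions rather than figures from the article.

```python
# Rough comparison of displayed dynamic range in stops (doublings of luminance).
import math

def dynamic_range_stops(peak_nits: float, black_nits: float) -> float:
    """Dynamic range in stops: log base 2 of the peak-to-black luminance ratio."""
    return math.log2(peak_nits / black_nits)

sdr_stops = dynamic_range_stops(peak_nits=100, black_nits=0.1)     # ~10 stops
hdr_stops = dynamic_range_stops(peak_nits=1000, black_nits=0.005)  # ~17.6 stops

print(f"SDR: ~{sdr_stops:.1f} stops, HDR: ~{hdr_stops:.1f} stops")
```

Under these assumptions an HDR display can reproduce several stops more contrast than an SDR one, which is where the deeper blacks and brighter highlights come from.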

In addition, image quality will be more closely controlled. Until now, image quality control for broadcast television, streamed video and movies on optical disc has been open-ended. The image creators output the highest-quality images they can, based on how their content appears on reference monitors...

Read more from our friends at ZDNet