Farewell ‘Full HD’, Forget 4K, Make Way for Ultra High Definition

Things have been quiet on the video front for a little while, nothing too major. But in the past week there were two bits of news that give us a look into the future. The biggest, literally, is news on UHDTV. That’s Ultra High Definition TV. While ‘Full HD’ is 1080p, aka 1920×1080, with just over two million pixels, and 4K encompasses several resolutions – from 4096×1714 (around seven million pixels) to 4096×3112 (nearly 13 million pixels) – UHD is 7680×4320, or just over 33 million pixels. That’s 16 times the pixels of Full HD video. Broadcasting & Cable reports that the ITU Study Group on Broadcasting Services has reached agreement on the most pertinent technical aspects of the UHDTV standards.
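If you want to check the pixel math, here’s a quick back-of-the-envelope calculation (Python, purely illustrative – the figures are just the resolutions quoted above):

```python
# Pixel counts for the resolutions discussed above, relative to Full HD.
resolutions = {
    "Full HD (1920x1080)": (1920, 1080),
    "4K (4096x1714)":      (4096, 1714),
    "4K (4096x3112)":      (4096, 3112),
    "UHDTV (7680x4320)":   (7680, 4320),
}
full_hd_pixels = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / full_hd_pixels:.1f}x Full HD)")
# UHDTV works out to 33,177,600 pixels -- exactly 16x Full HD's 2,073,600.
```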

But don’t expect UHD sets to be on the market any time soon. For now it will only be used for special events. For example, there are plans to cover some of the 2012 London Olympic Games in UHDTV for display at other venues. If UHD displays do come home, they most likely won’t be for displaying UHD images, but rather for glasses-free 3D. I was fairly dismissive of glasses-free 3D sets in my rant about the current state of the industry a little while back, mainly because of the compromises required for them to work. But the more pixels on the display, and the more pixels per inch, the more ‘sweet spots’ can be created, allowing for a wider viewing angle. That means less need to sit in specific locations relative to the screen to perceive the 3D effect.

Now, you may think that a 7680×4320 display must be huge, but that’s not necessarily the case. It’s a function of pixels per inch. Consider the iPhone’s retina display with 326ppi. At that density a UHD display would be only about 23.6×13.3 inches – roughly a 27 inch display. Now, that’s an extreme example, but it goes to show that a UHD display can be scaled to different physical sizes using today’s display technology. Of course, it wouldn’t come cheap.
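Checking that yourself is a one-liner per dimension; the 326ppi figure is just the iPhone’s retina density reused as an illustration:

```python
from math import hypot

def display_size(width_px, height_px, ppi):
    """Physical width, height, and diagonal (in inches) at a given pixel density."""
    w_in, h_in = width_px / ppi, height_px / ppi
    return w_in, h_in, hypot(w_in, h_in)

w, h, d = display_size(7680, 4320, 326)
print(f"{w:.1f} x {h:.1f} inches, {d:.1f} inch diagonal")
# -> roughly 23.6 x 13.3 inches, about a 27 inch diagonal
```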

Ignoring the displays, how would you even get a UHD image? 16 times the pixels would mean, worst case, 16 times the data for the same quality image – all else being equal. Now, in the real world it doesn’t quite work that way since neighboring pixels will share data and it isn’t a linear scale, but let’s go with the worst case for now. How would you ever get that much data? You’d be looking at an 800GB Blu-ray, which is far more than even lab versions have hit.
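As a rough sketch of that worst case – using a standard 50GB dual-layer Blu-ray as the baseline, purely for illustration:

```python
# Worst-case scaling: 16x the pixels -> 16x the data, all else being equal.
dual_layer_bluray_gb = 50   # baseline: an H.264 1080p title filling a dual-layer disc
uhd_pixel_factor = 16
print(f"{uhd_pixel_factor * dual_layer_bluray_gb} GB")  # -> 800 GB
```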

The good news is that everything else doesn’t have to be equal. The other recent news is that the Joint Collaborative Team on Video Coding (JCT-VC), a cooperative effort between the International Telecommunication Union and the Moving Picture Experts Group, is making progress on the standard for High Efficiency Video Coding, aka H.265. This is expected to be the successor to today’s champ, H.264. The first draft is expected to be released in February 2012, with a final standard in January 2013. H.265 takes advantage of the growth in processor capabilities. It is more computationally intensive than H.264, but should provide compression 25% to 50% better. That could mean an H.265 UHD image would only have eight times the data of an H.264 1080p image – and with real-world compression effects factored in, even less. That starts to bring things into the realm of possibility. Back in 2008 Pioneer demonstrated a Blu-ray disc capable of holding 400GB on 16 layers of 25GB each. And BDXL, which is already a commercial product, holds 100GB/128GB on three or four ~33GB layers. And work is underway to push Blu-ray all the way to 1TB. So by the time UHD displays are viable for the home, we’ll probably have a Blu-ray disc capable of holding UHD content. Streaming? Does your ISP offer Gigabit connections?
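Here’s the same back-of-the-envelope math with the projected H.265 savings applied (illustrative only – real encodes will vary with the content):

```python
# How the worst-case 16x multiplier shrinks if H.265 hits its 25-50% savings target.
baseline_h264_1080p_gb = 50      # assumed: a dual-layer Blu-ray's worth of H.264 1080p
uhd_pixel_factor = 16
for savings in (0.25, 0.50):
    factor = uhd_pixel_factor * (1 - savings)
    print(f"{savings:.0%} savings -> {factor:.0f}x the data, ~{factor * baseline_h264_1080p_gb:.0f} GB")
# 25% savings -> 12x, ~600 GB; 50% savings -> 8x, ~400 GB
```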

But H.265 will have a more immediate impact. You don’t have to be encoding UHD to use it. That 25-50% savings applies to all content. And with mobile data plans all heading to tiered pricing, bandwidth caps on home broadband, and the explosion of streaming HD content, every bit helps. Just think: for the same amount of data you could have up to twice the run time. Of course, even once H.265 is finalized it will probably take a year or two for it to make it into silicon and then into new devices. And you will most likely have to buy a new device to use it. Most devices, like set-top boxes and smartphones, implement video decoding in dedicated silicon, so you’d need new hardware to support H.265. Powerful PC CPUs and graphics cards might be able to do it with a software update, but otherwise it means a replacement.
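To make the bandwidth angle concrete, here’s a hypothetical comparison – the bitrate and cap numbers are made up for illustration, not taken from any particular service:

```python
# Hours of video per month under a fixed data cap: H.264 vs. an H.265 stream
# at half the bitrate (the optimistic end of the 25-50% savings range).
cap_gb = 250                 # hypothetical monthly bandwidth cap
h264_mbps = 8.0              # hypothetical H.264 HD stream
h265_mbps = h264_mbps / 2    # same quality at 50% savings

def hours(bitrate_mbps, cap_gb):
    return cap_gb * 8000 / bitrate_mbps / 3600   # GB -> megabits, then seconds -> hours

print(f"H.264 at {h264_mbps} Mbps: ~{hours(h264_mbps, cap_gb):.0f} hours")
print(f"H.265 at {h265_mbps} Mbps: ~{hours(h265_mbps, cap_gb):.0f} hours")
# Same cap, roughly double the run time.
```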

UHDTV from Broadcasting & Cable. H.265 news from Multichannel News via Zatz Not Funny.


  • Fanfoot

    Interesting.  Not sure the retina display comparison is all that relevant since the reason it was done is that you hold the phone about arm’s length from your face and at THAT distance the 300ppi or so means you can’t see the pixels anymore.  Given typical viewing distances for TVs are in the 6 ft+ range I’m not sure this would ever make economic sense on a TV that small…

    Also we know (and you know) the data rate doesn’t scale linearly with the number of pixels.  If we look at Cable/MPEG-2 VOD streams, a 1920×1080 image has 6 times the number of pixels vs a 720×480 image but typically is encoded at 15Mbps vs. 3.75Mbps so 4X.  Or for an h.264 comparison maybe AT&T’s 7.5Mbps HD vs. 2.5Mbps for SD ratio of 3X…

    Personally I’m more interested in something I might see sooner rather than later, meaning 4K displays.  I’m assuming we’ll actually see some 4K activity at CES this coming January, especially wrt those passive 3D panels.  Would at the very least allow them to deliver “Full HD” resolution despite the loss of half their pixels…

    • http://www.gizmolovers.com/ MegaZone

      I used the retina display just as an example that high resolutions don’t have to mean monster screens.  I don’t think they’d make a full-size television with a PPI that high, but drop it a little lower and you could have displays of maybe 40″ or so.  For glasses-free 3D, having a high PPI makes sense.  You might display the same data on multiple pixels, with multiple left/right pairs shielded to create more viewing angles.

      As for the compression, I did point out that it would be a worst-case scenario and that in the real world it isn’t a linear scaling.  Since compression operates on ‘blocks’ of an image, if you take the same image and just scale up the resolution you’ll have larger areas of the higher-resolution image that are the same, or similar, and can be better represented in a compressed format.  Just about the worst case for compression is random noise – like an analog tuner receiving nothing, showing random dots.

      • Mitch Album 514

        In addition, this doesn’t really make sense considering that all HD video (Blu-rays, Apple TV, etc.) will be stretched out and most likely very pixelated.

        • http://www.gizmolovers.com/ MegaZone

          Actually not, if the pixel density is higher.  Think of it this way – and this is a simplification – if you have four smaller pixels packed into the same space as one larger pixel, that’s just 2x the PPI.  And if they all display the same color, the effect on the eye is identical.  And that’s 4x the pixels in the same area.

          4x the PPI means 16x the pixels in the same area.  So a UHD display the same physical size as a Full HD display has 4x the PPI and 16x the pixels.  And that should make for a much *sharper* image, not at all stretched out or pixelated – quite the opposite.

          Just as an SD image on a small HD display doesn’t look stretched out or pixelated – if the image processing is good, anyway.  It is only when you *physically* enlarge the display that it becomes an issue, because then the granularity of the source image starts to show.

          A 61″ UHD display showing Full HD content would probably look as good as, if not better than, my 61″ Full HD DLP, or any 61″ Full HD display.  Again, provided the image processing is good enough to properly populate those dense pixels.  But if you start making larger displays, the pixels become more evident.  That’s why you need to sit farther away from large displays.  I can sit a couple of feet from a 1080p laptop display and HD video looks good.  But if I sit the same distance from my HDTV I can see pixels.  Ten feet away it looks fine.

          I’ve seen 100+” HDTVs at CES, and at 10 feet I can see pixels.  With a 100″ UHD at ten feet you probably wouldn’t see the pixels, but the image detail (given the same HD source) would be the same.

          Same effect with the retina display being that much sharper than the old display at roughly the same physical size.  They just upped the pixel density.  The same content looks better on the higher density display.