Why 4K TVs are stupid in the home

The latest TV technology buzzword is "4K". This magical alphanumeric represents a quadrupling of the pixel count of the now-standard 1080p resolution found on Blu-ray and most high-definition TVs (HDTVs).

A close-up of a plasma TV's pixels.
(Credit: Geoffrey Morrison/CNET)

Have no doubt, manufacturers are going to start pushing 4K (some already are).

The thing is, though, you don't need 4K, because in the home, 4K is stupid.

Check out Ty Pendlebury's 4K primer for more details about what 4K actually is, because I'm going to spend the bulk of this article describing why you don't need it.

As this is going to be a pretty numbers-heavy piece, let me cover the basic terminology upfront. Blu-ray discs, and nearly all modern televisions, are 1080p. This means they have a resolution of 1920x1080 pixels. Get up close to your TV when it's on and you'll see the pixels. They're tiny blocks of red, green and blue (and yellow, if you have certain Sharp LCDs). The image at the top is a close-up of some pixels.

For this entire article, remember we're not talking about content, we're talking about the TV. You can definitely see the difference between content of different resolutions, but that isn't what we're talking about here. We're talking about the HDTV hardware itself.

HD was developed out of necessity given the larger TV screen sizes on the horizon at the time (ooh, 42 inches!). Standard definition, 480i (roughly 640x480 pixels), looks terrible when shown on a screen larger than 28 inches diagonal.

The 4K standard is roughly 4096x2160 pixels. I say roughly, as there isn't a single agreed standard. So if a company says 3840x2160 pixels, that's basically 4K, too (it's more accurately called QFHD, or Quad Full HD). You see, 4K is a cinema standard, and given all the variations in screen widths because of different aspect ratios, it's hard to give one specific number. Suffice it to say, 4K is about double the horizontal and vertical resolution of what you have at home right now.

With the huge screens of most modern movie theatres, and the move toward digital projection, 4K makes a lot of sense. The prevalent 2K (2048x1080 pixels) digital cinema projectors are only slightly higher resolution than 1080p. I've seen a lot of these, and I can often see the pixel structure from most of the seats. You definitely can't with 4K, which is why it's a brilliant idea for movie theatres.

But 4K in the home is stupid. Here's why.

Quad full HD, or 4K, packs in a good deal more pixels than 1080p.
(Credit: CNET)

Brace yourself for some maths

The human eye, for all its amazingness, has a finite resolution. This is why you can read your computer screen from where you're sitting, but not if you're on the other side of the room. Everyone is different, but the average person with 20/20 vision can resolve 1 arcminute. One arcminute is 1/60th of a degree. If you assume your field of vision is 180 degrees (it's not, but go with me here), and you take 1 degree of that, you're able to resolve a 1/60th sliver of that degree. Close up, this means you can see hairs on your arm, wrinkles on your thumb, and so on. At distance, these fine details disappear. If a friend waves at you from across a field, you can probably see the person's thumbs, but not any wrinkles or hair. Far enough away, you probably won't even be able to see thumbs, unless those are some really, really big thumbs.

One arcminute of resolution is a best-case scenario. On a black-on-white vision chart, it holds true. Reduce the contrast between the object and the background, add colour, and these and many other factors limit your ability to resolve fine detail.

Your over-resolutioned TV

Let's bring this back to TVs.

Depending on the technology, a 1080p 50-inch flat panel TV's pixels are approximately 0.023 inch (0.58mm) wide. This is presuming they're square (many aren't) and that there's no inter-pixel gap (there is). The plasma I photographed for the lead image above measured 3 pixels per 1.59mm, which is 0.53mm per pixel. So we're in the ballpark.
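If you want to check that figure yourself, here's a quick Python sketch of the same back-of-the-envelope arithmetic (the helper function and the 16:9 assumption are mine, for illustration; as noted, real pixels aren't perfectly square and do have gaps):

import math

def pixel_pitch_mm(diagonal_in, horiz_px, aspect=16/9):
    # Screen width from the diagonal, via the aspect ratio.
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    # Millimetres per pixel, assuming square pixels and no gaps.
    return width_in * 25.4 / horiz_px

print(round(pixel_pitch_mm(50, 1920), 3))  # ~0.577mm for a 50-inch 1080p panel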

Most people sit about 3 metres from their television. At 3 metres (3000mm), your eye can resolve an object about 0.872mm wide if, as I said above, there's enough difference between it and the background (or its adjacent pixel, in this case). The memories of the Westwood school system that told me I was bad at maths compel me to show my work, so feel free to check:

2 x pi x 3000mm = 18,849.56mm (circumference of a circle, with you at the centre)

18,849.56mm / 360 = 52.36mm (360 degrees in a circle)

52.36mm / 60 = 0.872mm (60 minutes in a degree)
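Or, if you'd rather let a computer do it, here's the same one-arcminute arithmetic as a minimal Python sketch (the 0.872mm above is this figure, just truncated):

import math

def resolvable_mm(distance_mm):
    # Circumference of a circle with you at the centre,
    # divided by 360 degrees, then by 60 arcminutes.
    return 2 * math.pi * distance_mm / 360 / 60

print(round(resolvable_mm(3000), 3))  # ~0.873mm at 3 metres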

This maths, or just looking at your TV, tells you that you can't see individual pixels. What's interesting is that a 720p, 50-inch TV has pixels roughly 0.864mm wide. As in, at a distance of 3 metres, even 720p TVs have pixels too small for your eye to see.

That's right, at 3 metres, your eye can't resolve the difference between otherwise identical 1080p and 720p televisions. Extrapolating this out, you'd have to get a TV at least 76 inches diagonal before you'd start having a pixel visibility problem with 1080p.

Or, you can move closer. Beyond being a maths exercise, let's be realistic. No one's going to sit 3 metres from a really big TV. So if we say 2.5 metres, or 0.727mm on the resolution side, this means you'd need a TV that's bigger than 60 inches to really benefit from 1080p.
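Combining the two calculations gives those crossover sizes directly. Here's a small Python sketch (mine, under the same square-pixel, 16:9, one-arcminute assumptions) that solves for the smallest diagonal at which a panel's pixels reach your eye's limit:

import math

def min_diagonal_in(distance_mm, horiz_px, aspect=16/9):
    eye_limit_mm = 2 * math.pi * distance_mm / 360 / 60  # one arcminute
    width_mm = eye_limit_mm * horiz_px  # screen width where pixels hit that limit
    return width_mm / 25.4 / (aspect / math.hypot(aspect, 1))

print(round(min_diagonal_in(3000, 1920)))  # ~76 inches for 1080p at 3 metres
print(round(min_diagonal_in(2500, 1920)))  # ~63 inches for 1080p at 2.5 metres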

Is there a size/distance where you can see the difference in detail, below the raw pixel-size numbers? Possibly; it depends a lot on the content, the display and the person. Remember, we're not talking about just being able to see something, we're talking about being able to resolve it. You might be able to see a single pixel-width black line on a white screen from great distance, but two black lines separated by a single white line will appear as a single black line. That's detail, and if you're too far away to see it (or the screen isn't big enough), then it's being wasted.

On the other hand, "seeing pixels" also means seeing the pixel structure around objects, square blocks for curves, that sort of thing. So there is such a thing as too close/too big, but it's much farther/bigger than most people realise.

The real world tends to get even more vague, which we'll get to in a moment.

4K 4 U, K?

So if your eye can't tell the difference between 720p and 1080p on nearly all modern televisions, what's the need for 4K?

Excellent question. There isn't one. Not as far as TVs go, anyway. You'd need a 2160p TV over 150 inches diagonal before you'd be able to see the pixels. On a 4K 50-inch TV, the pixels would be roughly 0.288mm wide.

Where's the crossover where 1080p and 4K become noticeable? It's not exact, because of all the above-mentioned variables, but suffice it to say that at 3 metres, it's somewhere well above 76 inches.
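Running the same numbers for a 3840-pixel-wide panel bears this out (same illustrative assumptions as the earlier sketches):

import math

aspect = 16 / 9
width_frac = aspect / math.hypot(aspect, 1)  # screen width as a fraction of the diagonal
eye_limit_mm = 2 * math.pi * 3000 / 360 / 60  # one arcminute at 3 metres

print(round(50 * width_frac * 25.4 / 3840, 3))  # ~0.288mm pixels on a 4K 50-inch TV
print(round(eye_limit_mm * 3840 / 25.4 / width_frac))  # ~151-inch diagonal before pixels show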

Real world-ish

Let's put this in the real world. I sit 2.7 metres away from a 102-inch screen. At that distance, I can't see the pixel structure of a 1080p projector. If I lean forward a bit, so my eyes are 2.1 to 2.4 metres from the screen, I can see pixels on bright images. If I zoom the projector out to fill all 127.75 inches of my 2.35:1 screen, I can sometimes see pixels depending on the projector.

At this extreme size, and seated far closer than most people would feel comfortable, I would probably be able to see a difference with 4K.

When I reviewed JVC's DLA-X90R, I sure didn't see an increase in resolution. Admittedly, this is far from conclusive, as there's no native 4K content readily available (and the JVC couldn't accept it even if there were). If I sat about 1.5 metres from the screen, I could just make out the pixels. As more 4K displays become available, I'll see if I can find that sweet spot of viewing distance.

There might be 4K projectors, but 4K content for the home is pretty much non-existent.
(Credit: JVC)

Your home

So that's the advantage of 4K: you can sit way closer to your television (which no one will), or you can get a way bigger television (also unlikely).

When you increase the resolution so significantly (and again, this is all assuming native 4K content, which hasn't been discussed), factors like the contrast ratio, the brightness and, in the case of projectors, the lens and screen material all become significantly bigger issues.

A few years ago, I did a TV face-off with trained TV reviewers and untrained participants, pitting Pioneer's Kuro plasma (768p) against several 1080p LCDs and plasmas. Not one person noticed the Kuro wasn't 1080p. In fact, most lauded it for its detail. Why? Its contrast ratio was so much better than the other TVs' that it appeared to have better resolution. As far as your eye is concerned, resolution is the difference between light and dark. If that difference is more pronounced, as it is on high-contrast-ratio displays, the display will have more apparent resolution.

Passive 3D (aka one more thing)

OK, there's one way that 4K is actually a great idea: passive 3D. Current passive 3D displays (from LG, Toshiba and Vizio) are half resolution when viewing 3D. Each eye is getting 1920x540 pixels. Manufacturers claim your brain combines these into one "full HD" image, but from where I usually sit, I can see lines in the image, so I call foul.

With 4K, though, passive 3D creates a 3840x1080-pixel image per eye, which is more than enough so that you don't see any lines.

Glasses-less (autostereoscopic) 3D displays have the same basic problem, with certain pixels reserved for certain eyeballs. Here, 4K and higher is also a good idea; one could say it's even a requirement.

Conclusion

The 4K standard is already here in the home projector space, more or less. Sony makes a US$25K projector that's native 4K, while JVC has several models with the "e-Shift" pixel upconverter that puts 3840x2160 pixels on screen even though the LCOS chips are 1920x1080 pixels. A case can be made for 4K with larger screens at home. At the moment, though, light output limits screen size far more than resolution. For home projectors, let's just shrug and ask, "OK, why not?"

But with televisions, 4K is stupid. Stupid, stupid, stupid. For every one of you thinking you'll rearrange your living room to sit closer to the screen, I'm positive there are thousands of others who wouldn't (or wouldn't be allowed to).

Sure, screen sizes are going up, but how many of you are really going to put an 85-inch screen in your home and sit close enough to it for 4K to matter?

Don't believe me? Get a chair, and sit close enough to your TV so you can just see the pixel structure. Now watch an entire TV show like that. Now convince your family to do the same.

There's this feeling of inevitability with 4K, like because we can do it, we will do it. I just wanted to point out early that regardless of what the marketing and hype will say, you don't need 4K.

So if someday there's a choice between a 4K 80-inch OLED and a 1080p 80-inch OLED, sure, pick the 4K. Move a little closer to it and presto. But inevitably there will be even smaller 4K displays, and unless you're sitting on top of them, there's no point.

Via CNET



Comments



byikl posted a comment   
Australia

OK, so I'm a gamer, I play a lot, and so I have a TV at the end of my bed for such purposes. 4K makes perfect sense, since my head is literally 2 ft from it.

 

byikl posted a reply   
Australia

sorry for the shitty spelling

 

pilotga posted a comment   
United States

Even with Blu-ray players that can upscale to 4K, unless there is the supporting hardware (in this case, an HDTV that will accept 4K input and has a panel with 4096x2160 pixels) and content, what is the use? Who in their right mind would buy a 50" or bigger HDTV and sit close to it to see the pixels? I have a 50" 3D plasma at 1080p and I sit at least 12 feet from the TV. The picture is so life-like it scares me. Now, 3D started out as a great idea, but just like everything else, we need more content and better 3D active glasses. I would like to know: if upscaling to 4K, would the picture appear 3D, since the more pixels, the more detail and the better the depth perception?

 

tsvgeekguy posted a comment   

Of course there is no native content for 4K "yet". Just as there was no content at all for Blu-ray 10 years ago, and DVD 15 years before that.
Without the ability to show the quality, why would anyone spend more money to produce higher-quality content?
As a former retail store manager, I can tell you that while there may be no obvious difference in appearance to 98% of the population, there will always be the 2% who will buy anything "new" and "improved", whether it actually is improved or not.
It is this striving to improve that separates us from the chimpanzees. If the writer of this article would prefer, he can take his wireless and climb back into the trees. I doubt we will notice his absence.

 

HDJudas posted a comment   
Canada

Someone apparently has a bad case of Macular Degeneration of some sort.

I'm not even going to bother pointing out sources, but this article, even setting aside the shoddy math, is terrible.

Claiming that anything that increases the resolution and detail that can be obtained is stupid, regardless of the "size" of the screen that a specific high resolution is put on, and then claiming that no one would see it, as being "stupid. Stupid, stupid, stupid", clearly indicates a lack of progression in the right direction. It ignores applications of such technology within homes for graphic artists and creators, not to mention the fact that the argument raised has no real solid validity.

The whole article is based on the assumption that everyone has the same level of "sight" and the same field of view, even when it was clearly stated that "Everyone is different, but the average person with 20/20 vision can resolve 1 arcminute." Not to mention that there are hundreds of thousands, if not millions, of people who sit or stand or lie up close and personal to their big screens. While some people live with near tunnel vision due to any number of naturally or unnaturally occurring circumstances, others have an exceptionally wide field of vision and retain full clarity and detail in everything they see.

What he or she or I or the general public can see is up to them to decide, even if it's somewhat placebo-effect-style acceptance. There are most definitely some clear benefits that go along with a larger format being applied to modern-sized screens, with the move to larger screens made possible without the terrible effect of looking as bad as standard definition did when seen on screens exceeding 27". And even then there was always that myth of "don't sit too close to the TV, you'll go blind"... most of which may actually have stemmed from people insisting that we all sit a good distance from the screen. Actually, it wasn't at all uncommon for most households to try to ensure that their above-19" TVs remained a good distance away, usually further than we see in modern homes today. This was due to the low resolution and interlaced format used; larger screens amplified the blurriness of this issue.

With the current 720p and 1080p formats, at an average distance of 3 metres, 720p is acceptable to a good portion of people up to about 32" before some noticeable "lines" and "pixels" start to appear. Yes, you read that right: 720p at 32".

720p on a monitor used by any average computer user on a screen above 12" isn't acceptable anymore, due to the clear lack of overall desktop real estate and the fact that the level of detail on a screen with a much higher resolution does in fact look much more detailed and clearer.

1080p is a requirement when anything 40" or larger is being used. The line between remaining crisp and clear and experiencing the same issue as with 720p occurs at around the 52" and 55" mark; this is amplified on 60", 65" and much larger screens. This is where "2160p" turns into a requirement.

Venturing back into the monitor realm, where most new technology is first adopted: 2560x1600 monitors, which have twice the pixel count of 1080p, have been available since the introduction of the Dell 30" and Apple Cinema Displays. Combine that with Eyefinity and Nvidia's similar multi-display solution, and it isn't hard to find MANY users playing video games at a computer, sitting approximately 1 metre from their screen(s), at a resolution almost within "2160p": three times the pixels of 1080p, crammed into a viewable space (in the case of three 1080p 22" monitors in portrait mode) of about 32". And the writer of this article is claiming that it's a total waste, stupid, unnecessary.

I think the writer should take a bit of a lesson in dot pitch/pixel density and also figure out how immersion plays a big role, as well as how many households, especially enthusiasts, prefer to watch or play on such screens, and why there is a demand for high resolution.

I could also mention a single thing that would further disprove much of what is being said in the article: modern games that run at the full native resolution of 1080p on, let's say, a 65" HDTV without full-screen anti-aliasing. See, if there is one thing people will notice while playing games on ANY monitor or ANY HDTV, it's that aliasing still occurs, and it's only solved by applying anti-aliasing at a high level, or by using a much higher resolution on the same or a smaller format screen. Until this is eliminated completely, there will ALWAYS be a considerable and noticeable difference.

 

Cougie posted a comment   
Australia

Dude, I hope your article doesn't slow down the progress of technology. I'm still hoping for a holodeck in my lifetime.

 

Jive Turkey posted a comment   
Australia

"For this entire article, remember we're not talking about content, we're talking about the TV."
"Admittedly, this is far from conclusive, as there's no native 4K content readily available..."
We all know the current gen consoles don't render 4K. That doesn't mean the next gen won't.

