It's based on the smallest object an eye can see (~100 microns). 20/20 just means you can see 20-point font at 20 feet. Being able to see a letter clearly enough to identify it at a specific distance is different from merely seeing it: you can see a 5-point font at 20 feet, but you may not be able to tell which letter it is, even though you can still see it.
I'm aware of what 20/20 vision entails, but would there be a perceivable difference? I've seen this argued from both sides at different points, and I do understand the distinctions in vision. As far as I'm aware, though, it shouldn't be physically possible for a person with 20/20 vision to notice a tangible difference between the two resolutions, 4K and 1080p, at that range (on the stated TV size, anyway), right?
Sure, but that's why you go with PPI and not resolution. 4K and 1080p are still very noticeable on a 40" screen at 10 feet; it's more a matter of how much you force yourself to notice. It also depends on what part of your eye is viewing the screen, since your eye is not consistent across its viewing area. There are even differences between the sexes: for example, women can see more color and have wider peripheral vision, while men have sharper central vision and better motion tracking.
But you want to slightly exceed what the eye is capable of: you need pixels that exceed what is resolvable, because at that point you won't see a difference in clarity between the screen and the wall it is mounted on. Other things will still come into play to let you know the image is fake, though, like lighting and color variety.
Perception is a key factor, but so is contrast. You may not be able to spot a single black pixel on a white screen (4K at 10 ft), but you will certainly see a white pixel on a black background. The eye is an organic object with a highly advanced processor behind it; that's why you can't define a resolution or refresh rate that "the eye can't see" without looking at how the eye works and how the brain processes information. That's why movies are shot at 24fps: they rely not on what the eye can see, but on how the brain is going to trick the person into seeing motion. Same thing with resolution. Let's say a 4K screen works out to 267 PPI (which for a 16:9 4K panel means roughly a 16.5" diagonal). At that point they no longer need to increase its resolution, because your eye will not pick up anything more, and it does not matter what distance you view it at. But when you scale that same 4K up to, say, 60" and view it from 20 feet, your mind will fill in the gaps so that it still looks as clear. If you decide to get picky about it, though, you will still be able to spot that it is not as clear as real-life background objects.
Last edited by Cyanotical; 2013-02-03 at 09:39 PM. Reason: stupid touchpad making typos
So, in the end, assuming the quality of the picture (contrast/brightness/colour etc.) is perfect, the 1080p and 4K screens should give the same image at a 10 ft viewing range, though there will be differences on closer inspection. And it's more the terminology that Geoffrey Morrison used that is the issue?
Out of curiosity, what would your opinion be on the following?:
"iPhone 4, new iPad 3, and MacBook Pro Retina Displays: The new iPad 3 and MacBook Pro have much lower PPIs than the iPhone 4 but Apple correctly markets them as Retina Displays because they are typically held further away from the eyes and therefore still appear "perfectly" sharp at their proper viewing distance. Below we have calculated the viewing distances needed to qualify as a 20/20 Vision Retina Display (defined as 1 arc-minute visual acuity). For a discussion on the difference between the Acuity of the Retina and 20/20 Vision Acuity see this article.
The iPhone 4 with 326 PPI is a Retina Display when viewed from 10.5 inches or more
The new iPad 3 with 264 PPI is a Retina Display when viewed from 13.0 inches or more
The MacBook Pro with 220 PPI is a Retina Display when viewed from 15.6 inches or more
1920x1080 HDTVs: On the other hand, the average viewing distance for living room HDTVs in America is around 7 to 10 feet, depending on the screen size. So to appear "perfectly" sharp with 20/20 Vision like the iPhone 4 Retina Display, HDTVs only need a proportionally much lower PPI in order to achieve "Retina Display" status and have the HDTV appear "perfectly" sharp and at the visual acuity limit of your eyes.
Existing 40 inch 1920x1080 HDTV is a "Retina Display" when viewed from 5.2 feet or more
Existing 50 inch 1920x1080 HDTV is a "Retina Display" when viewed from 6.5 feet or more
Existing 60 inch 1920x1080 HDTV is a "Retina Display" when viewed from 7.8 feet or more
Since the typical HDTV viewing distances are larger than the minimum distances listed above, the HDTVs appear "perfectly" sharp and at the visual acuity limit of your eyes. At the viewing distances listed above the pixels on a 1920x1080 HDTV will not be visible by a person with 20/20 Vision in exactly the same way as the Retina Displays on the iPhone 4, new iPad 3, and MacBook Pro at their viewing distances. So existing 1920x1080 HDTVs are "Retina Displays" in exactly the same way as the existing Apple Retina Display products. If the HDTVs had a higher PPI or a higher pixel resolution your eyes wouldn't be able to see the difference at their proper viewing distances. So existing 1920x1080 HDTVs are already equivalent to what Apple calls a "Retina Display." When Apple launches its own Apple Television it will almost certainly have a resolution of 1920x1080 and it will be a True Retina Display [for humans with 20/20 Vision at standard HDTV viewing distances]."
www.displaymate.com/news.html#7
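The "Retina" distances quoted above all fall out of one piece of arithmetic: the distance at which a single pixel subtends 1 arc-minute (the 20/20 acuity criterion the article states). Here's a minimal Python sketch of that calculation; the function names are my own, not DisplayMate's:

```python
import math

ARC_MINUTE = math.radians(1 / 60)  # 1 arc-minute in radians

def retina_distance_inches(ppi):
    """Distance at which one pixel subtends 1 arc-minute (20/20 limit)."""
    pixel_size = 1 / ppi                      # pixel pitch in inches
    return pixel_size / math.tan(ARC_MINUTE)  # d = pixel size / tan(angle)

def hdtv_ppi(diagonal_inches, w=1920, h=1080):
    """PPI of a panel from its diagonal size and resolution."""
    return math.hypot(w, h) / diagonal_inches

print(round(retina_distance_inches(326), 1))                # iPhone 4  -> 10.5 in
print(round(retina_distance_inches(264), 1))                # iPad 3    -> 13.0 in
print(round(retina_distance_inches(hdtv_ppi(50)) / 12, 1))  # 50" 1080p -> 6.5 ft
```

The outputs reproduce the article's figures (10.5 in, 13.0 in, 6.5 ft), which at least confirms the list above is internally consistent with the 1 arc-minute definition.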
The problem is that what you view as acceptable and what I view as acceptable are not the same. You can argue this until you are blue in the face, but the only point where a screen is statistically good enough that the resolution does not need to be increased is when it has a PPI of 267.3 or greater. Resolution by itself is pointless, and talking about what X resolution looks like at Y distance just floods the thread with irrelevant information and skirts around the issue.
For a screen to truly reach the limit, it needs to maintain a pixel size at or below the smallest size that can be detected by the eye. If you wanted to increase it just for good measure, use 534 PPI or even 1068 PPI (which would probably be best).
Something to consider: sites often post stuff that isn't really true, often because they have a writer who just wants to get paid. It's rampant in any area that has some subjectivity to it; audio and video review sites are notorious for this, because it's not exactly a hard science that can be disproved with testing.
So the best you can do for a screen is look at the smallest object an eye can see, which is ~100 microns, and make a screen with pixels that size or smaller. And remember that when you make a bigger screen, you need to increase the number of pixels; otherwise you have to increase the size of the pixels.
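Pixel pitch in microns and PPI are just two units for the same quantity, related by the 25,400 microns in an inch. A quick sketch of the conversion (note that the poster's 267.3 PPI threshold corresponds to a ~95-micron pitch, slightly under the 100-micron ceiling quoted here):

```python
# Convert between pixel pitch (microns) and pixel density (PPI).
MICRONS_PER_INCH = 25_400

def ppi_from_pitch(pitch_microns):
    """PPI of a panel whose pixel pitch is the given size in microns."""
    return MICRONS_PER_INCH / pitch_microns

def pitch_from_ppi(ppi):
    """Pixel pitch in microns for a given PPI."""
    return MICRONS_PER_INCH / ppi

print(round(ppi_from_pitch(100)))    # 100-micron pixels -> 254 PPI
print(round(pitch_from_ppi(267.3)))  # 267.3 PPI -> ~95-micron pixels
```

So "pixels of 100 microns or smaller" and "254 PPI or greater" are the same claim stated in different units.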
So, in relation to the CNET article, 4K is not the end. It may be good enough for a 32" computer screen, but if you buy an 84" TV, I'm still gonna come over and tell you how the picture looks like crap compared to my monitor.