  1. #41
    Quote Originally Posted by Cyanotical
    no, it doesn't

    Let's start with what resolution is: it is the number of pixels resolved in a limited area. It specifically has nothing to do with pixel size or pixel density.

    4K on a 100 ft screen will still look as bad as 1080p on a 5 ft screen. If you want to defeat the eye, you don't need to look at the sharpness of a person's vision, or the resolution; you need to look at the pixel size and pixel density.

    A perfect screen would have a pixel that is about 90 microns across with less than a 5-micron gap between pixels. This means a screen needs a pixels-per-square-inch count of 71,486, or 267.3 PPI.

    That means a 60" flatscreen needs a resolution of 13964 x 7850, and that's only for a 60"; a 70" screen needs a resolution of 16287 x 9158, which is much higher than the 4K or 8K resolutions shown on Sharp's 84" screens at CES.

    It's not that this is impossible; we have smartphones and tablets that approach or exceed 267 PPI. But the thing to remember is that you need to increase the resolution with the screen size to maintain the PPI.

    Next time you quote an article, make sure it's not an editorial written by an idiot who skipped biology class.
    I'm curious: at a 10 ft viewing distance, would someone with 20/20 vision actually be able to see a difference between, say, 105 PPI and 267 PPI on a 40" screen? It seems as though the resolutions you've stated would be more for point-blank viewing range, no?

    Edit: Minor typo.
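
    A quick sanity check of the arithmetic quoted above, as a minimal sketch: a 90-micron pixel plus a 5-micron gap gives a 95-micron pitch, and the screen sizes are assumed to be 16:9 (the aspect ratio the quoted figures imply).

    ```python
    import math

    PITCH_UM = 90 + 5                     # pixel pitch in microns: pixel + gap
    PPI = 25_400 / PITCH_UM               # 25,400 microns per inch -> ~267.4 PPI
    print(f"PPI: {PPI:.1f}")
    print(f"Pixels per square inch: {PPI ** 2:,.0f}")  # ~71,486

    def resolution_for(diagonal_in, ppi, aspect=(16, 9)):
        """Horizontal x vertical pixel counts needed at a given PPI."""
        w, h = aspect
        scale = diagonal_in / math.hypot(w, h)
        return round(w * scale * ppi), round(h * scale * ppi)

    for size in (60, 70):
        print(f'{size}": {resolution_for(size, PPI)}')
    # 60" -> ~(13982, 7865) and 70" -> ~(16312, 9175), close to the quoted
    # 13964 x 7850 and 16287 x 9158 (small rounding differences)
    ```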

  2. #42
    Cyanotical
    Quote Originally Posted by Shinzai
    I'm curious: at a 10 ft viewing distance, would someone with 20/20 vision actually be able to see a difference between, say, 105 PPI and 267 PPI on a 40" screen? It seems as though the resolutions you've quoted would be more for point-blank viewing range, no?
    It's based on the smallest object an eye can see (~100 microns). 20/20 just means you can see 20-point font at 20 feet, and being able to see a letter clearly enough to understand what it is at a specific distance is different from merely seeing it: you can see a 5-point font at 20 feet, but you may not be able to tell what letter it is, even though you can still see it.
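
    For what it's worth, a common way to frame this question (an assumption here, not something from the post) is the 1-arcminute acuity model: the smallest feature a 20/20 eye resolves grows linearly with viewing distance. A small sketch:

    ```python
    import math

    ARCMIN = math.radians(1 / 60)  # one arcminute in radians

    def smallest_feature_um(distance_in):
        """Smallest resolvable feature (microns) at a viewing distance in inches."""
        return distance_in * math.tan(ARCMIN) * 25_400

    print(f'{smallest_feature_um(13):.0f} um at 13"')     # ~96 um, near the ~100 um figure
    print(f'{smallest_feature_um(120):.0f} um at 10 ft')  # ~887 um: far coarser detail suffices
    ```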

  3. #43
    Quote Originally Posted by Cyanotical
    It's based on the smallest object an eye can see (~100 microns). 20/20 just means you can see 20-point font at 20 feet, and being able to see a letter clearly enough to understand what it is at a specific distance is different from merely seeing it: you can see a 5-point font at 20 feet, but you may not be able to tell what letter it is, even though you can still see it.
    I'm aware of what 20/20 vision entails, but would there be a perceivable difference? I've seen this argued from both sides at different points and do understand the differences in vision, but as far as I'm aware, it shouldn't be physically possible for a person with 20/20 vision to notice a tangible difference between the two resolutions (4K and 1080p) at that range, on the stated TV size anyway, right?

  4. #44
    Cyanotical
    Quote Originally Posted by Shinzai
    I'm aware of what 20/20 vision entails, but would there be a perceivable difference? I've seen this argued from both sides at different points and do understand the differences in vision, but as far as I'm aware, it shouldn't be physically possible for a person with 20/20 vision to notice a tangible difference between the two resolutions (4K and 1080p) at that range, on the stated TV size anyway, right?
    Sure, but that's why you go with PPI and not resolution. The difference between 4K and 1080p is still very noticeable on a 40" screen at 10 feet; it's more a matter of how much you force yourself to notice, and it also depends on what part of your eye is viewing the screen. Your eye is not consistent across its viewing area, and there are even differences between genders: for example, women can see more color and have wider peripheral vision, while men have sharper center vision and better motion tracking.


    But you want to slightly exceed what the eye is capable of: the pixels need to be finer than what the eye can resolve. At that point, you won't see a difference in clarity between the screen and the wall it is mounted on, but other things will still come into play to let you know the image is fake, like lighting and color variety.

    Perception is a key factor, but so is contrast: you may not be able to spot a single black pixel on a white screen (4K at 10 ft), but you will certainly see a white pixel on a black background. The eye is an organic object with a highly advanced processor behind it; that's why you can't pick a resolution or refresh rate that "the eye can't see" without looking at how the eye works and how the brain processes information. That's why movies are shot at 24fps: they rely not on what the eye can see, but on how the brain will trick the person into seeing motion. Same thing with resolution. Say a 4K 30" screen were 267 PPI; then its resolution would no longer need to increase, because your eye will not pick up anything more, no matter what distance you view it from. But when you scale that same 4K up to, say, 60" and view it from 20 feet, your mind will fill in the gaps so that it still looks clear, though if you get picky about it, you will still be able to spot that it is not as clear as real-life background objects.
    Last edited by Cyanotical; 2013-02-03 at 09:39 PM. Reason: stupid touchpad making typos
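
    As a rough check of the 40"-at-10-feet question under the standard 1-arcminute model (a simplification; the post above argues perception involves more than geometry):

    ```python
    import math

    def pixel_arcmin(diagonal_in, h_pixels, distance_in, aspect=(16, 9)):
        """Angle subtended by one pixel, in arcminutes."""
        w, h = aspect
        width_in = diagonal_in * w / math.hypot(w, h)
        pitch_in = width_in / h_pixels          # one pixel's width in inches
        return math.degrees(math.atan(pitch_in / distance_in)) * 60

    for name, px in (("1080p", 1920), ("4K", 3840)):
        print(name, round(pixel_arcmin(40, px, 120), 2), "arcmin")
    # 1080p -> ~0.52 arcmin, 4K -> ~0.26 arcmin: both below the 1-arcminute
    # threshold, so by this model the two are hard to tell apart at 10 ft
    ```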

  5. #45
    Quote Originally Posted by Cyanotical
    Sure, but that's why you go with PPI and not resolution. The difference between 4K and 1080p is still very noticeable on a 40" screen at 10 feet; it's more a matter of how much you force yourself to notice, and it also depends on what part of your eye is viewing the screen. Your eye is not consistent across its viewing area, and there are even differences between genders: for example, women can see more color and have wider peripheral vision, while men have sharper center vision and better motion tracking.

    But you want to slightly exceed what the eye is capable of: the pixels need to be finer than what the eye can resolve. At that point, you won't see a difference in clarity between the screen and the wall it is mounted on, but other things will still come into play to let you know the image is fake, like lighting and color variety.

    Perception is a key factor, but so is contrast: you may not be able to spot a single black pixel on a white screen (4K at 10 ft), but you will certainly see a white pixel on a black background. The eye is an organic object with a highly advanced processor behind it; that's why you can't pick a resolution or refresh rate that "the eye can't see" without looking at how the eye works and how the brain processes information. That's why movies are shot at 24fps: they rely not on what the eye can see, but on how the brain will trick the person into seeing motion. Same thing with resolution. Say a 4K 30" screen were 267 PPI; then its resolution would no longer need to increase, because your eye will not pick up anything more, no matter what distance you view it from. But when you scale that same 4K up to, say, 60" and view it from 20 feet, your mind will fill in the gaps so that it still looks clear, though if you get picky about it, you will still be able to spot that it is not as clear as real-life background objects.
    So, in the end, assuming the quality of the picture as far as contrast/brightness/colour etc. goes is perfect, the 1080p and 4K screens should give the same image at a 10 ft viewing range, though there will be differences on closer inspection. And it's more the terminology that Geoffrey Morrison used that is the issue?

    Out of curiosity, what would your opinion be on the following?:

    "iPhone 4, new iPad 3, and MacBook Pro Retina Displays: The new iPad 3 and MacBook Pro have much lower PPIs than the iPhone 4 but Apple correctly markets them as Retina Displays because they are typically held further away from the eyes and therefore still appear "perfectly" sharp at their proper viewing distance. Below we have calculated the viewing distances needed to qualify as a 20/20 Vision Retina Display (defined as 1 arc-minute visual acuity). For a discussion on the difference between the Acuity of the Retina and 20/20 Vision Acuity see this article.

    The iPhone 4 with 326 PPI is a Retina Display when viewed from 10.5 inches or more
    The new iPad 3 with 264 PPI is a Retina Display when viewed from 13.0 inches or more
    The MacBook Pro with 220 PPI is a Retina Display when viewed from 15.6 inches or more

    1920x1080 HDTVs: On the other hand, the average viewing distance for living room HDTVs in America is around 7 to 10 feet, depending on the screen size. So to appear "perfectly" sharp with 20/20 Vision like the iPhone 4 Retina Display, HDTVs only need a proportionally much lower PPI in order to achieve "Retina Display" status and have the HDTV appear "perfectly" sharp and at the visual acuity limit of your eyes.

    Existing 40 inch 1920x1080 HDTV is a "Retina Display" when viewed from 5.2 feet or more
    Existing 50 inch 1920x1080 HDTV is a "Retina Display" when viewed from 6.5 feet or more
    Existing 60 inch 1920x1080 HDTV is a "Retina Display" when viewed from 7.8 feet or more

    Since the typical HDTV viewing distances are larger than the minimum distances listed above, the HDTVs appear "perfectly" sharp and at the visual acuity limit of your eyes. At the viewing distances listed above the pixels on a 1920x1080 HDTV will not be visible by a person with 20/20 Vision in exactly the same way as the Retina Displays on the iPhone 4, new iPad 3, and MacBook Pro at their viewing distances. So existing 1920x1080 HDTVs are "Retina Displays" in exactly the same way as the existing Apple Retina Display products. If the HDTVs had a higher PPI or a higher pixel resolution your eyes wouldn't be able to see the difference at their proper viewing distances. So existing 1920x1080 HDTVs are already equivalent to what Apple calls a "Retina Display." When Apple launches its own Apple Television it will almost certainly have a resolution of 1920x1080 and it will be a True Retina Display [for humans with 20/20 Vision at standard HDTV viewing distances]."

    www.displaymate.com/news.html#7
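
    All of the DisplayMate distances above fall out of a single formula: the distance at which one pixel subtends 1 arcminute, roughly 3438 / PPI inches. A minimal sketch reproducing the quoted figures:

    ```python
    import math

    def retina_distance_in(ppi):
        """Viewing distance (inches) beyond which a pixel subtends < 1 arcminute."""
        return 1 / (ppi * math.tan(math.radians(1 / 60)))

    for device, ppi in (("iPhone 4", 326), ("iPad 3", 264), ("MacBook Pro", 220)):
        print(f"{device}: {retina_distance_in(ppi):.1f} in")
    # iPhone 4: 10.5 in, iPad 3: 13.0 in, MacBook Pro: 15.6 in

    def hdtv_ppi(diagonal_in, h_pixels=1920, v_pixels=1080):
        """Diagonal PPI of a 1920x1080 panel of the given size."""
        return math.hypot(h_pixels, v_pixels) / diagonal_in

    for size in (40, 50, 60):
        d_ft = retina_distance_in(hdtv_ppi(size)) / 12
        print(f'{size}" 1080p: {d_ft:.1f} ft')
    # 40" -> 5.2 ft, 50" -> 6.5 ft, 60" -> 7.8 ft, matching the quoted figures
    ```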

  6. #46
    Cyanotical
    The problem is that what you view as acceptable and what I view as acceptable are not the same. You can argue this until you are blue in the face, but the only point where a screen is statistically good enough that the resolution no longer needs to increase is when it has a PPI of 267.3 or greater. Resolution alone is pointless, and talking about what X resolution looks like at Y distance just floods the screen with irrelevant information and skirts around the issue.

    For a screen to truly reach the limit, it needs to maintain a pixel size at or below the smallest size the eye can detect. If you wanted to increase it just for good measure, use 534 PPI or even 1068 PPI (which would probably be best).

    Something to consider: sites often post stuff that isn't really true, often because they have a writer who just wants to get paid. It's very rampant in any area that has some subjectivity to it; audio and video review sites are notorious for this, because it's not exactly a hard science that can be disproved with testing.

    So the best you can do for a screen is look at the smallest object an eye can see, which is about 100 microns, and make a screen with pixels that size or smaller. And remember that when you make a bigger screen, you need to increase the number of pixels; otherwise you have to increase the size of the pixels.

    So, in relation to the CNET article, 4K is not the end. It may be good enough for a 32" computer screen, but if you buy an 84" TV, I'm still gonna come over and tell you how the picture looks like crap compared to my monitor.
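
    Taking the 267.3 PPI target above at face value, the required pixel count scales with screen size (16:9 geometry assumed), which is the point about big screens:

    ```python
    import math

    TARGET_PPI = 267.3

    def required_width_px(diagonal_in, aspect=(16, 9)):
        """Horizontal pixels needed to hit TARGET_PPI at a given diagonal."""
        w, h = aspect
        return round(diagonal_in * w / math.hypot(w, h) * TARGET_PPI)

    for size in (32, 60, 84):
        print(f'{size}": need {required_width_px(size)} px wide (4K is 3840, 8K is 7680)')
    # 32" -> ~7455 px wide, 60" -> ~13978, 84" -> ~19569: well beyond 8K
    ```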

  7. #47
    Quote Originally Posted by Cyanotical
    A Retina MacBook is not close to 4K; 2560x1600 and the fun 1440x900 with doubled pixels are not really even that close:
    Its resolution is actually 2880x1800 (about 37.5% fewer pixels than 3840x2160 UHD). And I said 'close to 4K' because it is the highest-resolution consumer laptop screen I am aware of, and the only HiDPI display I was able to test out in person.

  8. #48
    Cyanotical
    Quote Originally Posted by mafao
    Its resolution is actually 2880x1800 (about 37.5% fewer pixels than 3840x2160 UHD). And I said 'close to 4K' because it is the highest-resolution consumer laptop screen I am aware of, and the only HiDPI display I was able to test out in person.
    2880x1800 is a 1440x900 base with doubled pixels; Apple has stopped doing this because it's not a true 2880 resolution. The mid-2012 and newer Retinas come with a true 2560x1600 display.

    Even so, 2880x1800 has only 62.5% of the pixels of the 3840x2160 UHD standard, which itself is not true (DCI) 4K.
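
    For reference, the percentages in the last two posts are pixel-count ratios against 3840x2160 UHD, not ratios of horizontal resolution. A quick check:

    ```python
    # Pixel counts for the resolutions under discussion.
    resolutions = {
        "2880x1800 (Retina MBP)": 2880 * 1800,
        "3840x2160 (UHD)": 3840 * 2160,
        "4096x2160 (DCI 4K)": 4096 * 2160,
    }
    uhd = resolutions["3840x2160 (UHD)"]
    for name, px in resolutions.items():
        print(f"{name}: {px / uhd:.1%} of UHD")
    # 2880x1800 -> 62.5% of UHD, i.e. 37.5% fewer pixels
    ```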

  9. #49
    Quote Originally Posted by Cyanotical
    The problem is that what you view as acceptable and what I view as acceptable are not the same. You can argue this until you are blue in the face, but the only point where a screen is statistically good enough that the resolution no longer needs to increase is when it has a PPI of 267.3 or greater. Resolution alone is pointless, and talking about what X resolution looks like at Y distance just floods the screen with irrelevant information and skirts around the issue.

    For a screen to truly reach the limit, it needs to maintain a pixel size at or below the smallest size the eye can detect. If you wanted to increase it just for good measure, use 534 PPI or even 1068 PPI (which would probably be best).

    Something to consider: sites often post stuff that isn't really true, often because they have a writer who just wants to get paid. It's very rampant in any area that has some subjectivity to it; audio and video review sites are notorious for this, because it's not exactly a hard science that can be disproved with testing.

    So the best you can do for a screen is look at the smallest object an eye can see, which is about 100 microns, and make a screen with pixels that size or smaller. And remember that when you make a bigger screen, you need to increase the number of pixels; otherwise you have to increase the size of the pixels.

    So, in relation to the CNET article, 4K is not the end. It may be good enough for a 32" computer screen, but if you buy an 84" TV, I'm still gonna come over and tell you how the picture looks like crap compared to my monitor.
    Fair enough. I always find it interesting to hear the other side of these discussions. Some very interesting and valid points.
