Thread: 1080 - 2k or 4k

  1. #81
    Quote Originally Posted by Temp name View Post
    I... Wat.
    wat.

    A 2080ti is perfectly fine for maxed out 1080p 144hz. As long as you don't want RTX on, at least. If you're using WoW as a benchmark, OF FUCKING COURSE you won't get to 144hz, because WoW is primarily single-threaded. If you have a 4c/4t CPU, and you stress only 1 of those cores, you'll get 25% usage. Most other games also favour one thread, but can divide the load more evenly.

    Saying that GPUs suck because your CPU is only at 30-50% load is laughable.



    Yeah, but it's a CPU bottleneck, not a GPU one. WoW runs super hard on one core and just barely touches the rest. Read above.
    You can probably run WoW at 4K with a 970 just fine. I did with a 980 Ti, getting 90+ FPS everywhere except raids.
    Dude, don't bloody lecture me on the basics. Buzz off. Watch YouTube benchmarks done by the pros and come back to me. I know WoW, like most games, prefers single-threaded performance, but my point still stands: a 2080 Ti is not even enough for max details at 1080p, especially if you actually use RTX, which WoW will apparently support in SL... Yes, you can turn down settings and play WoW on Intel HD, but what's the point? If I buy a new PC in 2020 I don't want to play the game like I do now... low/medium settings, often seeing 20-40 FPS in heated combat. Also, WoW recently did get multi-core support, but not even that did too much, so value a good GPU over anything else... Like literally a 3600 and a 2080 Ti would be a fine match from my POV... CPUs have gotten so damn powerful.

    Edit: Because I know you "pros" on these forums think I'm too ranty, here you go... I bothered to Google this pic, which I saw floating around before: https://imgur.com/a/UtTRS4d
    Last edited by Djuntas; 2020-06-06 at 07:55 PM.
    Youtube channel: https://www.youtube.com/c/djuntas ARPG - RTS - MMO

  2. #82
    Temp name
    Quote Originally Posted by Djuntas View Post
    Dude, don't bloody lecture me on the basics. Buzz off. Watch YouTube benchmarks done by the pros and come back to me. I know WoW, like most games, prefers single-threaded performance, but my point still stands: a 2080 Ti is not even enough for max details at 1080p, especially if you actually use RTX, which WoW will apparently support in SL... Yes, you can turn down settings and play WoW on Intel HD, but what's the point? If I buy a new PC in 2020 I don't want to play the game like I do now... low/medium settings, often seeing 20-40 FPS in heated combat. Also, WoW recently did get multi-core support, but not even that did too much, so value a good GPU over anything else... Like literally a 3600 and a 2080 Ti would be a fine match from my POV... CPUs have gotten so damn powerful.

    Edit: Because I know you "pros" on these forums think I'm too ranty, here you go... I bothered to Google this pic, which I saw floating around before: https://imgur.com/a/UtTRS4d
    Yeah, a 3600 and a 2080 Ti is a fine pairing; hell, you could probably get away with a 3300X and be fine. But that's not GPUs being lacking, that's games not leveraging CPUs properly. As for a 2080 Ti not being enough for 1080p? Dude...
    https://www.gamersnexus.net/guides/3...er-limitations
    https://www.pcgamer.com/nvidia-gefor...dition-review/
    https://www.guru3d.com/articles_page...review,13.html

    It's perfectly fine as long as you don't want RTX on. Ray tracing is just too hard to do in real time.


    And you get about 50% of the performance on 1 core that you do on 4, and 4 gives the same performance as 8. Sounds a lot like it's heavily single-threaded and only lightly multi-threaded. Does it scale with more cores? Yeah, it scales okay up to 4, but then zero scaling from 4 to 8. Thank you for proving my point.
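
    For what it's worth, that scaling pattern (one core giving roughly half the performance of four, and little to nothing past four) is roughly what Amdahl's law predicts for a mostly-serial workload. A rough Python sketch; the parallel fraction is an illustrative guess, not a measured WoW number:

    Code:
    # Rough Amdahl's-law sketch. The parallel fraction is an assumption,
    # chosen so that 4 cores come out near 2x the single-core result.
    def speedup(cores, parallel_fraction=0.67):
        serial = 1.0 - parallel_fraction
        return 1.0 / (serial + parallel_fraction / cores)

    for cores in (1, 2, 4, 8):
        print(f"{cores} cores: {speedup(cores):.2f}x")
    # 1 cores: 1.00x
    # 2 cores: 1.50x
    # 4 cores: 2.01x
    # 8 cores: 2.42x  -> diminishing returns; the real game flattens even harder past 4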

  3. #83
    Quote Originally Posted by Kagthul View Post
    Agreed, at the distance you're supposed to be sitting, on anything 32" and smaller you can't see individual pixels in 1440p or 4K, so the visual upgrade isn't nearly as noticeable.



    And? Unless you're trying to say that they were somehow shilling for nVidia, which is plainly absurd since they go over every detail of how they are performing the tests.



    Then you aren't talking about the same video I am, since in the one I'm talking about, they clearly show which computer is which while each person is using it (the USERS didn't know).
    Sorry, but anyone who has spent any time in front of a PC knows whether it's a potato or not. And the one in the middle didn't just have quite a bad graphics card; as far as I understand it, they also capped the frame rate on the monitor. That is much worse than a graphics card that only runs at low FPS, because both of them introduce stutter.

    So I don't think this is even in the ballpark of being realistic, because no one in their right mind caps the frame rate on the monitor as well.

    But you are right: I would rather have a higher frame rate than a higher resolution. But anything beyond 60 isn't really noticeable. I think people are reading the FPS meter more than what's actually on screen.

    I would also like to see one test: three PCs, absolutely identical, with the FPS meter turned on. Except that two of the three meters are fake: one shows a higher FPS, one a lower, and one the real FPS. Then we see whether people notice a difference. And I think they will. Not all of them, but I think the majority.

    Because really, 144 vs 60: I'd rather have more detail and lower FPS.

  4. #84
    Quote Originally Posted by Velerios View Post
    I would also like to see one test: three PCs, absolutely identical, with the FPS meter turned on. Except that two of the three meters are fake: one shows a higher FPS, one a lower, and one the real FPS. Then we see whether people notice a difference. And I think they will. Not all of them, but I think the majority.

    Because really, 144 vs 60: I'd rather have more detail and lower FPS.
    Sponsored reviews will just use some combination of VSync / G-Sync / FreeSync, specific display modes, and game settings with an FPS too high for the display, so that visible tearing forces a difference, because that's the only thing most of us are able to see. I think Linus Tech Tips did a hilarious shitshow comparison with visible tearing; at least they tag themselves as a comedy/satire channel.

    What does a higher refresh rate do?
    A higher refresh rate helps by giving our brains more information to act on, in turn reducing perceived blur. However, unlike computer hardware, our brains aren't all made to the same specification. Some people notice the difference between a 60Hz and 120Hz display immediately, while others can't see what everyone is so worked up about. The difference between 120Hz and 240Hz is even more subtle.
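
    A quick bit of arithmetic on why each doubling of refresh rate is harder to notice than the last (nothing monitor-specific here, just frame times):

    Code:
    # Time per refresh in milliseconds; the absolute saving shrinks with each doubling,
    # which is why 120Hz -> 240Hz is subtler than 60Hz -> 120Hz.
    for hz in (60, 120, 240):
        print(f"{hz:>3} Hz -> {1000 / hz:5.2f} ms per frame")
    # 60 Hz  -> 16.67 ms
    # 120 Hz ->  8.33 ms (8.3 ms faster than 60 Hz)
    # 240 Hz ->  4.17 ms (only 4.2 ms faster than 120 Hz)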

    What do you actually see?
    Because refresh rates and frame rates are very different things, they can often mismatch. That’s when something called screen tearing can occur. It tends to happen when a computer’s video card is spitting out frames at a rate well beyond the refresh rate of the monitor connected to it. Because more frames are being rendered than the monitor can handle, half-frames are sometimes shown together on the screen, manifesting as an obvious split between two portions of it, neither of which appears to line up correctly with the other. It’s a distracting problem that even the least sensitive viewer will usually notice.

    The truth is that you need FPS capping with a 240Hz screen just as much as with a 60Hz screen, or you will experience screen tearing. At some point the marketing bubble will burst.
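
    A minimal sketch of the mismatch being described, assuming a fixed-refresh display with VSync off and made-up (but plausible) numbers:

    Code:
    # Toy model: with VSync off, a frame that finishes mid-scanout swaps the buffer
    # partway through a refresh, so the panel ends up showing parts of two frames.
    refresh_hz = 240
    uncapped_fps = 500                # GPU pushing frames well past the refresh rate

    scanout_ms = 1000 / refresh_hz    # ~4.17 ms to scan out one refresh
    frame_ms = 1000 / uncapped_fps    # ~2.0 ms between rendered frames

    print(f"uncapped: ~{scanout_ms / frame_ms:.1f} new frames per refresh -> tearing")

    # Capping at (or just under) the refresh rate keeps it to roughly one frame per refresh.
    capped_fps = refresh_hz
    print(f"capped:   ~{scanout_ms / (1000 / capped_fps):.1f} new frame per refresh")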
    Last edited by Ange; 2020-06-07 at 09:48 AM.

  5. #85
    Quote Originally Posted by Velerios View Post
    Sorry to ask, but who the hell uses a 75" screen for a PC?
    I do, and I suggest everyone should, but no one believes me.

    Human eyesight evolved to see things at a 2m distance, not a 0.5m distance. If you look at things at a 0.5m distance all day, you're going to develop problems with eyesight. I had such problems and I solved them by mounting a big f-in TV about 2m away. As a result I don't need glasses at age 40, while other people do.

    But people don't believe me. They put on their glasses and look at their screens 0.5m away.
    Last edited by Elodeon; 2020-06-16 at 01:34 PM.

  6. #86
    Temp name
    Quote Originally Posted by Elodeon View Post
    I do, and I suggest everyone should, but no one believes me.

    Human eyesight evolved to see things at a 2m distance, not a 0.5m distance. If you look at things at a 0.5m distance all day, you're going to develop problems with eyesight. I had such problems and I solved them by mounting a big f-in TV about 2m away. As a result I don't need glasses at age 40, while other people do.

    But people don't believe me. They put on their glasses and look at their screens 0.5m away.
    If you use a 75" at 2m, you're going to have to move your head a lot. I have a ~50" at 3 meters as my console display, and honestly it's a bit too big.

    Also, do you have any proof that looking at screens half a meter away degrades vision? Everything I've read is inconclusive. My parents both use glasses, and they only watch TV from ~4 meters away, or laptops at ~2 meters. I don't use glasses and I use a monitor at roughly arm's length.
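
    If anyone wants to put numbers on the head-movement point, the horizontal angle a flat screen fills is simple trigonometry. The 16:9 aspect ratio and the 0.6 m desktop distance below are assumptions for comparison, not measurements from anyone's setup:

    Code:
    import math

    def horizontal_fov_deg(diagonal_in, distance_m, aspect=(16, 9)):
        """Horizontal viewing angle in degrees for a flat screen viewed head-on."""
        w, h = aspect
        width_m = diagonal_in * 0.0254 * w / math.hypot(w, h)
        return math.degrees(2 * math.atan(width_m / 2 / distance_m))

    print(f'75" at 2.0 m: {horizontal_fov_deg(75, 2.0):.0f} degrees wide')
    print(f'27" at 0.6 m: {horizontal_fov_deg(27, 0.6):.0f} degrees wide')
    # ~45 vs ~53 degrees; how much of that counts as "a lot" of head turning is up to the viewer.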

  7. #87
    Quote Originally Posted by Elodeon View Post
    But people don't believe me.
    Since I have two relatives (a cousin and his wife) who are actual ophthalmologists - you know, medical doctors - who don't agree with you, I'll continue to believe the medical professionals who have trained specifically to deal with eyeballs, and not some random dude on the interwebs.

    However, half a meter (about 1.5 feet) seems a little close. I sit about... 25-28" from my monitors (depending on how far I'm leaning back in my chair), and that's roughly the depth of most desks (about 24-30"), and people don't tend to set the monitor in the middle of the desk.

    If I moved in to 1.5 ft, I'd be able to see pixels pretty easily even on a 25" 1440p monitor.
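
    To put rough numbers on that: a pixel starts being individually resolvable somewhere around one arcminute for typical vision, so you can compare the angular size of a pixel at each distance (25" 1440p panel as in the post; the 1-arcminute threshold is a rule of thumb, not gospel):

    Code:
    import math

    def pixel_arcminutes(diag_in, horiz_px, vert_px, distance_in):
        """Angular size of a single pixel, in arcminutes, for a flat panel viewed head-on."""
        width_in = diag_in * horiz_px / math.hypot(horiz_px, vert_px)
        pixel_in = width_in / horiz_px
        return math.degrees(math.atan(pixel_in / distance_in)) * 60

    for dist in (18, 27):                       # ~1.5 ft vs. a more typical ~27" sitting distance
        print(f'{dist}" away: {pixel_arcminutes(25, 2560, 1440, dist):.2f} arcmin per pixel')
    # 18" away: 1.63 arcmin per pixel  -> pixels fairly easy to pick out
    # 27" away: 1.08 arcmin per pixel  -> right around the visibility threshold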

    - - - Updated - - -

    Quote Originally Posted by Velerios View Post
    Sorry, but anyone who has spent any time in front of a PC knows whether it's a potato or not. And the one in the middle didn't just have quite a bad graphics card; as far as I understand it, they also capped the frame rate on the monitor. That is much worse than a graphics card that only runs at low FPS, because both of them introduce stutter.

    So I don't think this is even in the ballpark of being realistic, because no one in their right mind caps the frame rate on the monitor as well.

    But you are right: I would rather have a higher frame rate than a higher resolution. But anything beyond 60 isn't really noticeable. I think people are reading the FPS meter more than what's actually on screen.

    I would also like to see one test: three PCs, absolutely identical, with the FPS meter turned on. Except that two of the three meters are fake: one shows a higher FPS, one a lower, and one the real FPS. Then we see whether people notice a difference. And I think they will. Not all of them, but I think the majority.

    Because really, 144 vs 60: I'd rather have more detail and lower FPS.
    So your entire argument is "I disagree with professionals and scientific testing that has been done ad nauseam because it doesn't conform to my uneducated opinion".

    Got it.

  8. #88
    Quote Originally Posted by draugrbane View Post
    I've recently updated my rig and bought a Ryzen 3900X paired with a 2070 Super. I have a 144Hz 1080p curved monitor and I was wondering if it's worth it to buy a 2K or 4K monitor to play almost only World of Warcraft. What do you think, guys?
    With a 2070 Super you can go 1440p without breaking a sweat. Do that, imo.

    Quote Originally Posted by Elodeon View Post
    I do, and I suggest everyone should, but no one believes me.

    Human eyesight evolved to see things at a 2m distance, not a 0.5m distance. If you look at things at a 0.5m distance all day, you're going to develop problems with eyesight. I had such problems and I solved them by mounting a big f-in TV about 2m away. As a result I don't need glasses at age 40, while other people do.

    But people don't believe me. They put on their glasses and look at their screens 0.5m away.
    Please, don't play doctor online. It's (a) unethical and (b) potentially dangerous. Heck, even real physicians generally don't give medical advice online.
    success comes in the form of technical solutions to problems, not appeals to our emotional side

  9. #89
    I play QHD (Quad HD), a.k.a. 1440p, at 144Hz with a 1070 Ti.

    However, of course I don't get 144 FPS all the time ^^ But usually between 100 and 140.

  10. #90
    Temp name
    Quote Originally Posted by Bennett View Post
    You can tell the difference - after you have played 4K - but I'd say if you have never played 4K, you're not missing enough to justify the massive price increase.
    I have played at 4K; I went down to 1440p and didn't notice the difference in fidelity.

    It all depends on how close you're sitting to your monitor and how big it is.

  11. #91
    Quote Originally Posted by Elodeon View Post
    I do, and I suggest everyone should, but no one believes me.

    Human eyesight evolved to see things at a 2m distance, not a 0.5m distance. If you look at things at a 0.5m distance all day, you're going to develop problems with eyesight. I had such problems and I solved them by mounting a big f-in TV about 2m away. As a result I don't need glasses at age 40, while other people do.

    But people don't believe me. They put on their glasses and look at their screens 0.5m away.
    6 feet for a 75" screen?

    Yeah you're going to need glasses lol

  12. #92
    Temp name
    Quote Originally Posted by Bennett View Post
    Aye, a lot of people say that - I'm currently using 4K and planning on getting a 1440p instead, because frankly, while I could maybe find the money for a 4K system, I don't think the extra cost is justified at current prices (just my opinion) - however, going back to 1080p is very noticeable, and I'm sure it's a matter of adjustment, but I kind of wish I'd never played at 4K to begin with.
    Oh yeah, I'd not go back to 1080p unless I got a small screen, like 21" or smaller. 1440p for me is perfect. Enough pixel density that I don't see the individual pixels, but coarse enough that it's not super demanding to run.
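
    For anyone who wants a number on "enough pixel density": PPI is just the pixel diagonal over the panel diagonal. The 27" size for the 1440p/4K rows below is assumed for illustration:

    Code:
    import math

    def ppi(horiz_px, vert_px, diag_in):
        return math.hypot(horiz_px, vert_px) / diag_in

    print(f'21" 1080p: {ppi(1920, 1080, 21):.0f} PPI')
    print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')
    print(f'27" 4K:    {ppi(3840, 2160, 27):.0f} PPI')
    # 1080p at 21" lands around the same ~105-109 PPI as 1440p at 27",
    # which is why 1080p only really holds up on smaller panels.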
