Thread: 1080 - 2k or 4k

  1. #41
    Quote Originally Posted by Kagthul View Post
    That would make it equal to 1080p
    No, 1080p is 1/4 the size of 4k. 1440p is about half of 4K.

  2. #42
    Temp name
    Quote Originally Posted by Jonnusthegreat View Post
    No, 1080p is 1/4 the size of 4k. 1440p is about half of 4K.
    Depends on how it's implemented. Some games take resolution scaling to mean 50% on each axis, so you'd end up with only 25% of the total pixel count, while other games take 50% to mean 50% of the total.
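    To put numbers on the two interpretations, here's a quick Python sketch of my own (purely illustrative; it just uses standard 3840x2160 UHD as the native resolution):

    ```python
    # Two ways a game might interpret a "50% resolution scale" setting,
    # using 4K UHD (3840x2160) as the native resolution.

    NATIVE_W, NATIVE_H = 3840, 2160

    def scale_per_axis(scale):
        """Scale applied to each axis: 50% leaves only 25% of the pixels."""
        return round(NATIVE_W * scale), round(NATIVE_H * scale)

    def scale_total_pixels(scale):
        """Scale applied to the total pixel count: each axis shrinks by sqrt(scale)."""
        factor = scale ** 0.5
        return round(NATIVE_W * factor), round(NATIVE_H * factor)

    print(scale_per_axis(0.5))      # (1920, 1080) -> 2,073,600 px, 25% of native
    print(scale_total_pixels(0.5))  # (2715, 1527) -> ~4.15M px, ~50% of native
    ```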

  3. #43
    Quote Originally Posted by Temp name View Post
    Depends on how it's implemented. Some games take resolution scaling to mean 50% on each axis, so you'd end up with only 25% of the total pixel count, while other games take 50% to mean 50% of the total.
    You are right, although most games do indeed apply the percentage to each axis. After all, we talk about resolutions as width x height, not as a "total area of pixels" as in photography. That usually means that "4k" at 50% = 3840x2160 halved on each axis = 1920x1080.

    All this confusion really comes down to the marketing of "4k". As many people have said, "xxx"p usually refers to the height axis (1080p = 1920x1080), while 4k takes its name from the width axis (3840x2160). That's where loads of folks get lost, because in marketing "quadruple the resolution!" is flashier than "double the resolution". And yes, I know they're going by the total pixel count of the screen and not a single axis - it's still misleading, as we've ALWAYS talked about width x height resolutions in gaming.

    If we took the same logic and applied it to 1920x1080, it would indeed be marketed as "2k", since it'd take the 1920 axis and upsell it a little to that 2k term. Just as 3840 is not 4,000 :P

  4. #44
    Temp name
    Quote Originally Posted by illander View Post
    You are right, although most games do indeed apply the percentage to each axis. After all, we talk about resolutions as width x height, not as a "total area of pixels" as in photography. That usually means that "4k" at 50% = 3840x2160 halved on each axis = 1920x1080.

    All this confusion really comes down to the marketing of "4k". As many people have said, "xxx"p usually refers to the height axis (1080p = 1920x1080), while 4k takes its name from the width axis (3840x2160). That's where loads of folks get lost, because in marketing "quadruple the resolution!" is flashier than "double the resolution". And yes, I know they're going by the total pixel count of the screen and not a single axis - it's still misleading, as we've ALWAYS talked about width x height resolutions in gaming.

    If we took the same logic and applied it to 1920x1080, it would indeed be marketed as "2k", since it'd take the 1920 axis and upsell it a little to that 2k term. Just as 3840 is not 4,000 :P
    Yeah, indeed. I'm honestly not sure why they were able to market 3840x2160 as 4k, but the name was attached from the start, so it caught on instead of UHD or 2160p. One of the problems with "2k" is that the term showed up so late in the product cycle that people now use it for 1440p, which is newer, but it just makes no sense to call 2560x1440 "2k". It should be called 2.5k, 2.6k, or 3k.
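    To make that concrete, here's a rough sketch (my own illustration, nothing official) of where the "k" names land if you derive them from the horizontal pixel count:

    ```python
    # Deriving a rough "k" label from the horizontal resolution.
    resolutions = {
        "1080p / FHD":    (1920, 1080),
        "1440p / QHD":    (2560, 1440),
        "2160p / UHD":    (3840, 2160),
        "4320p / 8K UHD": (7680, 4320),
    }

    for name, (w, h) in resolutions.items():
        print(f"{name}: {w}x{h} -> about {w / 1000:.1f}k horizontally")
    # 1920 -> ~1.9k ("2k"), 2560 -> ~2.6k (so 2.5k/2.6k fits better than "2k"),
    # 3840 -> ~3.8k (marketed as "4k"), 7680 -> ~7.7k (marketed as "8k")
    ```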

    Also, for some reason pretty much everyone skipped 3200x1800, which is just kinda sad I think.

  5. #45
    Next gen of GFX cards should make 4k gaming the standard and start pushing into 8k. I'd wait for that.

  6. #46
    Remember them glorious 1366x768 days...?

  7. #47
    Quote Originally Posted by Dald View Post
    Next gen of GFX cards should make 4k gaming the standard and start pushing into 8k. I'd wait for that.
    Uhh.. no. That's not how this works at all.

    For 4k to be "standard", more than 50% of users would have to have rigs capable of 4k / high settings / a reasonable refresh rate.

    Right now, the number of people that can do that is like.. 5%.

    The average GPU out there is a GTX 1060 or RX 570 or so.

    Something like ~80% of people (or more, I confess to not having actually perused the entirety of the Hardware Survey of late) still rock 1080p screens.

    4k is still ~5+ years away from being remotely "mainstream". For it to be mainstream, it's going to have to be something a sub-$200 GPU that can be added to OEM computers can do reliably.

    We're not close to that yet.

    RTX 3000 will likely have a "3070" that can handle 4k/60 at a mix of high and some medium settings. The 3080 will probably be able to handle 4k/60 at all high settings. The 3080Ti MIGHT be able to do 4k at high refresh (90+ fps, but I wouldn't expect over ~110) if they leave it close to the full-fat A100 die.

    For reference, the number of people who buy cards in the "XX70"+ range is like... sub 15%. For every 3080Ti that goes out the door, there will be 50+ 3060s and even more 3050s.
    Last edited by Kagthul; 2020-06-04 at 08:24 AM.

  8. #48
    I support the 1440p@144Hz route. The increase in resolution from 1080p is really noticeable, much more so than going from 1440p to 4K, but it's not as taxing on your GPU, so you can go well past the 60fps mark, and playing at 100+ is just awesome. You can easily turn off any active AA and still enjoy a crisp image.

    I have a 1080 Ti (which likely has issues; sometimes even in WoW I'm "capped" at 30fps, and if I turn off the PSU the PC sometimes won't boot, with the VGA LED lit on the mobo), and this is the best way to use it fully. I'm likely going to replace it with a 3000-something Super when they come out. EDIT: probably going to buy an FE card since they tend to cost a little less, and I plan to do a custom loop, so there's no need to spend more on the better cooling that 3rd-party cards provide.
    Last edited by Coldkil; 2020-06-04 at 09:58 AM.
    Don't trust me if your heart fails you.

  9. #49
    Temp name
    Quote Originally Posted by Dald View Post
    Next gen of GFX cards should make 4k gaming the standard and start pushing into 8k. I'd wait for that.
    Let's just magically say they make 4k the standard (they won't; see Kagthul's reply). 8k is 4x as hard to drive as 4k. That means that if you can do 4k at 60fps on all-high settings, you can do 8k at 15fps on the same settings. And that's only if you have an 8k-capable display, which practically no one has.
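    The arithmetic behind that, assuming frame time scales roughly linearly with pixel count (a big simplification, since real workloads aren't purely pixel-bound):

    ```python
    # Back-of-the-envelope: if frame time scaled linearly with pixel count,
    # 4x the pixels means 1/4 the frame rate.

    def pixel_count(w, h):
        return w * h

    fps_at_4k = 60
    ratio = pixel_count(7680, 4320) / pixel_count(3840, 2160)  # 8K has 4x the pixels
    print(fps_at_4k / ratio)  # 15.0 -> roughly 15fps at 8K, all else being equal
    ```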

    And as for "start pushing into 8k": I mean, you can just slap a 750 Ti into a gaming box, connect it to an 8k TV, and there you go, you're now playing at 8k on a card that's 4 generations old.

    - - - Updated - - -

    Quote Originally Posted by Coldkil View Post
    EDIT: probably going to buy an FE card since they tend to cost a little less, and I plan to do a custom loop, so there's no need to spend more on the better cooling that 3rd-party cards provide.
    Just worth pointing out that some 3rd-party cards have different PCBs, which make them better (or worse) overclockers, so you don't always buy a 3rd-party card for the better cooling.

  10. #50
    Quote Originally Posted by Temp name View Post
    Just worth pointing out that some 3rd-party cards have different PCBs, which make them better (or worse) overclockers, so you don't always buy a 3rd-party card for the better cooling.
    Yeah, didn't consider that. Anyway, it's still going to be a wait until my GPU fries or I find out it's something else going bonkers.
    Don't trust me if your heart fails you.

  11. #51
    Shakadam
    Quote Originally Posted by Temp name View Post
    Let's just magically say they make 4k the standard (they won't; see Kagthul's reply). 8k is 4x as hard to drive as 4k. That means that if you can do 4k at 60fps on all-high settings, you can do 8k at 15fps on the same settings. And that's only if you have an 8k-capable display, which practically no one has.
    Not to mention the ungodly amount of VRAM needed to play at 8k. 4k already basically requires 8GB minimum.
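    Only the per-pixel buffers actually scale with resolution (textures and geometry, which usually eat most of that 8GB, are a separate budget), but even those add up quickly. A rough sketch, where the per-pixel cost and buffer count are purely assumed for illustration, not taken from any real engine:

    ```python
    # How the resolution-dependent slice of VRAM (frame buffers, G-buffers,
    # post-processing targets) grows with pixel count. The constants below are
    # assumptions for illustration only.

    BYTES_PER_PIXEL_PER_TARGET = 8   # e.g. a 16-bit-per-channel RGBA target
    NUM_RENDER_TARGETS = 6           # assumed: G-buffer layers + depth + post buffers

    def render_targets_mb(w, h):
        return w * h * BYTES_PER_PIXEL_PER_TARGET * NUM_RENDER_TARGETS / 1024**2

    print(f"4K: {render_targets_mb(3840, 2160):.0f} MB")  # ~380 MB
    print(f"8K: {render_targets_mb(7680, 4320):.0f} MB")  # ~1519 MB, 4x as much
    ```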

  12. #52
    Hmm, this thread has drifted really far from what the OP asked for.

    High refresh rates have stronger diminishing returns in WoW; a big, high-resolution display works so much better for this game.

    What would you notice more when a new expansion drops: that big-screen immersion and detail, or high frame rates? WoW has wonderfully designed zones, and you take it all in more when new zones and content drop.

    And when you're not doing stuff, you're AFK, and high refresh rates do nothing when you're AFK or at the auction house.

  13. #53
    I personally can absolutely tell a difference between 4k and 1440p. The overall picture fidelity is roughly the same at a glance, but reducing the resolution down to 1440p causes some things in the shading to look very strange on my screen. There is a massive difference between 4k and 1080p, though.

    That said, you'll be hard-pressed to push modern games at 4k with a 2070 Super. Hell, my 2080 Super is pushing it for some games (most recently it's struggled with AC Odyssey, but that's most likely because Ubisoft games are very poorly optimized). And the overall gaming experience will be superior at 1440p at high refresh rates. There is absolutely no hardware configuration that can consistently run games at 4k at framerates that take advantage of 120+Hz refresh rates, so don't waste your money on one of those super high-end displays.

    And since you're primarily concerned with World of Warcraft, I'd honestly say 1440p is more than enough. It's not a high-fidelity game, so it doesn't get TOO much out of high resolutions. Yeah, your picture will be more crisp, but it will still look like a 16-year-old game.

  14. #54
    Not worth it for me. I'm OK playing at 1080p and probably will be for as long as the resolution is supported. No need to buy more expensive hardware to push more pixels, especially not at the cost of framerate. Higher FPS over resolution, any day.

  15. #55
    My advice? Pick a good 4k monitor that can also run at 120/144Hz in 1080p. Why? There's a good chance you'll need to lower the resolution in some poorly optimized game, and you can do that without suffering the weird crops/black bars etc. that 1440p would give you. That way you get the best of both worlds: a 4k monitor for beautiful visuals and for games that run well at 4k, and a good 1080p experience for unoptimized games or e-sports games where a higher refresh rate is preferable.

  16. #56
    Quote Originally Posted by azkhane View Post
    There's a good chance you'll need to lower the resolution in some poorly optimized game, and you can do that without suffering the weird crops/black bars etc. that 1440p would give you.
    ... wot m8?

    1440p is a 16:9 resolution, just like 720p, 1080p, and 2160p.

    There will be zero crops or black lines.

    - - - Updated - - -

    Quote Originally Posted by Neuroticaine View Post
    I personally can absolutely tell a difference between 4k and 1440p. The overall picture fidelity is roughly the same at a glance, but reducing the resolution down to 1440p causes some things in the shading to look very strange on my screen. There is a massive difference between 4k and 1080p, though.
    Because you aren't actually running at 1440p natively in that case. You're running a 4k monitor at a non-native resolution, so the image has to be scaled to fit the panel's pixel grid.
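    The underlying reason, for what it's worth: 1080p divides evenly into a 4K panel's pixel grid, while 1440p doesn't, so the scaler has to smear 1440p pixels across fractional panel pixels. A quick sketch (whether a given monitor actually does clean integer scaling for 1080p is another matter):

    ```python
    # Scale factors when displaying a lower resolution on a 3840x2160 panel.
    PANEL_W, PANEL_H = 3840, 2160

    for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440)}.items():
        sx, sy = PANEL_W / w, PANEL_H / h
        print(f"{name}: {sx}x per axis, integer scaling possible: {sx.is_integer()}")
    # 1080p: 2.0x -> each source pixel could map to a clean 2x2 block of panel pixels
    # 1440p: 1.5x -> source pixels land across partial panel pixels, so it looks soft
    ```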

  17. #57
    Quote Originally Posted by Doctor Amadeus View Post
    Good question. I'm wondering if 4K is worth it. Can you really tell the difference?
    For a bigger screen, yes. Though at that point it might as well be a large TV and not a monitor.

    I wouldn't go with anything bigger than 32" for PCs.

  18. #58
    msdos
    I personally would wait for next-gen cards if I were stepping into 4k territory, because that territory is about to be annexed by RTX and DLSS, and once that happens you'll need more hardware anyway.

    I think 144Hz is the meme. I used to have a 144Hz monitor, but it was a TN panel, and when you went up in Hz you could see a drop in picture quality. The 144Hz game is also a money sink if you play into it. Games already look crisp and beautiful at a constant 60fps, the key word there being constant. A constant 144Hz is just like having a hair dryer blowing on your parts all the time, ramping up the heat with 144fps game menus and loading screens, and for what, more dynamic headshots?? Could you not 360 no-scope at 60fps? I'm confused about the actual applications of 144Hz, because back in my CS days we really only wanted a constant 100fps. Graphics cards used to explode back in those days, too.

  19. #59
    I don't even know if gaming at 8k would be that pleasant, at least for me, and at the very least on a PC. To make the most of that resolution, I would need a pretty large monitor. Idk, I own a 2560x1080 monitor (weird resolution, I know) that is 27 inches wide, and that's perfect for me. Once a monitor gets past like 35 inches wide, it just becomes annoying for me.

  20. #60
    4K spoils you so that you no longer want to work on less than 4K. At least in my case - I use a 75" screen, so the difference is very visible.

    4K has four times as many pixels as 2K (1080p) and requires correspondingly more graphics power. The same graphics card that can deliver a high frame rate at 2K may struggle at 4K, so you may need a stronger GPU to hit the same frame rate.

    If you can afford it, go for it.
