  1. #81
    Deleted
    Quote Originally Posted by Drunkenvalley View Post
    Exactly how will 3x 1080p monitors be harder to drive than 4x 1080p monitors, which is what 4k is...? And nobody is going to bother with your point if you're not going to tell us, you know.
    4K = one monitor at 3840×2160, not 4x...
    Surround with 3x 1080p = three monitors at 1920×1080 (with bezel compensation), presented to (and treated by) the system as one monitor at 5860×1080.

    When it comes to usage and bandwidth:
    The 4K Monitor: http://web.forret.com/tools/video_fp...gb444&depth=12
    It has an area of 8,294,400 pixels that must be updated 60 times per second.

    The Surround setup: http://web.forret.com/tools/video_fp...gb444&depth=12
    It has an area of 6,328,800 pixels that must be updated 144 times per second.

    This was my setup (with 144 Hz screens; with 60 Hz screens it's obviously a different story), but some people seem to completely ignore this fact and therefore jump to the conclusion that it is impossible.

    No, sustaining a much higher refresh rate is much harder.
    If the system can sustain that 'ceiling' without problems on current hardware, reducing the update frequency to 60 obviously isn't going to be harder; everything that was above 60 Hz simply becomes headroom. Locking the FPS to 60 is much more realistic than locking and holding it at 144.

    It is true that a single 4K screen has a larger area, but since its refresh rate is low in comparison, it is less demanding to render that area at the set frequency and hold 60 frames per second. (The math has been posted several times, and again in this post: width times height times updates gives how many pixels the card has to render in one second, and 144 Hz at the lower area is way above 4K at 60 Hz for obvious reasons.)
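    The width-times-height-times-updates math above can be sketched as a quick calculation (resolutions and refresh rates are the ones from this post; the 5860 width includes bezel compensation):

```python
# Pixel throughput (pixels rendered per second) for each setup,
# using the figures quoted in the post above.
def pixels_per_second(width: int, height: int, hz: int) -> int:
    return width * height * hz

uhd_4k = pixels_per_second(3840, 2160, 60)     # single 4K panel at 60 Hz
surround = pixels_per_second(5860, 1080, 144)  # 3x 1080p surround at 144 Hz

print(f"4K @ 60 Hz:        {uhd_4k:,} px/s")
print(f"Surround @ 144 Hz: {surround:,} px/s")
print(f"Surround is {surround / uhd_4k:.2f}x the 4K load")
```

    At these settings the 144 Hz surround setup pushes nearly twice the pixels per second of the 4K panel, which is the whole argument.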

    And we haven't even discussed the details of the field of view. This isn't simply 1920×1080 stretched out to 5860×1080. The field of view (which is locked by the engine in WoW) lets you see much more than one screen's worth (with some fisheye effect). Standing at the Lumber Mill in AB, you can see both the Alliance and Horde starting areas on the outer monitors and everything in between (zoomed in to first person), compared to a single screen where you just see the Blacksmith.

    In games where you can adjust the FoV properly to your sitting position and setup (2*atan(0.5*base/viewing distance)), you get the perfect field of view for your setup without the fisheye or tunnel effect. This way you see a lot more in game (and the computer renders a lot more), which is why this setup is banned in e-sports and can be considered cheating.
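    The FoV formula above works out like this (the 70 cm screen width and 60 cm viewing distance are illustrative numbers, not from this thread):

```python
import math

# Ideal horizontal field of view for a given screen width ("base")
# and viewing distance, per the 2*atan(0.5*base/distance) formula.
def ideal_hfov_degrees(base_width: float, viewing_distance: float) -> float:
    return math.degrees(2 * math.atan(0.5 * base_width / viewing_distance))

single = ideal_hfov_degrees(70, 60)        # one ~70 cm wide screen at 60 cm
surround = ideal_hfov_degrees(3 * 70, 60)  # three such screens side by side

print(f"Single screen:   {single:.1f} deg")
print(f"Triple surround: {surround:.1f} deg")
```

    Tripling the base width roughly doubles the ideal FoV at the same viewing distance, which is why a surround player genuinely sees more of the world.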

    The reason no 4K monitor today has an update frequency above 60 Hz is the bandwidth limitation of current interfaces (which is why some will come with Thunderbolt or updated versions of current standards). An example today: the Eizo Foris FG2421 at 240 Hz (120 Hz with Black Screen Insertion (BSI)) can only be driven over dual-link DVI at native resolution, because the other interfaces can't handle it with their limited bandwidth.
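    To illustrate the bandwidth point, here is a rough uncompressed-bitrate calculation (it ignores blanking intervals, and the link capacities in the comments are approximate effective data rates, not figures from this thread):

```python
# Rough uncompressed video bitrate in Gbit/s, ignoring blanking
# intervals, to show why interface bandwidth caps 4K at 60 Hz.
def bitrate_gbps(width: int, height: int, hz: int, bits_per_pixel: int = 24) -> float:
    return width * height * hz * bits_per_pixel / 1e9

# ~5.97 Gbit/s: fits within dual-link DVI (~7.92 Gbit/s effective)
print(f"1080p @ 120 Hz: {bitrate_gbps(1920, 1080, 120):.2f} Gbit/s")
# ~11.94 Gbit/s: beyond dual-link DVI, needs DisplayPort 1.2 (~17.28 Gbit/s)
print(f"4K @ 60 Hz:     {bitrate_gbps(3840, 2160, 60):.2f} Gbit/s")
```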

    A setup capable of running this kind of load will be more than sufficient to run a single 4K monitor at a measly 60 Hz. (I have tried both setups, and they work equally well...)

    The only reason a 4K setup performs worse (it sometimes does) is immature driver support and/or user error, nothing else. I couldn't even get the 4K monitor to function properly on day 1. The same was true for Eyefinity when it first released.

    There was a similar discussion claiming that Eyefinity/Surround could not hold a good frame rate in Battlefield 4 when it was in beta or just released, which was true (and people seemed to think it would always be this way because BF4 was so heavy, right?) until Nvidia released their beta driver (331.65), which pretty much optimized the game so you could constantly hold your vsync FPS even in crazy surround/eyefinity setups.

    Today I play BF4 with everything on Ultra (apart from AA; I use FXAA through the driver), dipping to 70 fps on bad maps but averaging 144 fps on the rest. Six months ago this was apparently "deemed impossible", the same way some now think that 4K is "heavier" than Eyefinity/Surround at higher refresh rates, and that current hardware can't deliver a fluid experience (it can, if you pay for it), even when the math is not on their side.

    TL;DR:
    Today's high-end hardware is capable of running 4K fine, but you will still need to upgrade eventually, especially if you expand your setup (multi-monitor).
    Today's 4K support and optimization is horrible.
    Last edited by mmoc3f6ff16fa0; 2014-07-15 at 01:08 PM.

  2. #82
    Deleted
    Quote Originally Posted by Wazzbo View Post
    When it comes to usage and bandwidth:
    The 4K Monitor: http://web.forret.com/tools/video_fp...gb444&depth=12
    It has an area of 8,294,400 pixels that must be updated 60 times per second.

    The Surround setup: http://web.forret.com/tools/video_fp...gb444&depth=12
    It has an area of 6,328,800 pixels that must be updated 144 times per second.
    Too Long, Did Read:

    You have some good points on the immersive experience, but that is not what we were discussing here.

    You fail to understand that you're comparing apples to oranges here. The point is not that your displays are 144 Hz, which is cool, and I bet it's awesome to game on. The point is that a regular 60 Hz 1080p surround setup is NOT harder to drive than a 60 Hz 4K panel, because it is only 3x 1080p, and not the 4x 1080p that 4K offers.

    I understand you're trying to compare this to your own setup, which is fine, but don't call me a retard for no reason. This board is not to call people retards or to vent your anger. Keep your posts civil and people might try to reason with you.

    (This entire post assumes you are pushing 144 FPS in the games you play; otherwise, your entire bandwidth argument for your monitors is completely void.)

    I hope you can stop derailing this thread now, you've made your claims and others have made theirs. This has been going on for way too long.

    Kind regards.

  3. #83
    Deleted
    Quote Originally Posted by Prixie View Post
    Too Long, Did Read:

    You have some good points on the immersive experience, but that is not what we were discussing here.

    You fail to understand that you're comparing apples to oranges here. The point is not that your displays are 144 Hz, which is cool, and I bet it's awesome to game on. The point is that a regular 60 Hz 1080p surround setup is NOT harder to drive than a 60 Hz 4K panel, because it is only 3x 1080p, and not the 4x 1080p that 4K offers.

    I understand you're trying to compare this to your own setup, which is fine, but don't call me a retard for no reason. This board is not to call people retards or to vent your anger. Keep your posts civil and people might try to reason with you.

    (This entire post assumes you are pushing 144 FPS in the games you play; otherwise, your entire bandwidth argument for your monitors is completely void.)

    I hope you can stop derailing this thread now, you've made your claims and others have made theirs. This has been going on for way too long.

    Kind regards.
    There never was a single 1080p screen vs 4K discussion to begin with.

    It started with people stating that 4K was a heavier load than a surround setup (my surround setup, because I compared the load on my computer from both, as I've tested them), to conclude that IF your setup is capable of running 120 or 144 Hz in surround, it can easily manage a single 4K screen. But people felt like objecting to this and derailing it (to which I responded).

    I want to expand my setup to 5x screens in portrait mode, which will REQUIRE an upgrade, as will newer games with more sophisticated graphics, because I did mention that my FPS drops, even if it's mostly stable. Which was the original topic...

    SO YES, having a 4K-capable computer will still require an upgrade when more games with better graphics come out, or when you want to expand your setup (say, triple 4K monitors).

    If you had followed all the posts you would understand why this was even brought up, and why the 144 Hz point is crucial (which you somehow managed to miss, so it looks like you simply read one post and spewed something).

    Fun fact: our testing rig (3x 2560×1080 @ 60 Hz) has exactly the same load in terms of pixel throughput as a single 4K monitor, so high refresh rates aren't even necessary to match the load of a single 4K if you run a high-end surround setup.
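    That "exact same load" claim is easy to verify:

```python
# Three 2560x1080 panels versus one 3840x2160 panel, both at 60 Hz.
surround_px = 3 * 2560 * 1080  # total pixels across the three panels
uhd_px = 3840 * 2160           # pixels on a single 4K panel

assert surround_px == uhd_px   # both are 8,294,400 pixels
print(surround_px * 60, "pixels per second in both cases")
```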

  4. #84
    Deleted
    Oh, I didn't miss that point. What you did seem to miss is the point Cyanotical brought up about texture resolution and VRAM usage compared to surround.
    But hey, maybe you didn't miss that either.

    Feel free to discuss this in PM, I'm done flooding this thread with a conversation of stubbornness.

  5. #85
    To answer your question with style:

    Ever since the 1950s, computer technology has advanced by becoming almost (if not exactly) twice as powerful every 18 months.
    (Look it up if you don't believe me.)
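    That doubling claim (a loose statement of Moore's law) compounds like this:

```python
# Growth factor implied by "doubling every 18 months" over N years.
def doubling_factor(years: float, doubling_months: float = 18) -> float:
    return 2 ** (years * 12 / doubling_months)

print(f"Over 3 years: ~{doubling_factor(3):.0f}x")
print(f"Over 6 years: ~{doubling_factor(6):.0f}x")
```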

    There is probably technology available now which is better than 5 GHz 8-core CPUs and 4K-ready GPUs. It might not be on the market yet, but I can assure you, with the competition out there, it will come soon.

    In 2011 I bought a quad-core 3 GHz CPU, which was the shit back then. Now I've bought a 4.0 GHz 8-core CPU for like $120 (in Norway that really isn't a lot).
    Just wait until 2017/18 when the next big thing is out on the market. Can't wait to see what that's going to be!
    In 2018 we will probably have 4K on our smartphones... without a doubt.

  6. #86
    Deleted
    Quote Originally Posted by Prixie View Post
    Oh I didn't miss that point. What you did seem to miss is the point Cyanotical brought up about texture resolution and vram usage compared to surround.
    But hey, maybe you didn't miss that either.

    Feel free to discuss this in PM, I'm done flooding this thread with a conversation of stubbornness.
    I wouldn't be what you call "stubborn" if you brought anything of relevance to this topic.

    You obviously haven't read my posts, as I did mention VRAM usage, and if anything it was lower on 4K, probably because less is visible on screen to render.

    VRAM hasn't been an issue since surround and Eyefinity appeared and cards started shipping with 3 GB+... and now that 4K uses less, how would it ever become a problem?

    Not to mention, if you had any idea what VRAM is actually used for and what happens when you run out, you would realize that this isn't a problem to begin with... (Google "memory thrashing", "VRAM thrashing" or "VRAM explained").

    I assume you understand that texture resolution is unaffected by screen resolution; only how many textures are visible at a given time changes.
    Cyanotical brought up texture resolution as if it had anything to do with screen size, which just further proves that he doesn't have a clue what he's talking about, and he couldn't give any further details when asked.

    The only thing that could be relevant is the rendering resolution, which you can't manipulate in WoW but can in BF4. Even then, it would still be more load on surround at a rendering resolution above 100%, because surround has a higher native resolution...

    I did miss the point intentionally, because not only is it irrelevant, it is also wrong.
    Last edited by mmoc3f6ff16fa0; 2014-07-15 at 08:22 PM.

  7. #87
    The Unstoppable Force Ghostpanther's Avatar
    10+ Year Old Account
    Join Date
    Dec 2012
    Location
    USA, Ohio
    Posts
    24,112
    No computer has ever been designed that didn't eventually fail to perform well enough with some new technology. The one you buy today will eventually fail to play something in a manner you'll be happy with. However, no computer is ever outdated as long as it does what you want it to do with good performance.

  8. #88
    I am Murloc! Cyanotical's Avatar
    10+ Year Old Account
    Join Date
    Feb 2011
    Location
    Colorado
    Posts
    5,553
    Quote Originally Posted by Gawdlock View Post
    To answer your question with style:

    Ever from the 1950's computer technology have advanced by becoming almost/if not twice as strong/advanced every 18 months.
    (Look it up if you dont believe me).
    .
    Moore's law is kind of what you are referring to, and it's not a standard or a law; it was an observation that Intel bases its development and release cycles on, nothing more. (Gordon Moore was one of the founders of Intel.)

    Quote Originally Posted by Wazzbo View Post
    he doesn't have a clue what he's talking about,
    Guess I need to get a blue dog.

    The thing is, when you run surround you are still running a 1080-tall display, just wider, so your texture and multisampling demand isn't as high. You could have a resolution of 10000000000000000×1080 and you'd still have the load of 1080p textures, maybe a little more for the extra textures on the side screens.

    A 4K screen ups the vertical resolution, so things like textures, multisampling, and other rendering techniques that scale with vertical resolution get much more demanding. It's not as simple as just counting pixels; that only gives you a base number, a single part of the equation.

    It's true that the wider FoV of surround puts extra load on your GPUs, but they are converting a single wide virtual resolution given to the game engine into three distinct 1920×1080 renderings, one per screen, usually with 2+ of them on DisplayPort stream controllers. (FYI, you need DP because DVI is limited to two screens per stream controller; HDMI uses DVI signaling, so it has the same limitation.)

    As for the bits per second needed: I don't know why you are looking at the maximum bandwidth your screens can consume; that's just wrong. Any time you are dealing with a transmission medium, you look at bandwidth versus throughput. A 100 Mbps ISP connection won't let you browse reddit any faster than a 5 Mbps line; by your logic of looking at maximum potential, because I can browse with only 5 Mbps, my connection can clearly handle a 100 Mbps load.

    Just because you have the power to drive three 1920×1080 screens at 144 Hz does not mean you have the power to drive a 4K screen, let alone run a game at 4K, textures and post-processing included.

  9. #89
    I'm going to go ahead and close this thread. At this point it's mostly bickering, semi/full-blown flaming, and only still vaguely on topic.
