1. #5101
    Brewmaster Majesticii's Avatar
    10+ Year Old Account
    Join Date
    Feb 2010
    Location
    Netherlands
    Posts
    1,414
You'd think people would be more enthusiastic about a solution to all the stuttering and tearing, but people seem very suspicious and reserved (discontented, rather) about the G-Sync technology, especially on the Dutch webpages.

I for one would buy this in a heartbeat (no, I'm not gullible, just excited).
    Last edited by Majesticii; 2013-10-19 at 10:25 AM.

  2. #5102
Possibly because they don't want to spend another 150-500 euros on a new monitor when their current one works perfectly fine?

  3. #5103
    Brewmaster Majesticii's Avatar
    10+ Year Old Account
    Join Date
    Feb 2010
    Location
    Netherlands
    Posts
    1,414
That doesn't make sense. If you don't want to spend money on a new monitor, then don't; why bother to discredit the technology? This is a problem that has been around since... ever. Countless posts about microstutter and tearing later, and now that a solution finally appears they try to bury it like the plague. It just seems really weird.

  4. #5104
Tearing and microstutter aren't really a monitor issue, though. It's a lazy and uninteresting solution, yet another "buy-a-new-monitor" technique from NVIDIA. Those are two reasons why people are sceptical.
     

  5. #5105
    Brewmaster Majesticii's Avatar
    10+ Year Old Account
    Join Date
    Feb 2010
    Location
    Netherlands
    Posts
    1,414
Getting a game to run at exactly 60 fps is an easy task, you say? I would much rather have developers focus on fidelity than sacrifice it to comply with a standard that has caused issues from day one: the static 60 Hz refresh rate.
    Last edited by Majesticii; 2013-10-19 at 06:02 PM.

  6. #5106
Well, with G-Sync you aren't going to fix microstutter, since microstutter can be present even when you don't have vsync turned on. G-Sync isn't really going to help with the time your GPU needs to render a frame.

  7. #5107
    Brewmaster Majesticii's Avatar
    10+ Year Old Account
    Join Date
    Feb 2010
    Location
    Netherlands
    Posts
    1,414
    Quote Originally Posted by Faithh View Post
Well, with G-Sync you aren't going to fix microstutter, since microstutter can be present even when you don't have vsync turned on. G-Sync isn't really going to help with the time your GPU needs to render a frame.
I think it does. It recalculates the refresh on each draw cycle, meaning the monitor displays only the completed frames, without sudden duplicates caused by the buffer not being ready for a new scan.

Stutter is basically the visual result of the monitor scanning the same image in the buffer again because the video card didn't have enough time to update it.

The static refresh rate of monitors is so normal to us that I don't think anyone realises how many issues it causes.
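To put some rough numbers on that, here's a tiny sketch with frame times I made up myself (nothing from NVIDIA's material), showing how many scans a single frame ends up occupying on a fixed 60 Hz display:

```python
import math

SCAN_MS = 1000 / 60                          # a fixed 60 Hz monitor scans every ~16.67 ms

for render_ms in (16, 22, 25, 33):           # hypothetical GPU render times
    scans = math.ceil(render_ms / SCAN_MS)   # scans that pass before the next frame is ready
    print(f"{render_ms} ms frame -> occupies {scans} scan(s) on a fixed 60 Hz display")

# On a variable-refresh display the panel simply waits render_ms and then scans
# the new frame once, so no frame is ever repeated.
```

Anything over one scan means the same image was shown again, which is exactly the "hanging" we perceive as stutter.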
    Last edited by Majesticii; 2013-10-19 at 06:18 PM.

  8. #5108
    Quote Originally Posted by Majesticii View Post
I think it does. It recalculates the refresh on each draw cycle, meaning the monitor displays only the completed frames, without sudden duplicates caused by the buffer not being ready for a new scan.

Stutter is basically the visual result of the monitor scanning the same image in the buffer again because the video card didn't have enough time to update it.

The static refresh rate of monitors is so normal to us that I don't think anyone realises how many issues it causes.
The way I'm thinking of it is:

- Stutter caused by vsync
- "Normal" stutter, or rather the latency of how long the GPU takes to render a single frame, which you have even with vsync disabled

With G-Sync they're only fixing the stutter that's caused by vsync.

  9. #5109
    Brewmaster Majesticii's Avatar
    10+ Year Old Account
    Join Date
    Feb 2010
    Location
    Netherlands
    Posts
    1,414
    Quote Originally Posted by Faithh View Post
The way I'm thinking of it is:

- Stutter caused by vsync
- "Normal" stutter, or rather the latency of how long the GPU takes to render a single frame, which you have even with vsync disabled

With G-Sync they're only fixing the stutter that's caused by vsync.
It's very simple: your GPU draws an image at a certain interval, which can vary (depending on complexity), and stores this image in the image buffer when it's completed. Your screen scans this image and thus displays it. What vsync does is make sure the image buffer does not get overwritten while the screen is still busy scanning (a 16.67 ms scan on a 60 Hz screen). Without it, the buffer can be overwritten while the monitor is still reading the image, so it suddenly starts scanning a different image, which causes the warped images (or "tearing"). Vsync locks the GPU to only overwrite the buffer on the same interval as the monitor, which causes peripheral input lag (mouse, keyboard) if there is significant overhead.

Stutter is basically the GPU being slower at updating the buffer than the scan time of the monitor. So if the GPU takes 25 ms to update the buffer, and the monitor's scan interval is 16.67 ms, the monitor will scan the same image twice, or three times. You can visually see that as the display "hanging", which we call stuttering. There are no two kinds of stuttering, because there is only one condition in which this occurs.

HOWEVER, traditional vsync has two modes: 30 and 60 fps. When the GPU is rendering at 60 fps or more, it locks at 60 fps; when the GPU renders at less than 60 fps, it drops to 30 fps mode. This transition can also be seen as a stutter, because for a moment the screen shows the same image several times. This issue has been somewhat addressed by NVIDIA's 'adaptive vsync'.

If the monitor only refreshes when the buffer has been written, you get a much smoother picture, because there is no tearing and there are no double images: only a varying stream of frames, which our eyes perceive as smooth.
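If it helps, here's a little toy simulation of that (the frame times are just numbers I made up, not measurements): it plays a sequence of GPU frame times against a fixed 60 Hz scan clock and counts how often the monitor ends up scanning the same frame again.

```python
# Toy simulation: fixed 60 Hz scan-out vs. the frames a GPU actually finishes.
# Frame times below are invented for illustration.

REFRESH_MS = 1000 / 60                       # one scan every ~16.67 ms

frame_times = [16, 22, 18, 25, 16, 30, 17]   # hypothetical per-frame GPU render times (ms)

# Timestamps at which each frame lands in the buffer.
done = []
t = 0.0
for ft in frame_times:
    t += ft
    done.append(t)

# At every scan tick the monitor shows the latest completed frame.
shown = []
scan = REFRESH_MS
while scan <= done[-1]:
    latest = max(i for i, d in enumerate(done) if d <= scan)
    shown.append(latest)
    scan += REFRESH_MS

duplicates = sum(1 for a, b in zip(shown, shown[1:]) if a == b)
print("frame shown at each 60 Hz scan:", shown)
print("scans that repeated the previous frame (perceived stutter):", duplicates)

# With variable refresh, the panel would instead scan each frame exactly once,
# as soon as it is done, so 'duplicates' would be zero by construction.
```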
    Last edited by Majesticii; 2013-10-19 at 06:43 PM.

  10. #5110
The monitor is synced to the GPU's framebuffer; this does not have anything to do with the slow rendering time of your GPU. It's called 'sync' for a reason: you can't say that such a feature is going to reduce or remove the frame rendering latency. The frame latencies we see in reviews are just the time the GPU needs to put a frame into the framebuffer, which is an earlier step. That can only be solved by a better architecture or driver optimisations.

If this were supposed to fix all the microstutter issues, everyone would have gone mad about it already instead of partly seeing it as a gimmick. The way NVIDIA explained "stutter" is really confusing for a lot of people, and it's not going to take the CrossFire stuttering issues away, if those cards are even supported with G-Sync monitors.

  11. #5111
    Brewmaster Majesticii's Avatar
    10+ Year Old Account
    Join Date
    Feb 2010
    Location
    Netherlands
    Posts
    1,414
    Quote Originally Posted by Faithh View Post
The monitor is synced to the GPU's framebuffer; this does not have anything to do with the slow rendering time of your GPU. It's called 'sync' for a reason: you can't say that such a feature is going to reduce or remove the frame rendering latency.
It's not synced to the buffer per se. The buffer is just memory on the graphics card from which your monitor scans/reads an image; if it were synced all the time, you wouldn't have tearing. The only thing you can do is make sure the framebuffer and the monitor share the same timings (vsync). It has everything to do with the GPU rendering time. If the GPU takes longer than the scan time of your monitor, it will not have refreshed the buffer in time, meaning the monitor will scan the last completed draw again. A frame-time sequence of 16, then 22, then 18 ms is not in itself what you perceive as stutter; it's the visual representation/limitation on the screen which causes it (double image scans). 16 ms would mean on time, the 22 ms image will be shown twice, and so on.

    Quote Originally Posted by Faithh View Post
The frame latencies we see in reviews are just the time the GPU needs to put a frame into the framebuffer, which is an earlier step. That can only be solved by a better architecture or driver optimisations.
Which is significantly harder, and always results in lower fidelity, because programmers have to compromise visuals for performance to reach the target.

    Quote Originally Posted by Faithh View Post
If this were supposed to fix all the microstutter issues, everyone would have gone mad about it already instead of partly seeing it as a gimmick. The way NVIDIA explained "stutter" is really confusing for a lot of people, and it's not going to take the CrossFire stuttering issues away, if those cards are even supported with G-Sync monitors.
No, it's not confusing; they actually explain it really well. It's the misconceptions people have about the subject that cause the issue. CrossFire stutter is basically having 16, then 22, then 16, then 22 ms render times (for example), resulting in lots of double scans. Admittedly that is a driver issue, but it would be unnoticeable if the refresh rate also adapted accordingly.

    _____________________________<--a bottom line

I'm not promoting G-Sync here, just excited about the technology. A solution has taken too long already, which is how we ended up with a proprietary module like G-Sync.

    - - - Updated - - -

However, I must add: it is interesting to see how people get hyped over a proprietary API (Mantle) that, if successful (though I doubt it, since Microsoft showed no interest), could have a direct influence on how games are coded, yet get upset when NVIDIA opts for a proprietary display feature that has no such influence, instead of creating their own API, and in no way forces you to use it. It's fairly mind-boggling and reeks of bias.

Let's just say it's easier to argue about the implications that a multitude of API standards would have...
    Last edited by Majesticii; 2013-10-19 at 09:09 PM.

  12. #5112
    The Unstoppable Force DeltrusDisc's Avatar
    10+ Year Old Account
    Join Date
    Aug 2009
    Location
    Illinois, USA
    Posts
    20,095
    Quote Originally Posted by tetrisGOAT View Post
    I have a bicycle. I'm a total badass.
Unfortunately, America isn't very bicycle-friendly in all parts. To get to work I'd face a lot of roads with no sidewalks or bike lanes, so I'd have to ride along the edge of 50 mph zones with TONS of traffic. Fuck that! I'm not the daredevil I once was. Also, when half of my shifts don't end until 2:30-3 AM, do you really think I want to go riding my bike? >_< Plus, it's been getting really cold lately. It already takes me half an hour to drive home even going 80 mph along the freeway, so imagine how long it would take on a bicycle. x_x 15 miles from home to work, ugh. Traffic. Blah.
    "A flower.
    Yes. Upon your return, I will gift you a beautiful flower."

    "Remember. Remember... that we once lived..."

    Quote Originally Posted by mmocd061d7bab8 View Post
    yeh but lava is just very hot water

  13. #5113
    Quote Originally Posted by Majesticii View Post
    *
    http://i.imgur.com/ICUKRXA.png

Are you going to fix that latency with G-Sync? Obviously there's no vsync there, so are you still going to blame vsync? You might have higher GPU rendering times if vsync is enabled, but that wasn't even my point. My point was that you can't take stuttering away completely by adding a PCB to an output device; it doesn't make sense like that.

Taking the graph I just linked, G-Sync isn't going to eliminate those high rendering times.

  14. #5114
It is interesting to see how people get hyped over a proprietary API (Mantle) that, if successful (though I doubt it, since Microsoft showed no interest), could have a direct influence on how games are coded.
Microsoft doesn't need to be interested in Mantle for it to work. There will be a low-level API for the Xbox no matter what it is called, and that API is designed for GCN GPUs.

  15. #5115
Microsoft shunning Mantle is less surprising considering their investment in DirectX and their wish to be the superior gaming platform on PC.
Their belief in it aside, it would not be in their best interest to support it.

  16. #5116
    I am Murloc! Xuvial's Avatar
    10+ Year Old Account
    Join Date
    Apr 2010
    Location
    New Zealand
    Posts
    5,215
Well, it's that, and that Mantle only supports two specific cards... that aren't even out yet.

  17. #5117
    Quote Originally Posted by Xuvial View Post
Well, it's that, and that Mantle only supports two specific cards... that aren't even out yet.
All GCN cards, meaning the entire 7000 series and the 200 series.
I'm also not sure, but I think they said they want it to be open-source but "optimised for GCN", meaning they may work it into the drivers anyway. Not sure!

  18. #5118
    Quote Originally Posted by tetrisGOAT View Post
All GCN cards, meaning the entire 7000 series and the 200 series.
I'm also not sure, but I think they said they want it to be open-source but "optimised for GCN", meaning they may work it into the drivers anyway. Not sure!
Not only the HD 7000 and Rx 2xx cards, but also the APU integrated graphics and the HD 8000 OEM cards.

  19. #5119
    Brewmaster Majesticii's Avatar
    10+ Year Old Account
    Join Date
    Feb 2010
    Location
    Netherlands
    Posts
    1,414
    Quote Originally Posted by Faithh View Post
    http://i.imgur.com/ICUKRXA.png
    http://nl.hardware.info/reviews/4436...0-+-frametimes

A 770 2GB is not built for 5760x1080 with 4xMSAA and Crysis 3's high texture usage. The poor scaling and high rendering times are due to the memory bus being completely and utterly bottlenecked and images being swapped in and out of GPU memory. You can see the 7990, with its 384-bit bus and 6 GB (or whatever it has), surpass the 770 SLI, which should be faster.

    Quote Originally Posted by Faithh View Post
Are you going to fix that latency with G-Sync? Obviously there's no vsync there, so are you still going to blame vsync? You might have higher GPU rendering times if vsync is enabled, but that wasn't even my point.
Was I blaming vsync? I was explaining vsync. Besides, vsync does not increase GPU rendering (or draw) times; it creates a 'wait' on the buffer. The graphics card keeps drawing images at the same speed, and the buffer is only updated when the screen is ready for scanning.
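Roughly like this, as a conceptual sketch of the loop; render_frame() and swap_buffers() are placeholder stand-ins I made up to illustrate the idea, not a real graphics API:

```python
import time

REFRESH_MS = 1000 / 60          # a 60 Hz monitor finishes a scan every ~16.67 ms

def render_frame():
    time.sleep(0.022)           # pretend the GPU needs 22 ms to draw this frame
    return object()

def swap_buffers(frame):
    pass                        # stand-in for handing the finished frame to scan-out

def render_loop(frames=5, vsync=True):
    next_scan = time.monotonic() * 1000 + REFRESH_MS
    for _ in range(frames):
        frame = render_frame()                   # drawing speed is unaffected by vsync
        if vsync:
            now = time.monotonic() * 1000
            while next_scan < now:               # catch up if one or more scans were missed
                next_scan += REFRESH_MS
            time.sleep((next_scan - now) / 1000) # the 'wait': hold the buffer swap
            next_scan += REFRESH_MS
        swap_buffers(frame)                      # the buffer is only overwritten here

render_loop()
```

With those toy numbers, a 22 ms frame ends up being presented only every other scan, which is exactly the 30 fps fallback behaviour I described earlier.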

    Quote Originally Posted by Faithh View Post
My point was that you can't take stuttering away completely by adding a PCB to an output device; it doesn't make sense like that.
Taking the graph I just linked, G-Sync isn't going to eliminate those high rendering times.
The graph you show has other issues due to the settings, and nothing except picking a different graphics card setup would be able to fix that. Nor would it look good on any display setup.

I still don't think you quite understand the subject. I suggest you look up on Wikipedia how vsync works; it would make it a lot clearer for you. Frankly, I think I've digressed on the subject long enough... (wasn't originally planning to)

  20. #5120
    Quote Originally Posted by Majesticii View Post
    *
With G-Sync, the lag you have is equal to the rendering time of your GPU. (As an example: http://content.hwigroup.net/images/a...-bf3-ultra.png)

As if we haven't heard it enough that people disable vsync just because the game stutters with it, yet we can still see microstutter, or the game feeling totally sluggish, because of high frame times even without vsync.
