1. #5121
    Brewmaster Majesticii's Avatar
    Join Date
    Feb 2010
    Location
    Netherlands
    Posts
    1,413
    We are aware, you gas-guzzling savages.

  2. #5122
    TOTALLY NOT
    Banned
    tetrisGOAT's Avatar
    Join Date
    Jun 2008
    Location
    Sweden
    Posts
    12,707
    I have a bicycle. I'm a total badass.

  3. #5123
    Brewmaster Majesticii's Avatar
    Join Date
    Feb 2010
    Location
    Netherlands
    Posts
    1,413
You'd think people would be more enthusiastic about a solution to all the stuttering and tearing, but people seem very suspicious and reserved (discontented, rather) about the G-Sync technology, especially on the Dutch webpages.

I, for one, would buy this in a heartbeat (no, I'm not gullible, just excited).
    Last edited by Majesticii; 2013-10-19 at 10:25 AM.

  4. #5124
    Possibly because they don't want to spend another 150-500 Euro on a new monitor, when their current one works perfectly fine?
    Intel i5 2500K (4.5 GHz) | Asus Z77 Sabertooth | 8GB Corsair Vengeance LP 1600MHz | Gigabyte Windforcex3 HD 7950 | Crucial M4 128GB | Crucial M550 256GB | Asus Xonar DGX | Samson SR 850 | Zalman ZM-Mic1 | Western Digital Caviar Blue 500GB | Noctua NH-U12P SE2 | Fractal Design Arc Midi | Corsair HX650

    Tanking with the Blessing of Kings - The TankSpot Guide to the Protection Paladin - Updated for Patch 5.4!

  5. #5125
    Brewmaster Majesticii's Avatar
    Join Date
    Feb 2010
    Location
    Netherlands
    Posts
    1,413
That doesn't make sense. If you don't want to spend money on a new monitor, then don't; why bother discrediting the technology? It's a problem that has been around since... ever. Countless posts about microstutter and tearing later, the first appearance of a solution shows up and people try to bury it. It just seems really weird.

  6. #5126
    TOTALLY NOT
    Banned
    tetrisGOAT's Avatar
    Join Date
    Jun 2008
    Location
    Sweden
    Posts
    12,707
Tearing and microstutter aren't really monitor issues, though. It's a lazy and uninteresting solution, yet another "buy a new monitor" technique from nVidia. Those are two reasons why people are sceptical.

  7. #5127
    Brewmaster Majesticii's Avatar
    Join Date
    Feb 2010
    Location
    Netherlands
    Posts
    1,413
Getting a game to run at exactly 60 fps is an easy task, you say? I would much rather have developers focus on fidelity than sacrifice it to comply with a standard that has caused issues from day one: the static 60 Hz refresh rate.
    Last edited by Majesticii; 2013-10-19 at 06:02 PM.

  8. #5128
Well, with G-Sync you aren't going to fix microstutter issues, since microstutter can be present even when you don't have vsync turned on at all. G-Sync isn't really going to help with the time your GPU needs to render a frame.

  9. #5129
    Brewmaster Majesticii's Avatar
    Join Date
    Feb 2010
    Location
    Netherlands
    Posts
    1,413
    Quote Originally Posted by Faithh View Post
Well, with G-Sync you aren't going to fix microstutter issues, since microstutter can be present even when you don't have vsync turned on at all. G-Sync isn't really going to help with the time your GPU needs to render a frame.
I think it does. It times the refresh to each draw cycle, meaning the monitor displays only completed frames and you don't get sudden duplicates because the buffer wasn't ready for a new scan.

Stutter is basically the visual representation of the monitor scanning the same image in the buffer because the video card didn't have enough time to update it.

The static refresh rate of monitors is so normal to us that I don't think anyone realises how many issues it causes.
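To make that concrete, here is a minimal back-of-the-envelope sketch (plain Python, invented frame times, no real display API involved): it counts how often a fixed 60 Hz scan-out ends up showing the same frame twice, something a refresh driven by frame completion by definition never does.

```python
# Hypothetical GPU frame times in milliseconds (made-up example values).
frame_times_ms = [14.0, 17.0, 25.0, 16.0, 33.0, 15.0]
scan_ms = 1000.0 / 60.0                      # fixed 60 Hz scan interval (~16.67 ms)

# Cumulative completion time of each frame.
done, t = [], 0.0
for ft in frame_times_ms:
    t += ft
    done.append(t)

# Fixed refresh: at every scan, the panel shows the newest finished frame.
shown, scan_t = [], scan_ms
while scan_t <= done[-1] + scan_ms:
    newest = max((i for i, d in enumerate(done) if d <= scan_t), default=None)
    shown.append(newest)
    scan_t += scan_ms

duplicates = sum(1 for a, b in zip(shown, shown[1:]) if a == b and a is not None)
print("frame shown at each 60 Hz scan:", shown)
print("duplicate scans (perceived as stutter):", duplicates)

# Variable refresh (the G-Sync idea): the panel refreshes when a frame is done,
# so every scan shows a new frame; the only difference is an uneven scan interval.
print("scan intervals with completion-driven refresh:", frame_times_ms)
```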
    Last edited by Majesticii; 2013-10-19 at 06:18 PM.

  10. #5130
    Quote Originally Posted by Majesticii View Post
I think it does. It times the refresh to each draw cycle, meaning the monitor displays only completed frames and you don't get sudden duplicates because the buffer wasn't ready for a new scan.

Stutter is basically the visual representation of the monitor scanning the same image in the buffer because the video card didn't have enough time to update it.

The static refresh rate of monitors is so normal to us that I don't think anyone realises how many issues it causes.
The way I'm thinking about it is:

- stutter caused by vsync
- "normal" stutter, or rather the latency of however long the GPU takes to render a single frame, which you have even with vsync disabled

With G-Sync they're only fixing the stutter that's caused by vsync.

  11. #5131
    Brewmaster Majesticii's Avatar
    Join Date
    Feb 2010
    Location
    Netherlands
    Posts
    1,413
    Quote Originally Posted by Faithh View Post
The way I'm thinking about it is:

- stutter caused by vsync
- "normal" stutter, or rather the latency of however long the GPU takes to render a single frame, which you have even with vsync disabled

With G-Sync they're only fixing the stutter that's caused by vsync.
It's very simple: your GPU draws an image at a certain interval, which can vary depending on complexity, and stores it in the image buffer when it's completed. Your screen scans this buffer and thus displays the image. Without vsync, the buffer can get overwritten while the monitor is still busy scanning it (a 16.67 ms window on a 60 Hz screen), so mid-scan the monitor suddenly starts reading a different image, which produces the warped picture (or "tearing"). What vsync does is make sure the GPU only overwrites the buffer on the monitor's own interval, which causes peripheral input lag (mouse, keyboard) if there is significant overhead.

Stutter is basically the GPU being slower at updating the buffer than the scan time of the monitor. So if the GPU takes 25 ms to update the buffer and the monitor's scan interval is 16.67 ms, the monitor will scan the same image twice, or three times. You can see this visually as the display "hanging", which we call stuttering. But there are not two kinds of stuttering, because there is only one condition in which it occurs.

HOWEVER, traditional vsync has two modes: 30 and 60 fps. When the GPU is rendering at 60 fps or more, it locks at 60 fps; when the GPU renders at less than 60 fps, it drops to 30 fps mode. This transition can also be seen as a stutter, because for a moment the screen shows the same image a few times in a row. This issue has been somewhat mitigated by nVidia's 'adaptive vsync'.

If you can guarantee that the monitor only refreshes when the buffer has been written, the result is a much smoother picture, because there is no tearing and there are no double images, only a varying image stream, which our eyes perceive as smooth.
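As a rough illustration of that 60-to-30 step (a simplified double-buffered vsync model with invented render times, not a claim about any specific driver): on a fixed 60 Hz panel a frame can only be swapped on a scan boundary, so its time on screen gets rounded up to a whole number of 16.67 ms periods.

```python
import math

scan_ms = 1000.0 / 60.0                 # 60 Hz scan interval
render_ms = [15.0, 17.0, 25.0, 34.0]    # hypothetical GPU frame times

for r in render_ms:
    scans = math.ceil(r / scan_ms)      # whole scans the frame stays on screen
    shown_ms = scans * scan_ms
    print(f"render {r:5.1f} ms -> on screen {shown_ms:5.1f} ms "
          f"(~{1000.0 / shown_ms:4.1f} fps effective)")

# 17 ms of GPU work ends up on screen for 33.3 ms, i.e. the sudden 60 -> 30 fps
# step described above; a completion-driven refresh would just show it for 17 ms.
```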
    Last edited by Majesticii; 2013-10-19 at 06:43 PM.

  12. #5132
The monitor is synced to the GPU's framebuffer; that doesn't have anything to do with the slow rendering time of your GPU. It's called sync for a reason: you can't say such a feature is going to reduce or remove frame rendering latency. The frame latencies we see in reviews are just the time the GPU needs to put the frame into the framebuffer, which is an earlier step. That can only be solved by a better architecture or driver optimisations.

If this were supposed to fix all the microstutter issues, everyone would have gone mad over it already instead of partly seeing it as a gimmick. The way nvidia explained "stutter" is really confusing for a lot of people, and it's not going to take the crossfire stuttering issues away, if those cards are even supported with G-Sync monitors.

  13. #5133
    Brewmaster Majesticii's Avatar
    Join Date
    Feb 2010
    Location
    Netherlands
    Posts
    1,413
    Quote Originally Posted by Faithh View Post
The monitor is synced to the GPU's framebuffer; that doesn't have anything to do with the slow rendering time of your GPU. It's called sync for a reason: you can't say such a feature is going to reduce or remove frame rendering latency.
It's not synced to the buffer per se. The buffer is just memory on the graphics card from which your monitor scans/reads an image; if it were synced all the time, you wouldn't have tearing. The only thing you can do is make sure the framebuffer and the monitor share the same timings (vsync). It has everything to do with the GPU rendering time: if the GPU takes longer than the scan time of your monitor, it will not have refreshed the buffer in time, meaning the monitor will scan the last completed draw again. A frame-time sequence of 16, then 22, then 18 ms is not in itself what you perceive as stutter; it's the visual representation/limitation on the screen that causes it (double image scans). A 16 ms frame arrives on time, but around the 22 ms frame one image ends up being scanned twice, and so on.
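Plugging that exact 16/22/18 ms sequence into the same simplified 60 Hz model as earlier in the thread (illustrative numbers only):

```python
scan_ms = 1000.0 / 60.0
done = [16.0, 38.0, 56.0]        # cumulative completion times for 16, 22 and 18 ms frames

for n in range(1, 5):            # four scans at ~16.7, 33.3, 50.0 and 66.7 ms
    t = n * scan_ms
    newest = max((i for i, d in enumerate(done) if d <= t), default="none ready")
    print(f"scan at {t:5.1f} ms shows frame {newest}")

# The 22 ms frame is not ready at the 33.3 ms scan, so the previous frame gets
# scanned a second time; that single double scan is what reads as a stutter.
```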

    Quote Originally Posted by Faithh View Post
The frame latencies we see in reviews are just the time the GPU needs to put the frame into the framebuffer, which is an earlier step. That can only be solved by a better architecture or driver optimisations.
Which is significantly harder, and it always results in lower fidelity, because programmers have to compromise visuals for performance to reach the target.

    Quote Originally Posted by Faithh View Post
If this were supposed to fix all the microstutter issues, everyone would have gone mad over it already instead of partly seeing it as a gimmick. The way nvidia explained "stutter" is really confusing for a lot of people, and it's not going to take the crossfire stuttering issues away, if those cards are even supported with G-Sync monitors.
No, it's not confusing; they actually explain it really well. It's people's misconceptions about the subject that cause the issue. Crossfire stutter is basically having 16, then 22, then 16, then 22 ms render times (for example), resulting in lots of double scans. Admittedly that is a driver issue, but it would be unnoticeable if the refresh rate also adapted accordingly.
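The same kind of sketch for that alternating 16/22 ms pattern (hypothetical numbers, same simplified model), just to put a figure on "lots of double scans":

```python
scan_ms = 1000.0 / 60.0
times = [16.0, 22.0] * 30                      # 60 frames of alternating render times

done, t = [], 0.0
for ft in times:
    t += ft
    done.append(t)

# Fixed 60 Hz scan-out: each scan shows the newest finished frame.
shown, scan_t, idx = [], scan_ms, 0
while scan_t <= done[-1]:
    while idx + 1 < len(done) and done[idx + 1] <= scan_t:
        idx += 1
    shown.append(idx if done[idx] <= scan_t else None)
    scan_t += scan_ms

doubles = sum(1 for a, b in zip(shown, shown[1:]) if a == b)
print(f"{len(times)} frames, {len(shown)} scans, {doubles} duplicate scans")

# With a refresh tied to frame completion, every one of those 60 frames would
# get exactly one scan, just with a 16/22 ms alternating scan interval.
```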

    _____________________________<--a bottom line

I'm not promoting G-Sync here, I'm just excited about the technology. A solution has taken far too long already, which is how we ended up with a proprietary module like G-Sync.

    - - - Updated - - -

However, I must add: it is interesting to see how people get hyped over a proprietary API (Mantle) that, if successful (though I doubt it, since Microsoft showed no interest), could have a direct influence on how games are coded, yet get butthurt when nVidia opts for a proprietary display feature that has no such influence, instead of creating their own API, and that in no way forces you to use it. It's fairly mind-boggling and reeks of bias.

    Let's just say, it's easier to argue about the implications a multitude of API standards would have...
    Last edited by Majesticii; 2013-10-19 at 09:09 PM.

  14. #5134
    The Insane DeltrusDisc's Avatar
    Join Date
    Aug 2009
    Location
    Illinois, USA
    Posts
    15,191
    Quote Originally Posted by tetrisGOAT View Post
    I have a bicycle. I'm a total badass.
Unfortunately, America isn't very bicycle friendly in all parts. To get to work, there are a lot of roads with no sidewalks/bike lanes, so I'd have to ride along the edge of 50 mph zones with TONS of traffic. Fuck that! I'm not the daredevil I once was. Also, when half of my shifts don't end until 2:30-3 AM... do you really think I want to go riding my bike? >_< Plus, it's been getting really cold lately. It already takes me half an hour to drive home even going 80 mph along the freeway, so... imagine how long it would take on a bicycle. x_x 15 miles from home to work, ugh. Traffic. Blah.
    i7-5820K | ASUS X99- Deluxe | Crucial 2x8GB DDR4 2133MHz | eVGA GTX 760 SC | Crucial MX100 512GB | Crucial M500 240GB | Crucial m4 128GB | Western Digital Blue 1TB | Western Digital Black 1TB | SeaSonic X660 Gold
    ASUS MX239H | Schiit Stack Modi + Asgard 2 | Audio Technica ATH-AD700 | Presonus Eris E5 Studio Monitors | Blue Snowball Mic | Razer Death Adder | Corsair K70 | CyberPower 1500PFCLCD UPS

  15. #5135
    Quote Originally Posted by Majesticii View Post
    *
    http://i.imgur.com/ICUKRXA.png

Are you going to fix that latency with G-Sync? Obviously there's no vsync there, so are you still going to blame vsync? You might have higher GPU rendering times with vsync enabled, but that wasn't even my point. My point was that you can't take stuttering away entirely by adding a PCB to an output device; it doesn't make sense like that.

Taking the graph I just linked, G-Sync isn't going to eliminate those high rendering times.
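For what it's worth, a quick numbers-only sketch of that point (all values invented, same simplified 60 Hz model used earlier in the thread): the render spike itself is untouched by a variable refresh, only the rounding to whole scan periods goes away.

```python
import math

scan_ms = 1000.0 / 60.0
render_ms = [16.0, 45.0, 16.0]   # hypothetical frame times with one 45 ms spike

for r in render_ms:
    fixed = math.ceil(r / scan_ms) * scan_ms   # time on screen with 60 Hz vsync
    print(f"render {r:4.1f} ms | fixed 60 Hz: {fixed:4.1f} ms on screen "
          f"| completion-driven refresh: {r:4.1f} ms on screen")

# Either way the previous image sits there for ~45-50 ms during the spike; the
# hitch is caused by the render time itself, which no display tech can remove.
```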

  16. #5136
Quote Originally Posted by Majesticii View Post
It is interesting to see how people get hyped over a proprietary API (Mantle) that, if successful (though I doubt it, since Microsoft showed no interest), could have a direct influence on how games are coded.
Microsoft doesn't need to be interested in Mantle for it to work. There will be a low-level API for the Xbox no matter what it is called, and that API is designed for GCN GPUs.

  17. #5137
    TOTALLY NOT
    Banned
    tetrisGOAT's Avatar
    Join Date
    Jun 2008
    Location
    Sweden
    Posts
    12,707
Microsoft shunning Mantle is not that surprising, considering their investment in DirectX and their ambition to be the superior gaming platform on PC.
Their belief in it aside, it would not be in their best interest to support it.

  18. #5138
    I am Murloc! Xuvial's Avatar
    Join Date
    Apr 2010
    Location
    New Zealand
    Posts
    5,036
    Well it's that, and that Mantle only supports 2 specific cards...that aren't even out yet.

  19. #5139
    TOTALLY NOT
    Banned
    tetrisGOAT's Avatar
    Join Date
    Jun 2008
    Location
    Sweden
    Posts
    12,707
    Quote Originally Posted by Xuvial View Post
    Well it's that, and that Mantle only supports 2 specific cards...that aren't even out yet.
All GCN cards, meaning the entire 7000 series and the 200 series.
I'm also not sure, but I think they said they want it to be open-sourced but "optimised for GCN", meaning they may work it into the drivers anyway. Not sure!

  20. #5140
    Quote Originally Posted by tetrisGOAT View Post
All GCN cards, meaning the entire 7000 series and the 200 series.
I'm also not sure, but I think they said they want it to be open-sourced but "optimised for GCN", meaning they may work it into the drivers anyway. Not sure!
    Not only the HD 7000 and Rx-2xx cards, but also the APU integrated graphics and the HD8000 OEM cards.
