1. #1

    So, do G-sync monitors affect your performance?

    I've recently gotten around to learning more about G-Sync technology and what it does for a gamer.

    For example, on the model I liked, the Acer Predator XB240HA 144Hz G-Sync:
    Comments say that people without a 980 or better are forced to remove anti-aliasing altogether, and even some shadows, to get acceptable frame rates. Sure, they also say it looks better than with AA, but still... It's not even 1440p; it's a 1080p 24-inch monitor.
    From what I gather about the way G-Sync works and the resources it uses, it puts much more rendering stress on the components in order to display the graphics properly.

    Anyone with a G-Sync monitor can comment on this?
    Intel i7-10700k | ASUS ROG STRIX Z490-F GAMING
    ASUS ROG STRIX RTX 2080 Ti | 2x 8GB Corsair Vengeance LPX DDR4 2400Mhz
    A-Data SSD SP920 256GB | Seagate 1TB 7200rpm | NZXT Kraken x62
    Fractal Design Define R6 TG | be quiet! Dark Power Pro 11 650W | Asus ROG Swift PG279Q
    Razer: Blackwidow Elite | Basilisk Ultimate | Nari Ultimate| Firefly | NZXT Hue+

  2. #2
    Deleted
    Well, according to http://www.g-sync.com, G-Sync does not affect performance in games. It adds no extra input lag or other performance hits. The only drawback is the $$$.



  3. #3
    Whoever is saying this doesn't understand what G-Sync does.

    It sounds like they're talking about V-sync, which tries to do something similar in preventing "tearing" on the screen. V-sync "locks" the frame rate so that the card isn't changing the picture sent to the monitor halfway through a scan cycle (the tearing). That can cause jumps and drops in frame rate as it tries to keep the frame pacing consistent (usually at 30 or 60 fps), and it introduces more lag. So for it to work well, your minimum frame rate pretty much has to stay above whatever rate it's syncing to, which may well necessitate those sorts of graphics-settings compromises.
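    The "held for the next refresh" behavior can be sketched in a few lines of Python. This is a toy model, not how any actual driver is implemented, and the 60 Hz refresh rate is just an assumed example:

    ```python
    import math

    REFRESH_HZ = 60                 # assumed fixed refresh rate
    REFRESH_MS = 1000 / REFRESH_HZ  # ~16.7 ms per refresh

    def vsync_display_interval(render_ms):
        """Toy model of double-buffered V-sync: a frame that misses a
        refresh deadline waits for the next one, so the time each frame
        stays on screen is rounded UP to a whole number of refreshes."""
        return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

    for render_ms in (10.0, 17.0, 25.0, 40.0):
        shown = vsync_display_interval(render_ms)
        print(f"render {render_ms:4.1f} ms -> shown {shown:5.1f} ms "
              f"({1000 / shown:.0f} fps)")
    ```

    Note how a 17 ms frame, which only just misses 60 fps, gets held for two refreshes and effectively runs at 30 fps. That quantization is exactly the jump/drop behavior described above.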

    G-sync is much more sophisticated. Refresh rate is in many respects an artifact from the CRT monitor days, where the speed of the electron beam traversing the tube was generally a fixed rate. G-sync is an attempt to work around the limitations of a refresh rate architecture.

    This is a better explanation of things than I can offer

    http://www.anandtech.com/show/7582/nvidia-gsync-review

  4. #4
    Quote Originally Posted by Azurenys View Post
    Comments say that people without a 980 or better are forced to remove anti-aliasing altogether, and even some shadows, to get acceptable frame rates. Sure, they also say it looks better than with AA, but still... It's not even 1440p; it's a 1080p 24-inch monitor.
    There's no performance hit from G-sync.

    What the comments probably mean is that unless you're getting well over 60 fps (closer to 150 fps), having a 144Hz monitor is essentially useless. It takes shitloads of GPU power to run new games at close to 150 fps, or even at 100 fps with ultra settings.
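    Some rough frame-budget arithmetic backs this up (a quick sketch; the refresh rates are just examples):

    ```python
    # Time budget per frame at common refresh rates: to feed a 144 Hz
    # panel a new frame every refresh, the GPU has under 7 ms per frame,
    # versus almost 17 ms at 60 Hz.
    for hz in (60, 100, 144):
        budget_ms = 1000 / hz
        print(f"{hz:3d} Hz -> {budget_ms:5.2f} ms per frame")
    ```

    Cutting the per-frame budget from ~17 ms to ~7 ms is why a card that comfortably handles 60 fps can fall well short of saturating a 144 Hz panel.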

  5. #5
    Remilia
    Quote Originally Posted by Akainakali View Post
    G-sync is much more sophisticated. Refresh rate is in many respects an artifact from the CRT monitor days, where the speed of the electron beam traversing the tube was generally a fixed rate. G-sync is an attempt to work around the limitations of a refresh rate architecture.
    Eh, just wanted to add to this. One reason CRTs had a refresh rate is that they didn't use a sample-and-hold method, so if the data fed in is too slow you'd get a blank screen. The beam isn't really the issue so much as the phosphor's response time, though nowadays it's the fastest display type of all. Similarly, LCDs have a pixel response time, which is why a maximum refresh rate is needed. If the refresh rate is higher than the panel can handle, you get major visual artifacts.

    Variable refresh rates already existed about six years ago on laptops over eDP, which is why the "no module" G-Sync on laptops exists; it's a shameful rename of something that already existed. It was made for power savings, amusingly. G-Sync and FreeSync / Adaptive-Sync are all still bound by the panel's maximum refresh rate. There's also the matter of manufacturing a controller that can handle higher inputs; I won't claim to be an expert there, but it wouldn't surprise me if there's real difficulty in going to a higher limit or no limit at all.
    Last edited by Remilia; 2015-07-01 at 05:22 PM.

  6. #6
    I believe the laptop screens are also different in that the panels are controlled directly by the graphics chip, whereas with a conventional monitor there's the input connection and the hardware scaler in between. That's why the recent G-Sync laptops don't need a G-Sync module like the standalone monitors do.

    http://www.pcworld.com/article/29287...-rages-on.html

  7. #7
    So, in other words, there's no point in getting one for myself if I'm not switching my 970 for something stronger?

  8. #8
    Remilia
    Quote Originally Posted by Azurenys View Post
    So, in other words, there's no point in getting one for myself if I'm not switching my 970 for something stronger?
    I personally don't care all that much about adaptive refresh rate tech, but GPU-power-wise it really depends on the game. 40-60Hz is apparently the "sweet spot" where it has the most effect.
    Quote Originally Posted by Akainakali View Post
    I believe the laptop screens are also different in that the panels are controlled directly by the graphics chip, whereas with a conventional monitor there's the input connection and the hardware scaler in between. That's why the recent G-Sync laptops don't need a G-Sync module like the standalone monitors do.

    http://www.pcworld.com/article/29287...-rages-on.html
    Yes, they're different in quite a few respects. eDP is the main thing that allows adaptive sync; without it, it's useless. AMD first demonstrated FreeSync on a laptop, showing that the tech had already existed for quite a while. Then, with VESA, they implemented it into DP1.2a.

  9. #9
    Quote Originally Posted by Azurenys View Post
    So, in other words, there's no point in getting one for myself if I'm not switching my 970 for something stronger?
    It's supposedly a gamechanger for people who are easily annoyed by screen tearing.

  10. #10
    G-sync is by all accounts terrific if your system isn't consistently producing frame rates above what the monitor refreshes at, or is producing inconsistent frame rates in, IIRC, roughly the 20-60 fps range.

    It dramatically smooths things out and may let you actually turn up the eye candy, since frame-rate variability and lower frame rates won't hurt what you're seeing nearly as much.

    In general it's probably most helpful if your system's frame rates at your settings are marginal, so weaker video cards (within reason) probably benefit more.

    If your system is powerful enough that you're pegging the FPS meter, G-sync isn't going to do much of anything, since there's no issue there to correct. However, it does have its limits: if you're below something like 20 FPS, you simply aren't producing enough frames to keep up with the changes, and it can't fix that.
    Last edited by Akainakali; 2015-07-02 at 03:56 PM.
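    A minimal sketch of why variable refresh helps most at marginal frame rates (assuming a hypothetical panel with a 30-144 Hz variable refresh range; real panels and drivers differ):

    ```python
    import math

    PANEL_MIN_MS = 1000 / 144  # assumed fastest refresh the panel supports
    PANEL_MAX_MS = 1000 / 30   # assumed slowest refresh the panel supports

    def variable_refresh_interval(render_ms):
        """Toy model of G-sync: the panel refreshes as soon as the frame
        is ready. Frames finished faster than the panel's max refresh
        rate still wait for it; below the supported range (> PANEL_MAX_MS
        per frame) the hardware has to re-show frames, which this simple
        model doesn't cover."""
        return max(render_ms, PANEL_MIN_MS)

    def fixed_60hz_interval(render_ms):
        """Toy model of a fixed 60 Hz display with V-sync: the interval
        is rounded up to the next whole ~16.7 ms refresh."""
        step = 1000 / 60
        return math.ceil(render_ms / step) * step

    for render_ms in (5.0, 12.0, 18.0, 24.0):
        print(f"render {render_ms:4.1f} ms: "
              f"g-sync shows every {variable_refresh_interval(render_ms):5.1f} ms, "
              f"fixed 60 Hz every {fixed_60hz_interval(render_ms):5.1f} ms")
    ```

    In the marginal range, each frame goes up as soon as it's done instead of being held for a refresh boundary, which is the smoothing described above; when the GPU is already faster than the panel's max refresh, both models behave about the same, matching the "no issue to correct" point.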

  11. #11
    Quote Originally Posted by Akainakali View Post
    G-sync is by all accounts terrific if your system isn't consistently producing frame rates above what the monitor refreshes at, or is producing inconsistent frame rates in, IIRC, roughly the 20-60 fps range.

    It dramatically smooths things out and may let you actually turn up the eye candy, since frame-rate variability and lower frame rates won't hurt what you're seeing nearly as much.

    In general it's probably most helpful if your system's frame rates at your settings are marginal, so weaker video cards (within reason) probably benefit more.

    If your system is powerful enough that you're pegging the FPS meter, G-sync isn't going to do much of anything, since there's no issue there to correct. However, it does have its limits: if you're below something like 20 FPS, you simply aren't producing enough frames to keep up with the changes, and it can't fix that.
    Well, I'm in the range of 60+ frames (probably much more if V-sync were off), so I suppose I'm going to get this baby soon:
    Acer Predator XB240H 24" G-Sync 144Hz Gaming Widescreen LED Monitor
    http://www.overclockers.co.uk/showpr...odid=MO-079-AC
    Last edited by Azurenys; 2015-07-02 at 06:47 PM.

  12. #12
    aarro
    I have the Asus PG278Q ROG Swift G-Sync 144Hz, and you can definitely tell the difference going from a standard monitor to this. Maybe it improves gameplay slightly, but it definitely improves the visuals.
