  1. #1
    I am Murloc! Mif's Avatar
    10+ Year Old Account
    Join Date
    May 2009
    Location
    Tarnished Coast
    Posts
    5,629

    Official news post: Optimization

    https://www.guildwars2.com/en-gb/new...d-performance/

    An average of GPU performance from stress test data:
    Last edited by Mif; 2012-08-22 at 03:39 AM.

  2. #2
    The Lightbringer barackopala's Avatar
    10+ Year Old Account
    Join Date
    Mar 2012
    Location
    Chile, Viña del Mar
    Posts
    3,846
    And that's the reason why my game was running a bit slower than intended... stupid GTS 250 ruining my dreams and hopes

  3. #3
    Titan Kalyyn's Avatar
    10+ Year Old Account
    Join Date
    Nov 2009
    Location
    Indiana, US
    Posts
    11,392
    I must say that this game has the best optimization I've ever seen. I was in a group of probably a few hundred people in the stress test about a week ago, and we were determined to crash the server. We all piled up in Divinity's Reach and spammed every ability we had, all at once, for a good twenty minutes. Not only did we not impact the server in any way, my frame rate didn't even drop. The same experience in any other game would likely have frozen my client. I'm 100% sure that a quarter as many people doing the same thing in WoW would have disconnected me.

  4. #4
    Scarab Lord Karizee's Avatar
    10+ Year Old Account
    Join Date
    Oct 2011
    Location
    The Eternal Alchemy
    Posts
    4,433
    From the article:

    "If you find yourself suffering from poor performance and nothing in our Knowledge Base relieves the issue, open a support ticket and provide our Support Team with the appropriate information. By providing us with this information, we can work to make your Guild Wars 2 experience a positive one. Just because the bulk of the optimizations have already been implemented doesn’t mean we’re done. Optimizations will continue up to and beyond launch day, using the information that you provide."
    Valar morghulis

  5. #5
    The chart is agnostic to:

    1. Screen resolution
    2. Quality settings

    I wonder if someone really thinks it serves a point to lump together people using every resolution and quality setting and output an average FPS. It's like mixing every food ingredient in existence together and declaring the result edible.
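The objection above is easy to illustrate with a toy example. The numbers below are made up, not from ArenaNet's chart; they just show how a pooled average across mixed settings matches nobody's actual experience, while grouping by settings first is informative:

```python
from collections import defaultdict

# Hypothetical samples for one GPU model: (resolution, quality, fps).
samples = [
    ("1280x720",  "low",  92),
    ("1280x720",  "low",  88),
    ("1920x1080", "high", 41),
    ("1920x1080", "high", 39),
    ("2560x1440", "high", 27),
]

# Pooling everything yields a single number nobody actually sees in-game:
pooled = sum(fps for _, _, fps in samples) / len(samples)
print(f"pooled average: {pooled:.1f} fps")  # -> pooled average: 57.4 fps

# Grouping by settings first gives each user a relevant figure:
groups = defaultdict(list)
for res, quality, fps in samples:
    groups[(res, quality)].append(fps)
for key, vals in sorted(groups.items()):
    print(key, sum(vals) / len(vals))
```

With these numbers the pooled figure (57.4 FPS) sits between the 720p/low and 1080p/high groups and describes neither.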

  6. #6
    The Lightbringer barackopala's Avatar
    10+ Year Old Account
    Join Date
    Mar 2012
    Location
    Chile, Viña del Mar
    Posts
    3,846
    Quote Originally Posted by Lucky_ View Post
    The chart is agnostic to:

    1. Screen resolution
    2. Quality settings

    I wonder if someone really thinks it serves a point to lump together people using every resolution and quality setting and output an average FPS. It's like mixing every food ingredient in existence together and declaring the result edible.
    It is set to auto-detect, so the quality settings will be the standard for your GPU; I don't know if resolution is affected by the auto-detect. It has helped me decide whether I should upgrade my video card or not. I might jump to a GTS 450, so for me it was somewhat useful.
    Last edited by barackopala; 2012-08-22 at 03:53 AM.

  7. #7
    Mechagnome Fernling306's Avatar
    10+ Year Old Account
    Join Date
    Oct 2011
    Location
    United States
    Posts
    569
    Would be cool if they told us what the auto-detect settings were for each GPU.
    Last edited by Fernling306; 2012-08-22 at 03:55 AM.

  8. #8
    Quote Originally Posted by barackopala View Post
    It is set to auto-detect, so the quality settings will be the standard for your GPU; I don't know if resolution is affected by the auto-detect.
    Nope. They specify that the data is aggregated from everyone, rather than the small minority who actually used default settings. Which are notably so broken that they can't even properly detect your monitor size (or at least they couldn't on any machine I tried, ranging from an E-450 netbook to an i5/GTX 560 Ti desktop).

  9. #9
    Weird, there's no mention of the GTX 570 in that chart.

  10. #10
    Quote Originally Posted by Lucky_ View Post
    Nope. They specify that the data is aggregated from everyone, rather than the small minority who actually used default settings. Which are notably so broken that they can't even properly detect your monitor size (or at least they couldn't on any machine I tried, ranging from an E-450 netbook to an i5/GTX 560 Ti desktop).
    First graph, yes, second graph, no.

    Quote Originally Posted by Ryngo Blackratchet View Post
    Yeah, Rhandric is right, as usual.

  11. #11
    I don't believe that chart at all.

    I had GTX 560 Ti's in SLI and couldn't maintain 55 FPS with vsync on, yet this says it should be around 58 FPS with just one!

    This was at high settings with FXAA.

  12. #12
    Quote Originally Posted by Lurker87 View Post
    Weird, there's no mention of gtx570 in that chart.
    They might not have had sufficient data from people with an i5 or better paired with a GTX 570.

    Quote Originally Posted by Ryngo Blackratchet View Post
    Yeah, Rhandric is right, as usual.

  13. #13
    Quote Originally Posted by rhandric View Post
    They might not have had sufficient data of people with an i5 or better with the gtx570.
    I have an i5 2500k and a 570 and was getting between 30-60 FPS on high. I hope it changes some, since drops to 30 FPS are really noticeable.

  14. #14
    Quote Originally Posted by jam3s121 View Post
    I don't believe that chart at all.

    I had GTX 560 Ti's in SLI and couldn't maintain 55 FPS with vsync on, yet this says it should be around 58 FPS with just one!

    This was at high settings with FXAA.
    With vsync off and a single 560 Ti, in the last stress test I was able to play I jumped between 60-125 FPS in non-WvW areas.
    Are you using the correct drivers, and was supersampling disabled? (Supersampling could be maxing out your VRAM, who knows.)

  15. #15
    Quote Originally Posted by Lurker87 View Post
    I have an i5 2500k and a 570 and was getting between 30-60 fps on high. I hope it changes some since 30fps drops are really noticeable.
    One data point isn't sufficient data. Not saying you're the only one, but if they required a certain threshold of data before including a card, and that threshold wasn't met, that could be why it's not listed. Of course, they still have your data and will hopefully resolve it -- plus, you can contact the support team and provide them with the necessary information.

    Quote Originally Posted by Ryngo Blackratchet View Post
    Yeah, Rhandric is right, as usual.
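The inclusion threshold rhandric speculates about is straightforward to picture. This sketch uses an invented cutoff and invented report counts purely for illustration; ArenaNet hasn't published either:

```python
# Hypothetical minimum number of user reports before a GPU makes the chart.
MIN_SAMPLES = 50

# Made-up report counts per GPU model.
reports = {
    "GTX 560 Ti": 412,
    "GTS 250": 97,
    "GTX 570": 23,  # below the cutoff -> silently absent from the chart
}

# Only sufficiently-reported cards are charted; the rest simply don't appear.
charted = {gpu: n for gpu, n in reports.items() if n >= MIN_SAMPLES}
print(sorted(charted))  # -> ['GTS 250', 'GTX 560 Ti']
```

Under such a scheme a card's absence says nothing about its performance, only about how many testers happened to pair it with the charted CPU tier.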

  16. #16
    Bah... my 690 isn't on there. Would be nice to see a comparison to the AMD 7970.

  17. #17
    That second graph has no context when they don't mention what the auto-detect settings are for those various video cards. What's the point of showing FPS differences with varying parameters anyway? It would have made more sense to show what resolution and graphics options were used instead of just saying "Auto Detect".

  18. #18
    Quote Originally Posted by jam3s121 View Post
    I don't believe that chart at all.

    I had GTX 560 Ti's in SLI and couldn't maintain 55 FPS with vsync on, yet this says it should be around 58 FPS with just one!

    This was at high settings with FXAA.
    The SLI profile for the game is pretty poor from what I know. You might actually be suffering FPS dips due to bad load balancing between the two cards. In general, NVIDIA releases a properly balanced SLI profile in drivers within a month of a AAA title's release. Their numbers aren't that far off from what I was seeing on max detail (without FXAA) at 1080p.

    The problem is that even if the average is "sort of in the right direction", nobody cares about it when the game runs at near 60 FPS most of the time but dips to a third of that every so often. And the article shows that they know about it.
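The point about dips is exactly why benchmarkers report percentile lows alongside the mean. A minimal sketch with made-up per-second FPS samples shows how a healthy-looking average can hide stutter:

```python
# Made-up log: 57 seconds at 60 fps, plus three dips into the low 20s.
fps_log = [60] * 57 + [20, 21, 22]

# The mean looks fine...
mean_fps = sum(fps_log) / len(fps_log)

# ...but the 1% low (worst ~1% of samples) exposes the dips.
low_1pct = sorted(fps_log)[int(len(fps_log) * 0.01)]

print(f"mean: {mean_fps:.2f} fps, 1% low: {low_1pct} fps")
```

A chart built only from averages would rank this experience close to a locked 60 FPS, even though the dips are what the player actually notices.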

  19. #19
    Herald of the Titans theredviola's Avatar
    15+ Year Old Account
    Join Date
    Jul 2008
    Location
    Arkansas
    Posts
    2,880
    Well that settles that. Guess I'll be getting the AMD 7770 over the GTX 550.
    "Do not only practice your art, but force yourself into its secrets, for it and knowledge can raise men to the divine." -- Ludwig Van Beethoven

  20. #20
    Quote Originally Posted by Lurker87 View Post
    I have an i5 2500k and a 570 and was getting between 30-60 fps on high. I hope it changes some since 30fps drops are really noticeable.
    That's weird; I have the same CPU and GPU, an i5 2500k @ 4.5GHz and a Gainward GTX 570, and I got 80-120 FPS depending on where I was, with the highest possible graphics in the stress test.
    Last edited by Speedlance; 2012-08-22 at 04:17 AM.
