  1. #21
    MonsieuRoberts
    Join Date: May 2010
    Location: Weeping Squares, Vilendra, Solus
    Posts: 6,621
    Quote Originally Posted by Merendel View Post
    My newer computer is running an i5-3570k, 8 GB of DDR3, and a somewhat older Radeon HD 5770. I got to try it out during the most recent stress test, and it handled even a zergfest WvW on max settings at a completely smooth framerate; I ran it at 1920x1080.
    http://www.hwcompare.com/6809/geforc...adeon-hd-5770/
    ⛥⛥⛥⛥⛥ "In short, people are idiots who don't really understand anything." ⛥⛥⛥⛥⛥

  2. #22
    Quote Originally Posted by MonsieuRoberts View Post
    http://www.hwcompare.com/6809/geforc...adeon-hd-5770/
    I'm aware of the difference in the cards. However, at the moment GW2 has a bigger bottleneck in the processor than in the GPU. This is somewhat typical of MMOs in general; GPUs are rarely stressed compared to what you would see in a modern FPS. A boost in CPU will generally result in a higher performance gain than a GPU upgrade will.

    From the beta, there were people with very expensive new video cards getting crap performance because their CPUs were hovering down at the low end of the requirements.

    Who is John Galt?

  3. #23
    Deleted
    Has anybody played the game with a Radeon HD 7770? I was just wondering what the performance is like on medium+ settings.

  4. #24
    Definitely get at least an AMD HD 7850. Your display will most likely be the next upgrade as prices drop... a 6790 won't run 1080p nicely.

  5. #25
    Quote Originally Posted by Merendel View Post
    I'm aware of the difference in the cards. However, at the moment GW2 has a bigger bottleneck in the processor than in the GPU. This is somewhat typical of MMOs in general; GPUs are rarely stressed compared to what you would see in a modern FPS. A boost in CPU will generally result in a higher performance gain than a GPU upgrade will.

    From the beta, there were people with very expensive new video cards getting crap performance because their CPUs were hovering down at the low end of the requirements.
    This guy gets it.

    Upgrading your video card would bring improvements (and you should upgrade it), but the majority of your money should be thrown at a processor.

    Get a 2500k (or a 3570k if your mobo has a BIOS update to handle the new chips [they use the same socket]), then use what money you have left to get a marginal GPU upgrade.

    As a personal anecdote, during BWE2 my CPU was running at 95-99% at all times, but my GPU was barely over 60% (and its core temp underlined that).

    For an MMO (especially GW2), CPU trumps GPU. TL;DR: a $200 CPU upgrade will benefit you more than a $400 GPU upgrade.
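
    One way to check this on your own machine is to log CPU and GPU utilization while playing. A minimal sketch (assuming the third-party psutil and GPUtil packages are installed; GPUtil reads NVIDIA GPUs via nvidia-smi, and any other monitoring tool works just as well):

        import psutil   # pip install psutil
        import GPUtil   # pip install gputil

        # Sample CPU and GPU utilization once per second for a minute while
        # the game runs. Sustained CPU near 100% with the GPU well below
        # capacity suggests a CPU bottleneck, as described above.
        for _ in range(60):
            cpu = psutil.cpu_percent(interval=1)        # percent over the last second
            gpus = GPUtil.getGPUs()
            gpu = gpus[0].load * 100 if gpus else 0.0   # .load is reported as 0..1
            print(f"CPU {cpu:5.1f}%  GPU {gpu:5.1f}%")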
    Last edited by Grraarrgghh; 2012-07-09 at 07:12 PM.
    Corsair 500r - i5-3570k@4.8 - H100i - 580 DirectCUII - Crucial M4
    Lenovo y580 - i7-3630QM - 660M - Crucial M4 mSATA

  6. #26
    Deleted
    He already has a quad-core CPU. A new GPU + PSU would be his best investment.

    Wait two more weeks for Nvidia to reveal the 660, then, depending on price, buy one of those or a 7850, plus a new PSU.

    People were experiencing problems in the beta because the game wasn't optimized.

  7. #27
    Quote Originally Posted by Pyre Fierceshot View Post
    He already has a quad-core CPU. A new GPU + PSU would be his best investment.

    Wait two more weeks for Nvidia to reveal the 660, then, depending on price, buy one of those or a 7850, plus a new PSU.

    People were experiencing problems in the beta because the game wasn't optimized.
    If his motherboard has overclocking functions, a 2500k OC'd to 3.7/3.8 GHz with a stock fan would blow a 2300 (locked, no 'K') away. The 2500k already gives a ~15-20% performance gain in games vs. a 2300; overclocked, expect to see 30-35% gains.
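
    Those figures are roughly what the clock ratios alone predict, if you assume a CPU-bound game scales linearly with core clock (a simplification; both chips share the Sandy Bridge architecture):

        # Back-of-envelope clock-ratio estimate (an assumption, not a benchmark):
        # i5-2300 base 2.8 GHz vs. i5-2500K base 3.3 GHz, overclocked to ~3.8 GHz.
        i5_2300, i5_2500k_stock, i5_2500k_oc = 2.8, 3.3, 3.8
        print(f"stock gain: {i5_2500k_stock / i5_2300 - 1:.0%}")  # ~18%
        print(f"OC gain:    {i5_2500k_oc / i5_2300 - 1:.0%}")     # ~36%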
    Corsair 500r - i5-3570k@4.8 - H100i - 580 DirectCUII - Crucial M4
    Lenovo y580 - i7-3630QM - 660M - Crucial M4 mSATA

  8. #28
    Quote Originally Posted by Grraarrgghh View Post
    If his motherboard has overclocking functions, a 2500k OC'd to 3.7/3.8 GHz with a stock fan would blow a 2300 (locked, no 'K') away. The 2500k already gives a ~15-20% performance gain in games vs. a 2300; overclocked, expect to see 30-35% gains.
    It's a Gateway, which means the BIOS is probably locked, so forget about overclocking. I would get a new PSU and GPU; you should be set for a while.

  9. #29
    Quote Originally Posted by lockedout View Post
    It's a Gateway, which means the BIOS is probably locked, so forget about overclocking. I would get a new PSU and GPU; you should be set for a while.
    Well, it's not for the faint of heart, but there are ways to unlock Gateway BIOSes (though they are risky, of course). So yeah, that's pretty much out. I do stand by my assertion that at best a 2300 may hold back the game (since, again, MMOs are CPU-dependent), and at worst it may do that and bottleneck any new GPU he purchases, making his money-spent-to-output ratio weaker.
    Corsair 500r - i5-3570k@4.8 - H100i - 580 DirectCUII - Crucial M4
    Lenovo y580 - i7-3630QM - 660M - Crucial M4 mSATA

  10. #30
    Deleted
    Quote Originally Posted by Merendel View Post
    From the beta, there were people with very expensive new video cards getting crap performance because their CPUs were hovering down at the low end of the requirements.
    I've been playing the beta for a while. Max settings, 2560x1440 (coupled with a 7970 at 1.1 GHz and a 3570k at 4.4 GHz) - I'm lucky if my average fps stays above 40.

  11. #31
    Quote Originally Posted by Marest View Post
    I've been playing the beta for a while. Max settings, 2560x1440 (coupled with a 7970 at 1.1 GHz and a 3570k at 4.4 GHz) - I'm lucky if my average fps stays above 40.
    2560x1440 is pretty ridiculous, TBH; that's 1,612,800 more pixels than 1080p, and 1,922,400 more than 1680x1050 (the most common screen resolution worldwide ATM) - more than twice the 1680x1050 pixel count, in fact.

    That could have something to do with your avg fps.
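
    A quick sanity check of that arithmetic (a minimal sketch; the totals are just width times height for each mode):

        # Compare total pixel counts across the resolutions discussed above.
        resolutions = {
            "2560x1440": 2560 * 1440,  # 3,686,400 pixels
            "1920x1080": 1920 * 1080,  # 2,073,600 pixels
            "1680x1050": 1680 * 1050,  # 1,764,000 pixels
        }
        qhd = resolutions["2560x1440"]
        for name, pixels in resolutions.items():
            print(f"{name}: {pixels:,} pixels ({qhd - pixels:,} fewer than 2560x1440)")
        # 3,686,400 / 1,764,000 ~= 2.09, i.e. more than twice the 1680x1050 count.
        print(f"ratio vs 1680x1050: {qhd / resolutions['1680x1050']:.2f}")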
    Corsair 500r - i5-3570k@4.8 - H100i - 580 DirectCUII - Crucial M4
    Lenovo y580 - i7-3630QM - 660M - Crucial M4 mSATA

  12. #32
    Quote Originally Posted by Marest View Post
    I've been playing the beta for a while. Max settings, 2560x1440 (coupled with a 7970 at 1.1 GHz and a 3570k at 4.4 GHz) - I'm lucky if my average fps stays above 40.
    Can't speak to that; my monitor doesn't support resolutions that high. A somewhat more reasonable resolution might improve your FPS. As I said, 1920x1080 had little to no FPS issues on the same CPU and a GPU that's not only two generations back but a weaker variant.

    There is also one other issue I see: you've got your CPU overclocked to 4.4 GHz? They put up a specific warning in their known issues that overclocking tended to have a negative effect on performance with the builds from BWE1 and 2. If they still have that warning up for the next event and you still see FPS issues, try returning your clock settings to default.

    Who is John Galt?

  13. #33
    Deleted
    I won't scale down a modern game simply because the developers are too lazy to optimize it. A 3 GB, 384-bit GDDR5-based 7970 should be able to handle that resolution just fine. A ~78% leap in pixel count shouldn't hurt your framerate to that extent - at least it doesn't in any other title I've tried.

    Overclocking having a negative impact makes no sense. Really. Please link to some sources.

    Seems unfair to base a conclusion regarding performance on a beta release anyway.
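
    For what it's worth, the ~78% figure checks out, and it implies a rough bound on the fps cost if (and only if) the game were purely fill-rate bound - a modelling assumption, not a measurement from the thread:

        # Assume fps scales inversely with pixel count (only holds for a purely
        # GPU/fill-rate-bound game; a CPU-bound game barely reacts to resolution).
        pixels_1080p = 1920 * 1080
        pixels_1440p = 2560 * 1440
        print(f"pixel increase: {pixels_1440p / pixels_1080p - 1:.0%}")  # ~78%
        fps_at_1080p = 60  # hypothetical baseline
        print(f"estimated fps at 1440p: {fps_at_1080p * pixels_1080p / pixels_1440p:.0f}")  # ~34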
    Last edited by mmoc7c6c75675f; 2012-07-09 at 08:42 PM.

  14. #34
    Quote Originally Posted by Marest View Post
    I won't scale down a modern game simply because the developers are too lazy to optimize it. A 3 GB, 384-bit GDDR5-based 7970 should be able to handle that resolution just fine. A ~78% leap in pixel count shouldn't hurt your framerate to that extent - at least it doesn't in any other title I've tried.
    It will once you enable things like ambient occlusion, anti-aliasing above 2x MSAA, transparency AA, and soft shadows (i.e. card-dependent renders). TBH, most people I know who run 2560 do so on a CrossFire/SLI setup.

    MMOs are tricky because of the view distance. On a higher-res monitor, your card will be forced to render a shitton more triangles because of the increased horizontal space alone.
    Corsair 500r - i5-3570k@4.8 - H100i - 580 DirectCUII - Crucial M4
    Lenovo y580 - i7-3630QM - 660M - Crucial M4 mSATA

  15. #35
    Quote Originally Posted by MonsieuRoberts View Post
    Gateway DX4850-43c. Windows 7 64-bit, Intel i5 2300, GeForce GT 420 1GB GPU, and 6GB DDR3 RAM. 1600x900.

    WTB 60 FPS in GW2. Show me what I need to buy, oh Gods of MMO-Champ Computer forums.


    If you need any other info, let me know and I'll get back to this thread in about an hour, at work atm.
    If I recall correctly, performance in GW2 has little to do with your graphics card and much more to do with your processing power; that way the game runs well for people regardless of what system they're on. It sucks if you're on a high-end gaming computer with low processing power, but the game wasn't designed to run better just because you have a better graphics card.

  16. #36
    Deleted
    Quote Originally Posted by Grraarrgghh View Post
    It will once you enable things like ambient occlusion, anti-aliasing above 2x MSAA, transparency AA, and soft shadows (i.e. card-dependent renders). TBH, most people I know who run 2560 do so on a CrossFire/SLI setup.

    MMOs are tricky because of the view distance. On a higher-res monitor, your card will be forced to render a shitton more triangles because of the increased horizontal space alone.
    Essentially, no. A 7970 is a powerful enough card to handle most, if not all, of the titles I have installed. The only exception is Battlefield 3, which on hectic 64-player maps needs to be dialed down a few steps. Realistically, the only reason to need CrossFire or SLI (with current high-end cards) would be stereoscopic viewing or a multi-monitor setup.

    I'm running Aion, Diablo III, Max Payne 3, Sniper Elite V2, Endless Space, Dirt 3, Blacklight: Retribution, Ghost Recon: Future Soldier, Crysis 2, WoW, HoN, the Smite beta, and so on - all on maxed-out settings (or close to it) at my native resolution without any issues at all. Actually, the only game I can think of that gives me issues is GW2 - again, a beta.
    Last edited by mmoc7c6c75675f; 2012-07-09 at 08:57 PM.

  17. #37
    Quote Originally Posted by Marest View Post
    I won't scale down a modern game simply because the developers are too lazy to optimize it. A 3 GB, 384-bit GDDR5-based 7970 should be able to handle that resolution just fine. A ~78% leap in pixel count shouldn't hurt your framerate to that extent - at least it doesn't in any other title I've tried.

    Overclocking having a negative impact makes no sense. Really. Please link to some sources.

    Seems unfair to base a conclusion regarding performance on a beta release anyway.
    http://wiki.guildwars2.com/wiki/Beta/BWE2_known_issues

    Specifically:
    Currently, adjusting the core clock speed of the GPU or CPU typically results in lower performance.

    It is advised that you leave core clock speeds at factory defaults.
    They had this on the official forums as well, but those are still down until the next event; for now, the official wiki will have to suffice as a source link. You are right about the beta release, though - the game is not optimized yet, so performance should improve once it is. This is probably also of interest:
    We are currently optimizing Guild Wars 2, making ongoing improvements to the game’s frame rate. Presently, the game is CPU-bound on most high-end systems, so lowering graphics settings or upgrading your video card may not have a large impact on performance.

    Who is John Galt?

  18. #38
    Quote Originally Posted by Marest View Post
    Essentially, no. A 7970 is a powerful enough card to handle most, if not all, of the titles I have installed. The only exception is Battlefield 3, which on hectic 64-player maps needs to be dialed down a few steps. Realistically, the only reason to need CrossFire or SLI (with current high-end cards) would be stereoscopic viewing or a multi-monitor setup.

    I'm running Aion, Diablo III, Max Payne 3, Sniper Elite V2, Endless Space, Dirt 3, Blacklight: Retribution, Ghost Recon: Future Soldier, Crysis 2, WoW, HoN, the Smite beta, and so on - all on maxed-out settings (or close to it) at my native resolution without any issues at all. Actually, the only game I can think of that gives me issues is GW2 - again, a beta.
    Supersampling and 4x+ MSAA add a shitton to your video load, and games with lesser video optimization, like Metro 2033, are a better comparison for GW2 (not for actual graphical fidelity - shit no, GW2 looks like ass in a fidelity comparison).

    Aion is a better comparison because it's a modern high-texture MMO, but it's also nearly 4 goddamn years old.
    Corsair 500r - i5-3570k@4.8 - H100i - 580 DirectCUII - Crucial M4
    Lenovo y580 - i7-3630QM - 660M - Crucial M4 mSATA

  19. #39
    Deleted
    First and foremost, I find it odd that you are defending a game that essentially should run better than it does at such a resolution. I've tried a multitude of different settings and adjustments to no avail, but as I said in a previous post - this is a beta.

    Also, I find it extremely odd that overclocking would cause a negative effect. Essentially, what separates an i5 2400 from an i5 2500 is an overclock; without the extra core speed these two CPUs are the same - they could even come from the same wafer. Even if this is the case, there is no excuse for it.

    Supersampling, MSAA, or any other fancy setting - I never claimed these were the culprit. The beta runs like shit on my system no matter what settings I use, considering the hardware I'm running. At the lowest settings I'm getting around 60-80 frames per second, a number that should be much, much higher. I'm looking forward to the completed game as much as you do, but this is not alright no matter how you flip and turn it.

  20. #40
    The engine in its current state is in dire need of optimisation. Short of standing in an area with a low polygon count, you can't get a constant 60 fps on any hardware setup right now.

    Anybody saying otherwise isn't running a persistent FPS counter on screen, or isn't monitoring it at all times.
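
    That last point matters because averages hide dips; if you log per-frame times, you can report both the mean and the worst 1% instead of eyeballing a counter. A minimal sketch over a hypothetical frame-time log:

        # Summarize per-frame times (milliseconds) so a "constant 60 fps" claim
        # is judged by its worst frames, not just the average.
        frame_times_ms = [16.7, 16.9, 16.5, 33.4, 16.8, 50.1, 16.6, 17.0]  # hypothetical
        avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
        worst = sorted(frame_times_ms, reverse=True)
        n = max(1, len(worst) // 100)                 # size of the worst-1% bucket
        one_percent_low_fps = 1000 / (sum(worst[:n]) / n)
        print(f"average: {avg_fps:.1f} fps, 1% low: {one_percent_low_fps:.1f} fps")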
