"In short, people are idiots who don't really understand anything."
I'm aware of the difference in the cards. However, at the moment GW2 has a bigger bottleneck in the processor than in the GPU. This is fairly typical of MMOs in general: GPUs are rarely stressed compared to what you'd see in a modern FPS. A CPU boost will generally give a bigger performance gain than a GPU upgrade will.
In the beta there were people with very expensive, brand-new video cards getting crap performance because their CPUs were hovering down at the low end of the requirements.
Who is John Galt?
Has anybody played the game with a Radeon HD 7770? I was just wondering what the performance is like on medium+ settings.
Definitely get at least an AMD HD 7850. Your display will most likely be the next upgrade as prices drop. . . a 6790 won't run 1920x1080 nicely.
This guy gets it.
Upgrading your video card would see improvements (and you should upgrade it), but the majority of your money should be thrown at a processor.
Get a 2500k (or a 3570k if your mobo has a BIOS update to handle the new chips [they use the same socket]), then use what money you have left for a marginal GPU upgrade.
As a personal anecdote, during BWE2 my CPU was running at 95%-99% at all times, but my GPU was barely over 60% (and its core temp underlined that).
For an MMO (esp GW2), CPU trumps GPU. TL;DR - A $200 CPU upgrade will benefit you more than a $400 GPU upgrade.
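If you want to sanity-check which side is holding you back from utilization numbers like the ones above, here's a rough sketch (the thresholds and function name are my own, purely illustrative — the key is per-core CPU readings, since a single pegged core can bottleneck a game even when total CPU usage looks low):

```python
def classify_bottleneck(per_core_cpu, gpu_util, hot=90.0):
    """Rough heuristic: likely CPU-bound if at least one core is
    pegged while the GPU has headroom, and vice versa."""
    cpu_pegged = max(per_core_cpu) >= hot
    gpu_pegged = gpu_util >= hot
    if cpu_pegged and not gpu_pegged:
        return "CPU-bound"
    if gpu_pegged and not cpu_pegged:
        return "GPU-bound"
    return "inconclusive"

# Numbers like the BWE2 anecdote above: one core at 97%, GPU at 60%
print(classify_bottleneck([97, 42, 35, 31], 60))  # CPU-bound
```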
Last edited by Grraarrgghh; 2012-07-09 at 07:12 PM.
Corsair 500r - i5-3570k@4.8 - H100i - 580 DirectCUII - Crucial M4 | Lenovo y580 - i7-3630QM - 660M - Crucial M4 mSATA
He already has a quad core CPU. A new GPU + PSU would be his best investment.
Wait 2 more weeks for Nvidia to reveal the 660 and then depending on price buy one of them or a 7850 and a new PSU.
People were experiencing problems in the beta because the game wasn't optimized.
Well, it's not for the faint of heart, but there are ways to unlock Gateway BIOSes (though they are risky, of course). So yeah, that's pretty much out. I do stand by my assertion that a 2300 may hold back the game (since, again, MMOs are CPU-dependent) at best, and at worst may do that and bottleneck any new GPU he purchases, making his money-spent-to-output ratio weaker.
Can't speak to that; my monitor doesn't support resolutions that high. A somewhat more reasonable resolution might improve your FPS. As I said, 1920x1080 had little to no FPS issues on the same CPU and a GPU that's not only two generations back but a weaker variant.
There is also one other issue I see. You've got your CPU overclocked up to 4.4 GHz? They put up a specific warning in their known issues that overclocking tended to have a negative effect on performance with the builds from BWE1 and 2. If they still have that warning up for the next event and you still see FPS issues, try returning your clock settings to default.
I won't scale down a modern game simply because the developers are too lazy to optimize it. A 3GB, 384-bit GDDR5-based 7970 should be able to handle that resolution just fine. A ~78% jump in pixel count shouldn't hurt your framerate to that extent - at least it doesn't in any other title I've tried.
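For what it's worth, the ~78% figure checks out if the jump in question is 1920x1080 to 2560x1440 (my assumption about the two resolutions being compared):

```python
base = 1920 * 1080   # 2,073,600 pixels
high = 2560 * 1440   # 3,686,400 pixels
increase = (high - base) / base
print(f"{increase:.1%} more pixels to render")  # 77.8% more pixels to render
```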
Overclocking having a negative impact makes no sense. Really. Please link to some sources?
Seems unfair to base a conclusion regarding performance on a beta release anyway.
Last edited by mmoc7c6c75675f; 2012-07-09 at 08:42 PM.
It will once you enable things like ambient occlusion, anti-aliasing above 2x MSAA, transparency AA, and soft shadows (i.e. card-dependent renders). TBH, most people I know who run 2560 do so on an XFire/SLI setup.
MMOs are tricky because of the view distance. On a higher-res monitor, your card will be forced to render a shitton more triangles because of the increased horizontal space alone.
If I recall correctly, graphics performance in GW2 has little to do with your graphics card and more to do with your processing power; this way the game works well for people regardless of what system they are on. It sucks if you're on a high-end gaming computer with a weak processor, but the game wasn't designed to run better just because you have a better graphics card.
Essentially, no. A 7970 is a powerful enough card to handle most, if not all titles I have installed. The only exception is Battlefield 3, which on hectic 64-player maps needs to be dialed down a few steps. The only reason to need CrossFire or SLI (with current high-end cards) would be for stereoscopic viewing or multi-monitor setups (realistically).
I'm running Aion, Diablo III, Max Payne 3, Sniper Elite V2, Endless Space, Dirt 3, Blacklight: Retribution, Ghost Recon: Future Soldier, Crysis 2, WoW, HoN, Smite Beta, and so on - all on maxed out settings (or close to) at my native resolution without any issues at all. Actually, the only game I can think of that gives me issues is GW2, again, beta.
Last edited by mmoc7c6c75675f; 2012-07-09 at 08:57 PM.
http://wiki.guildwars2.com/wiki/Beta/BWE2_known_issues
specifically:

"Currently, adjusting the core clock speed of the GPU or CPU typically results in lower performance. It is advised that you leave core clock speeds at factory defaults."

They had this on the official forums as well, but those are still down till the next event. For now the official wiki will have to suffice as a source link. You are right about the beta release, though - the game is not optimized yet, so performance should improve once it is. This is probably also of interest:

"We are currently optimizing Guild Wars 2, making ongoing improvements to the game's frame rate. Presently, the game is CPU-bound on most high-end systems, so lowering graphics settings or upgrading your video card may not have a large impact on performance."
Supersampling and 4x+ MSAA add a shitton to your video load, and games with lesser video optimization like Metro 2033 are a better comparison for GW2 (not for actual graphical fidelity, shit no, GW2 looks like ass in a fidelity comparison).
Aion is a better comparison because it's a modern high-texture MMO, but it's also nearly 4 goddamn years old.
First and foremost, I find it odd that you are defending a game that essentially should run better than it does at such a resolution. I've tried a multitude of different settings and adjustments to no avail, but as I said in a previous post - this is a beta.
Also, I find it extremely odd that overclocking would cause a negative effect. Essentially, the difference between an i5 2400 and an i5 2500 is an overclock; without the extra core speed these two CPUs are the same - they could even come from the same wafer. Even if this is the case, there is no excuse for it.
Supersampling, MSAA, or any other fancy setting - I never claimed these were the culprit. The beta runs like shit on my system no matter what settings I use, considering the hardware I'm running. At the lowest settings I'm getting around 60-80 frames per second, a number that should be much, much higher. I'm looking forward to the completed game as much as you do, but this is not alright no matter how you flip and turn it.
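To put those 60-80 fps in frame-time terms (my framing, not anything from the thread): even at minimum settings the engine is spending 12.5-16.7 ms per frame, which suggests the time is going to CPU-side work that graphics options don't touch.

```python
def frame_time_ms(fps: float) -> float:
    """Convert frames per second to milliseconds spent per frame."""
    return 1000.0 / fps

# The 60-80 fps range reported above, as a per-frame budget:
print(f"{frame_time_ms(60):.1f} ms at 60 fps, {frame_time_ms(80):.1f} ms at 80 fps")
```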
The engine in its current state is in dire need of optimisation. Short of being in an area with a low polygon count, you can't get a constant 60 fps on any hardware setup right now.
Anybody saying otherwise really isn't running a constant FPS counter at the top, nor are they monitoring it at all times.
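If you do log your frame rate instead of eyeballing the counter, one common way to quantify "constant" FPS is to average the worst 1% of samples rather than quote the number it shows most of the time (a quick sketch; the helper name is just illustrative):

```python
def one_percent_low(fps_samples):
    """Average of the worst 1% of FPS samples; stutters that an
    eyeballed counter misses drag this number down."""
    n = max(1, len(fps_samples) // 100)
    worst = sorted(fps_samples)[:n]
    return sum(worst) / n

# A log that reads "60 fps" almost all the time but stutters once:
samples = [60] * 99 + [30]
print(one_percent_low(samples))  # 30.0
```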