As you can probably tell from the title, I'm in the process of upgrading my GPU. And from glancing over this post you can probably tell I'm one of those overthinkers who takes a simple enough concept and blows it out of proportion with added layers of complexity. If it makes you more comfortable, I'm pretty confident I'll go for the 770; I'm just looking for input in case I missed something (as much as one strives for omniscience, it's a fool's quest).
I had expected to keep my trusty MSI 560 Ti for another year and then upgrade to the 800-series, but this autumn is loaded with titles I want to experience without having to play on low settings. I'm especially looking forward to the new Dragon Age title, as well as Mass Effect 4 whenever it's released.
Both titles will use the Frostbite 3 engine, which brings my current 560 Ti to its knees (googling a random bench puts it at 16 fps in BF4 at Ultra with 4xMSAA/HBAO), while the 770 gets a playable 44 fps. That bench illustrates my pondering quite nicely, since the 760 gets 32 fps, which isn't enough for a competitive FPS but to me is fully acceptable in a single-player RPG. I've also included the 280X since it's a bit cheaper than the 770 but with seemingly equal or greater performance (I suspect due to Mantle and the 3 GB of VRAM?).
I've checked up on DirectX 12, and it'll be backwards compatible with current Nvidia cards, for whatever that's worth. I'm unsure what I'd be missing if I'm rocking an AMD card at that point. I guess it's a question of Mantle now versus DirectX 12 a year and a half from now?
Vegas 12 render times are somewhat important to me too. Not something I'm using today, but I don't want to close that door. AnandTech reports the 280X being more than twice as fast as the 770, and I can't find anything obviously wrong with their bench. It also reports the 770 being roughly twice as fast as the 560 Ti, despite me having read that the 600- and 700-series are inferior to the higher end of the 500-series (starting with the 560 Ti) when it comes to Vegas 12 encoding speeds. Can anyone shed some light on this?
Also, I was wondering: some bench suites include performance per dollar and per watt. The P/D obviously reflects the market, but the P/W is the one that got me wondering. I'm assuming they just take the highest recorded fps and divide it by the card's wattage, or something in that fashion. My question is, for instance: if my 560 Ti is working at 100% and achieves 50 fps, surely it draws more power than a 770 working at ~70% because of a 60 fps vsync cap? Bench suites don't seem to take vsync into account; they just mash together the 100% performance and draw. Or am I mistaken? To me it just doesn't seem to reflect the real world the way they do the comparisons. I mean, you'd find me playing with vsync on in a competitive game over my dead body, but when you're pushing 900 fps in a Tower Defense game on a 60 Hz monitor, surely I'm not the only one who'd rather have lower power draw, a cooler gaming room, and less humming than those extra fps past 1k?
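To put my assumption in code, here's a back-of-the-envelope sketch (Python; every fps/wattage figure and the linear power model are made up by me for illustration, not measured from any real card) of how the flat peak-fps-per-watt metric compares with a vsync-capped scenario:

```python
# Back-of-the-envelope fps-per-watt math with made-up numbers
# (illustrative only; real GPU power draw doesn't scale this linearly).

def bench_perf_per_watt(peak_fps, board_watts):
    """The metric I assume bench suites use: peak fps / full-load wattage."""
    return peak_fps / board_watts

def vsync_perf_per_watt(uncapped_fps, full_load_watts, idle_watts=30, cap=60):
    """Crude vsync model: the card only works as hard as the cap requires,
    assuming power scales linearly with utilization between idle and full load."""
    utilization = min(1.0, cap / uncapped_fps)
    watts = idle_watts + utilization * (full_load_watts - idle_watts)
    return min(uncapped_fps, cap) / watts

# Hypothetical "old" card: 50 fps flat out at 170 W.
# Hypothetical "new" card: 85 fps flat out at 230 W.
print(bench_perf_per_watt(50, 170))   # ~0.29 fps/W -- what the suites would report
print(bench_perf_per_watt(85, 230))   # ~0.37 fps/W
print(vsync_perf_per_watt(50, 170))   # ~0.29 fps/W -- still pegged at 100%, can't reach 60
print(vsync_perf_per_watt(85, 230))   # ~0.35 fps/W -- holds 60 fps at ~70% load
```

Even with a model this crude, the uncapped and the vsync-capped numbers tell different stories, which is exactly what a single flat P/W figure glosses over.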
So, to summarize:
*If it matters, it'll be an MSI Twin Frozr edition (I've already checked availability), regardless of which of the three it ends up being.
*I'm gaming at 1080p, and I don't intend to go higher until my next upgrade.
*This is a single-player eye-candy upgrade; when I play multiplayer it's competitive, and I'd play at 240p if it gave me an advantage.
*Budget is... well, I'd prefer the MSI TF 770 or cheaper. But if I can pay 20% more for a 20% upgrade, I won't hesitate. Or 50% more for a 50% upgrade, and so on, and so forth.
Thanks for reading it ALL! Any input / thoughts / comments are greatly appreciated!
TL;DR:
Read it all.