  1. #1

    GTX 670 Gigabyte 3X to GTX 770 Gigabyte 3X (both factory OC'd)

    Think it's worth it? I bought my 670 almost exactly a year ago, but it might be a good purchase if I want to play BF4, Witcher 3, and possibly Watch Dogs at 1080p with 60 fps, you know?

    -shrug-

    Rest of my rig is

    i5-3570K
    8GB DDR3
    650W Corsair TX series

    That PSU is such a boss for the price and wattage.

    Argh. Buy the 770 or buy an Android tablet... D:

    PS: Waiting for AMD's 8xxx series is out of the question. I don't use AMD hardware.

  2. #2
    No, it's a rip-off card if you upgrade from a 670.

  3. #3
    Misunderstood; never mind my previous post. But if you really want to upgrade, I'd suggest picking up a second 670 at some point (a used one would probably be better, as new it's pretty much the worst price/performance card ever).
    Last edited by PBitt; 2013-07-22 at 02:46 AM.

  4. #4
    Quote Originally Posted by PBitt View Post
    Um... three 670s in SLI should be more than enough to handle those games on Ultra and then some; in fact, two of them is probably enough, though I'm not sure how well some games scale in 3-way SLI. And if you're going to spend that kind of money anyway, I'm pretty sure two 780s would give better performance and longevity than three 770s. In pretty much every scenario (and some scale better than others, of course), the third card doesn't add as much of a percentage performance increase as the second one. If you really want new cards, I think 780s in SLI would be more than enough, since you should still be able to get some money for the three old cards by selling them on eBay. And I don't think the 770 is a big enough upgrade from the 670 anyway.
    I think by "3X" he means the number of fans the Windforce cooler has.

  5. #5
    I'm quite sure he wouldn't be running tri-SLI on a 650W PSU either, lol. I would not upgrade from a 670 to a 770; you will hardly see a difference. Going to a 780 would be a big difference, but not a 770.

  6. #6
    Moved to the sub-forum.
     

  7. #7
    I figured as much. I guess I COULD wait and see what AMD's 8970 GHz edition is like down the road... I just hate AMD so much. I love how Nvidia always has drivers out for new games, while AMD goes months without releasing anything. I don't know if they've stepped up their game, because I went from a 4890 to a 6870 with absolutely SHITASTIC driver support the entire couple of years.

    Can anyone tell me whether AMD has stepped up their game, or are they as bad as ever? I don't want this to be a flame war; it's a legitimate question, and I left them for solid reasons.

    Edit: Never mind... I just saw that their latest stable driver is from May.

    Guess I'll save up a little more and go for a 780.

  8. #8
    Quote Originally Posted by Safetytorch View Post
    I figured as much. I guess I COULD wait and see what AMD's 8970 GHz edition is like down the road... I just hate AMD so much. I love how Nvidia always has drivers out for new games, while AMD goes months without releasing anything. I don't know if they've stepped up their game, because I went from a 4890 to a 6870 with absolutely SHITASTIC driver support the entire couple of years.

    Can anyone tell me whether AMD has stepped up their game, or are they as bad as ever? I don't want this to be a flame war; it's a legitimate question, and I left them for solid reasons.

    Edit: Never mind... I just saw that their latest stable driver is from May.

    Guess I'll save up a little more and go for a 780.
    I've actually never had problems with AMD drivers. These last couple of Nvidia drivers, though... ugh. BF3 was unplayable on my brand-new 780 until 320.49 because of heavy artifacting.


  9. #9
    I have yet to have issues with AMD drivers, and the beta drivers are rock solid. The only time I've had an issue was with my HTPC running Ubuntu (on a Richland APU), but that was mainly down to some weird resolution scaling on the TV, and a quick search found another driver that resolved it. Personally, a 770 isn't enough of an upgrade to bother with unless you step up to a 780 and overclock (the 770 has very little overclocking headroom).
    Personal rig:
    • i5-3570K (4.2GHz) || CM Hyper 212 EVO || ASRock Extreme4 || Corsair 2x4GB 1600MHz RAM
    • Samsung 840 (120GB) || WD Blue 1TB || WD Green 1TB
    • PowerColor 7870 XT || SilverStone Strider 500W || NZXT Source 210

  10. #10
    Never had issues with my 7970; solid performance in all games. The 780, however, was hell for me: I had to wait for new drivers just to play games at decent frame rates and without bad flickering. I think the shitty Nvidia drivers may have damaged my card a little...

    AMD will get optimizations for almost all upcoming games, since developers will target the consoles first and those all run AMD hardware. I'm switching to the 9970, which according to a reliable rumor should provide 15% higher performance than the Titan. You should wait and see what AMD brings, IMO. Just don't get a 770; you won't notice any real improvement. Just overclock your current card if you aren't happy with the performance.
    8700K (5GHz) - Z370 M5 - Mugen 5 - 16GB Tridentz 3200MHz - GTX 1070Ti Strix - NZXT S340E - Dell 24" 1440p (165Hz)

  11. #11
    Quote Originally Posted by Toffie View Post
    I'm switching to the 9970, which according to a reliable rumor should provide 15% higher performance than the Titan.
    I would love to see that rumor, because I doubt it. Last I read, it's still GCN on 28 nm, so I'm expecting an increase, but not by that much. I hope so, though.

  12. #12
    Quote Originally Posted by Zeara View Post
    I would love to see that rumor, because I doubt it. Last I read, it's still GCN on 28 nm, so I'm expecting an increase, but not by that much. I hope so, though.
    I read the rumor on Chiphell, where every rumor so far has turned out to be true.
    The card has been sent out for testing, so specs and performance numbers should be out in a couple of weeks.
    8700K (5GHz) - Z370 M5 - Mugen 5 - 16GB Tridentz 3200MHz - GTX 1070Ti Strix - NZXT S340E - Dell 24" 1440p (165Hz)

  13. #13
    Nvidia has stuff like TXAA as well, which is the best AA I've ever seen; it makes AC3 look like sex. Their constant driver support for new games is a big seller for me. Let's get back on topic, though; I don't want to talk about AMD and get a flame war started. The only thing that would get me to switch is if their 8970 (their x970 is usually priced between Nvidia's x70 and x80) is a sizable jump over the GTX 780.

    I guess I'll judge when I get BF4. If I'm sitting in the low 50s on ultra settings (I get 60-70 in BF3 at 1080p ultra on 64-player servers), then I'll probably get the 770 or 780, depending on how much money I have at the time. I must have 60 fps with the eye candy on in FPS games...

    Inb4 somebody wants to correct me with "you can't tell the difference between 30 and 60" or "the eye can only see 24": please see http://boallen.com/fps-compare.html and https://frames-per-second.appspot.com/
    I need to post these every time I talk about fps or somebody brings it up.
    Last edited by Laeryn; 2013-07-23 at 01:55 AM.

  14. #14
    1) A GTX 770 would be a minuscule upgrade and a complete waste of money, since the GTX 770 is basically identical to the GTX 680 with higher memory frequencies.

    2) BF4 will be an AMD title; it will be very tightly optimised for AMD cards and developed on them.

    3) Witcher 3 isn't out yet, but it will be developed for the next-gen consoles, which all run AMD hardware. Witcher 2 ran better on AMD.

    4) Your aversion to AMD is confusing; it looks like you've simply decided to hate them. Why? I've had a worse experience with Nvidia drivers on any single card I've owned than I've had in total on the AMD side. I'm currently using a GTX 680.
     

  15. #15
    Quote Originally Posted by tetrisGOAT View Post
    1) A GTX 770 would be a minuscule upgrade and a complete waste of money, since the GTX 770 is basically identical to the GTX 680 with higher memory frequencies.

    2) BF4 will be an AMD title; it will be very tightly optimised for AMD cards and developed on them.

    3) Witcher 3 isn't out yet, but it will be developed for the next-gen consoles, which all run AMD hardware. Witcher 2 ran better on AMD.

    4) Your aversion to AMD is confusing; it looks like you've simply decided to hate them. Why? I've had a worse experience with Nvidia drivers on any single card I've owned than I've had in total on the AMD side. I'm currently using a GTX 680.
    We've seen that with Far Cry 3 as well; just because they're involved with a title doesn't mean it will run better for sure.
    As for the OP: why isn't SLI with a second 670 an option?
    Also, on a side note, how about overclocking? (That's an easy 5-15% performance gain.)

  16. #16
    Overclocking video cards and I have a bad history. I suppose I could just do it; it's not like I can't afford a new card if I fry this one. I've just never been able to find a sweet spot: I always bump it up a bit, test for 30 minutes, bump it again, test again, and I always end up with artifacting and crashes. Put me on a CPU and I can make it sing, but GPUs and I don't get along in that regard. And I don't really want to buy a new PSU and completely rewire my rig if I don't have to; I doubt my 650W Corsair PSU could handle two 670s? It might be pushing it a bit.
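
    To make that loop concrete, here's a minimal sketch of the bump-and-test routine. The two helper functions are hypothetical placeholders, not a real Afterburner API; in practice you'd set the offset by hand in your OC tool and run an actual 30-minute stability test:

    # Sketch of the "bump, test for 30 minutes, bump again" loop described above.
    STEP_MHZ = 13          # raise the core clock offset in small increments
    MAX_OFFSET_MHZ = 200   # stop probing past this point

    def apply_core_offset(offset_mhz: int) -> None:
        # Placeholder: apply the offset with your overclocking tool of choice.
        print(f"Applying +{offset_mhz} MHz core offset")

    def run_stress_test(minutes: int) -> bool:
        # Placeholder: in a real run, return True only if the test finished
        # with no artifacting and no crashes.
        return True

    def find_stable_offset() -> int:
        stable = 0
        candidate = STEP_MHZ
        while candidate <= MAX_OFFSET_MHZ:
            apply_core_offset(candidate)
            if not run_stress_test(30):
                apply_core_offset(stable)  # fall back to the last known-good offset
                break
            stable = candidate
            candidate += STEP_MHZ
        return stable

    print(f"Highest stable offset found: +{find_stable_offset()} MHz")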

    Just because AMD's stamp is on it doesn't mean it'll run better, man. Look at any bench: Frostbite runs better on Nvidia cards by a pretty large margin, especially when MSAA is involved. Frostbite 3.0 is the SAME engine with some destruction and lighting improvements; building a new engine from the ground up and tilting its performance towards AMD is not what happened here.

    It's all experience. I've never had bad Nvidia drivers. I remember being on my AMD 4890 when Bad Company 2 came out: there was an issue with AMD cards that made map loading take a ludicrous amount of time, and it took them something like six months to fix it. I shit you not. Nvidia would have been on top of that right away. Like I said, this isn't blind fanboy hate; I choose my hardware based on experience, and I've always had great luck with Nvidia, just like I only buy Gigabyte hardware when possible because it's been so good to me. Besides, hopefully more games in the future will have TXAA, the Nvidia-exclusive anti-aliasing, which is by far the best I've ever seen.

    So what are people using to overclock GPUs these days? Still MSI Afterburner?
    Last edited by Laeryn; 2013-07-23 at 06:11 PM.

  17. #17
    Quote Originally Posted by Safetytorch View Post
    Besides, hopefully more games in the future will have TXAA, the Nvidia-exclusive anti-aliasing, which is by far the best I've ever seen.
    HardOCP's Crysis 3 settings analysis shows that there is a large performance penalty for using TXAA and its image quality is subpar.

    After trying out TXAA in AC3, I agree with HardOCP's analysis: TXAA makes everything blurry.

    TXAA is a flop, but Nvidia does seem to spearhead a lot of AA technologies (FXAA, CSAA).

    CSAA looks amazing (1:15)

    Quote Originally Posted by Safetytorch View Post
    So what are people using to overclock GPUs these days? Still MSI Afterburner?
    EVGA Precision has a nice UI. MSI Afterburner's time charts can be found in GPU-Z.
    Last edited by yurano; 2013-07-23 at 06:39 PM.

  18. #18
    Quote Originally Posted by Safetytorch View Post
    Just because AMD's stamp is on it doesn't mean it'll run better, man. Look at any bench: Frostbite runs better on Nvidia cards by a pretty large margin, especially when MSAA is involved. Frostbite 3.0 is the SAME engine with some destruction and lighting improvements; building a new engine from the ground up and tilting its performance towards AMD is not what happened here.
    I'm not sure what you're on about, but I'd say they are pretty even. The 670 and 7970: even. The 7970 GHz and 680/770: pretty much even. Only the 760 pulls ahead a bit more compared to the 7950 and 660 Ti.

    http://www.techpowerup.com/reviews/M..._Gaming/8.html

    I don't care if you don't like AMD, but Nvidia isn't pulling ahead of AMD in BF3.

  19. #19
    Quote Originally Posted by Safetytorch View Post
    Overclocking video cards and I have a bad history. I suppose I could just do it; it's not like I can't afford a new card if I fry this one. I've just never been able to find a sweet spot: I always bump it up a bit, test for 30 minutes, bump it again, test again, and I always end up with artifacting and crashes. Put me on a CPU and I can make it sing, but GPUs and I don't get along in that regard. And I don't really want to buy a new PSU and completely rewire my rig if I don't have to; I doubt my 650W Corsair PSU could handle two 670s? It might be pushing it a bit.

    Just because AMD's stamp is on it doesn't mean it'll run better, man. Look at any bench: Frostbite runs better on Nvidia cards by a pretty large margin, especially when MSAA is involved. Frostbite 3.0 is the SAME engine with some destruction and lighting improvements; building a new engine from the ground up and tilting its performance towards AMD is not what happened here.

    It's all experience. I've never had bad Nvidia drivers. I remember being on my AMD 4890 when Bad Company 2 came out: there was an issue with AMD cards that made map loading take a ludicrous amount of time, and it took them something like six months to fix it. I shit you not. Nvidia would have been on top of that right away. Like I said, this isn't blind fanboy hate; I choose my hardware based on experience, and I've always had great luck with Nvidia, just like I only buy Gigabyte hardware when possible because it's been so good to me. Besides, hopefully more games in the future will have TXAA, the Nvidia-exclusive anti-aliasing, which is by far the best I've ever seen.

    So what are people using to overclock GPUs these days? Still MSI Afterburner?
    Well, you can't fry a card with the voltages locked, and yes, a 650W unit is more than enough to handle two 670s in SLI.

  20. #20
    The reason a lot of people haven't seen many AMD driver issues lately is that AMD decided a while back to stop doing MONTHLY driver updates and move to 3-6 month intervals, meaning that rather than firing out test builds nonstop, they are releasing optimized software. Simply put, by slowing down they greatly improved.

    - - - Updated - - -

    Quote Originally Posted by Faithh View Post
    Well, you can't fry a card with the voltages locked, and yes, a 650W unit is more than enough to handle two 670s in SLI.
    That's pushing the envelope really, really tight. Two 670s pull over 340W by themselves, and the motherboard and CPU push you to just under 500W; factor in every last piece that draws power and you end up with something like 65W of headroom before overclocking anything.

    So actually, NO, you don't have enough for two 670s unless you aren't overclocking, at which point I'd wonder why you even bothered to buy a second GPU.
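
    For anyone who wants to sanity-check that arithmetic, here's a rough back-of-envelope sketch in Python. The wattage figures are assumptions (reference board power for the 670 plus rough allowances for the CPU, board, and drives), not measurements, so the margin shifts with whatever peak-draw numbers you plug in:

    # Rough PSU headroom estimate for a 2x GTX 670 SLI build on a 650W unit.
    # All wattages are assumed ballpark load figures, not measured values.
    PSU_WATTS = 650

    draws = {
        "GTX 670 #1": 170,          # reference board power
        "GTX 670 #2": 170,
        "i5-3570K under load": 90,  # above its 77W TDP to allow for a mild OC
        "motherboard + RAM": 50,
        "drives + fans": 30,
    }

    total = sum(draws.values())
    print(f"Estimated load: {total}W")
    print(f"Headroom on a {PSU_WATTS}W PSU: {PSU_WATTS - total}W")

    Plug in higher peak or overclocked draws and that headroom shrinks quickly, which is the point being made above.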
