  1. #1
    Immortal Zandalarian Paladin's Avatar
    10+ Year Old Account
    Join Date
    Aug 2009
    Location
    Saurfang is the True Horde.
    Posts
    7,936

    Looking for a high-performance Nvidia graphics card

    Good evening guys and gals,

    I bought a gaming PC last August. Up to now I've been highly satisfied, but I can't help feeling my computer isn't as powerful as it could be. However, I am not a hardware professional (far, very far from it), and I'm more comfortable asking you guys here, since you spend much more time looking at this stuff than I do. Here are the specs I believe you'll need:

    Processor:
    3.50 GHz Intel Core i7-3770K
    64-bit ready
    Multi-core (4 total)
    Hyper-threaded (8 total)

    Main Circuit Board (Motherboard):
    Board: ASRock Z77 Extreme4
    Bus Clock: 100 MHz
    BIOS: American Megatrends Inc. P1.40 05/14/2012

    Graphics Card:
    NVIDIA GeForce GTX 670 Gigabyte OC 2 GB GDDR5 PCI-Express
    From what I remember from when I bought my computer, it is SLI-ready. Since I have such an option, I'm rather sure I'd be dumb not to use it. I did a bit of searching, and I know that my second graphics card must have the same GPU (I think that's the GTX part, so a GT won't work) and the same amount of memory (2 GB GDDR5 here). I heard the new GeForce generation (the 700 series) was being released soon, and I was expecting to spend roughly $500 on my second card.

    Which card would work best in my computer for $500? Should I wait a few months for the 700 series to be released, or will those cards be completely overpriced?
    Last edited by Zandalarian Paladin; 2013-02-16 at 11:11 PM.

  2. #2
    The Patient
    10+ Year Old Account
    Join Date
    Feb 2011
    Location
    Denmark
    Posts
    239
    Have you read up on micro-stuttering? I think you should do that first. I've never tried SLI myself, but from what I've heard, there's a chance you might be sensitive to micro-stuttering, which shows up as small lags.
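    If you want to check for it yourself, per-frame render times make it visible: the average FPS can look fine while individual frames spike. Below is a minimal Python sketch, assuming a list of per-frame times in milliseconds exported from a frame-time logger such as FRAPS (the trace numbers are made up for illustration):
    Code:
    # Rough micro-stutter check on a list of per-frame times in milliseconds.
    # The trace is assumed to come from a frame-time logger such as FRAPS;
    # the numbers below are invented for illustration.
    def stutter_report(frame_times_ms):
        n = len(frame_times_ms)
        avg = sum(frame_times_ms) / n
        ordered = sorted(frame_times_ms)
        median = ordered[n // 2]
        p99 = ordered[min(n - 1, int(n * 0.99))]  # ~99th percentile frame
        spikes = sum(1 for t in frame_times_ms if t > 2 * median)
        print(f"average frame time: {avg:.1f} ms (~{1000 / avg:.0f} FPS)")
        print(f"99th percentile frame time: {p99:.1f} ms")
        print(f"frames more than twice the median: {spikes} of {n}")

    # Mostly ~60 FPS, with two long frames -- those are the stutters you feel.
    stutter_report([16.7, 16.9, 16.5, 45.0, 16.8, 16.6, 44.2, 16.7])
    The average here still looks like ~42 FPS, but the two 44-45 ms frames are what you'd actually notice.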

  3. #3
    Deleted
    In what aspect do you find your current card lacking? (I can run BF3 at full ultra at pretty much 60 FPS with my 670.)
    Have you also tried overclocking it?

    edit: micro-stutter is a lot less of an issue with Nvidia cards compared to AMD ones.

  4. #4
    Mechagnome Wolfbear's Avatar
    10+ Year Old Account
    Join Date
    Jan 2012
    Location
    Scotland. Walking distance from England.
    Posts
    693
    When it says processor, it means you can only use another GTX 670. You cannot SLI a 670 and a different card.

  5. #5
    Deleted
    Quote Originally Posted by Wolfbear View Post
    When it says processor, it means you can only use another GTX 670. You cannot SLI a 670 and a different card.
    Well, it is possible to SLI two different cards (one to offload PhysX to). Is it practical? Hell no.
    Also, can you give details on the PSU if you intend to go SLI?

  6. #6
    How are you unsatisfied with performance? If you, say, close all browsers, do you notice any difference?
     

  7. #7
    Deleted
    Quote Originally Posted by shroudster View Post
    Well, it is possible to SLI two different cards (one to offload PhysX to). Is it practical? Hell no.
    That's not SLI, that's just running two cards: one doing the regular work, the other PhysX.

  8. #8
    Deleted
    Quote Originally Posted by Zeara View Post
    That's not SLI, that's just running two cards: one doing the regular work, the other PhysX.
    Well, if memory serves me right, it was possible with certain cards, but then the higher card would be downclocked to the specs of the lower card. (I think that was the case with CrossFire rather than SLI, though the same concept might still apply.)
    I'm still more curious about specifics from the OP on how he finds his current performance lacking.

  9. #9
    Bloodsail Admiral Killora's Avatar
    10+ Year Old Account
    Join Date
    Oct 2010
    Location
    BFE, Montana
    Posts
    1,105
    Quote Originally Posted by shroudster View Post
    Well, if memory serves me right, it was possible with certain cards, but then the higher card would be downclocked to the specs of the lower card. (I think that was the case with CrossFire rather than SLI, though the same concept might still apply.)
    I'm still more curious about specifics from the OP on how he finds his current performance lacking.
    Both SLI and CrossFire use both cards for graphics calculations. You can't tell them to use one for PhysX and one for normal calculations.

  10. #10
    Immortal Zandalarian Paladin's Avatar
    10+ Year Old Account
    Join Date
    Aug 2009
    Location
    Saurfang is the True Horde.
    Posts
    7,936
    Quote Originally Posted by Wolfbear View Post
    When it says processor, it means you can only use another GTX 670. You cannot SLI a 670 and a different card.
    Are you sure about that? When I went to the GeForce SLI FAQ, it said that "an XXXGT cannot be paired with a XXXGTX in an SLI configuration", but nothing about needing the same GTX or the same GT.
    Quote Originally Posted by shroudster View Post
    well it is possible to SLI two different cards. (one to off-load physx to) is it practical , hell no.
    also details on the PSU if you intend to go SLI?
    My PSU (I think that means power supply?) is an NZXT 750W.

    Quote Originally Posted by tetrisGOAT View Post
    How are you unsatisfied with performance? If you, say, close all browsers, do you notice any difference?
    Basically, the problems I have are not huge. They are mainly a few frame drops when there's a lot of stuff on my screen (only happening in GW2 and The Witcher 2). Seeing how these games are getting older, I can't help feeling I might have a hard time with all the new next-gen games coming soon. I also do a LOT of 3D work (scenes with over 4 million polys in 3ds Max, or over 10 million in ZBrush), and I must say the lag is not really the most enjoyable part of creation. I know the graphics card might not be the only problem there, though, so it's not the main reason I want to upgrade it.

  11. #11
    Frame drops are expected. I experience them with the GTX 680 as well, which is the best single-GPU card Nvidia has to offer at this time.

    However, the drops are likely to increase, not decrease, with SLI: the average frame rate is much higher, but with more fluctuation in between.
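    To put toy numbers on that (both frame-time traces below are invented, purely to illustrate frame pacing): a setup can average more FPS and still feel worse, because what you notice is the slowest frames, not the average.
    Code:
    # Toy comparison: a higher average FPS does not guarantee smoother play.
    # Both frame-time traces are invented, only meant to illustrate pacing.
    single = [20.0] * 8                                       # steady 20 ms per frame
    dual = [12.0, 22.0, 12.0, 23.0, 11.0, 24.0, 12.0, 22.0]   # faster on average, but uneven

    for name, trace in [("one card", single), ("two cards", dual)]:
        avg_fps = 1000 * len(trace) / sum(trace)
        print(f"{name}: average {avg_fps:.0f} FPS, worst frame {max(trace):.0f} ms")
    The two-card trace averages ~58 FPS versus 50, yet its worst frames are slower, which is roughly what micro-stutter feels like.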
     

  12. #12
    The Lightbringer Toffie's Avatar
    10+ Year Old Account
    Join Date
    Oct 2009
    Location
    Denmark
    Posts
    3,858
    Quote Originally Posted by Freedom4u2 View Post
    Are you sure about that? When I went to the GeForce SLI FAQ, it said that "an XXXGT cannot be paired with a XXXGTX in an SLI configuration", but nothing about needing the same GTX or the same GT.

    My PSU (I think that means power supply?) is an NZXT 750W.


    Basically, the problems I have are not huge. They are mainly a few frame drops when there's a lot of stuff on my screen (only happening in GW2 and The Witcher 2). Seeing how these games are getting older, I can't help feeling I might have a hard time with all the new next-gen games coming soon. I also do a LOT of 3D work (scenes with over 4 million polys in 3ds Max, or over 10 million in ZBrush), and I must say the lag is not really the most enjoyable part of creation. I know the graphics card might not be the only problem there, though, so it's not the main reason I want to upgrade it.
    The FPS drops in GW2 are the CPU; getting a better card won't help much at all. For The Witcher 2, I never go below 70 FPS with my GTX 680 - have you got ubersampling on? No system can run that shit.
    8700K (5GHz) - Z370 M5 - Mugen 5 - 16GB Tridentz 3200MHz - GTX 1070Ti Strix - NZXT S340E - Dell 24" 1440p (165Hz)

  13. #13
    Immortal Zandalarian Paladin's Avatar
    10+ Year Old Account
    Join Date
    Aug 2009
    Location
    Saurfang is the True Horde.
    Posts
    7,936
    Quote Originally Posted by tetrisGOAT View Post
    Frame drops are expected. I experience them with the GTX 680 as well, which is the best nVidia has to offer in single-GPU at this time.

    However, this is likely to increase, not decrease with SLI. Average much higher, with more fluctuating results inbetween.
    OK, well, it seems pretty much unanimous that SLI is not the way to go. I am curious, though, just for my own knowledge: when would SLI be good?

    Quote Originally Posted by Toffie View Post
    The FPS drops in GW2 are the CPU; getting a better card won't help much at all. For The Witcher 2, I never go below 70 FPS with my GTX 680 - have you got ubersampling on? No system can run that shit.
    I've got everything on, including ubersampling, with the GTX 670, and it usually runs smoothly (around 25 FPS), but when a lot of things come on screen it's hell.

  14. #14
    Bloodsail Admiral Killora's Avatar
    10+ Year Old Account
    Join Date
    Oct 2010
    Location
    BFE, Montana
    Posts
    1,105
    Quote Originally Posted by Freedom4u2 View Post
    OK, well, it seems pretty much unanimous that SLI is not the way to go. I am curious, though, just for my own knowledge: when would SLI be good?
    SLI isn't bad per se; it's just a nightmare to keep working, and not all games properly support SLI/CF.

    Though to answer your question, Eyefinity/Surround multi-monitor setups are where it's mostly needed.

  15. #15
    Deleted
    GW2 has massive issues which both AMD and Nvidia are trying to resolve. If you read the countless posts in other forums about GW2, you'll understand it's the game, not your hardware, that is the issue. I'd certainly not use GW2 as any sensible GPU benchmark. It's an MMO anyway, and MMOs are notorious for running poorly even on high-end hardware.

    The Witcher 2 is a very demanding DX9 game and will cripple any system with Uber Sampling enabled. I hope you haven't enabled that option? Even without Uber Sampling it's a ridiculously demanding game, but with your card you should be seeing above 60 FPS, provided you're playing at a resolution no higher than 1920x1080 or 1920x1200.

    In terms of SLI, you'll need to buy the same card. Even with SLI you'll have a massive headache if you're playing recently released games. However, Nvidia's 'GeForce Experience' beta looks promising. It's a utility that should have been implemented years ago: it checks your configuration, compares it to others with the same hardware, then downloads the optimal drivers and recommends optimal settings for the games you play. Pretty amazing.
    Last edited by mmoc7f933b7749; 2013-02-17 at 12:48 AM.

  16. #16
    Quote Originally Posted by Zeara View Post
    That's not SLI, that's just running two cards: one doing the regular work, the other PhysX.
    And you can still SLI/CrossFire with different cards, even mixing Nvidia and AMD, but... you need a chip called Lucid Hydra on the motherboard, and I don't think you gain any benefit from using it over traditional SLI/CrossFire.

    But it does exist.
    My first build:
    Storage: Kingston SSDNow V200+ 120 GB and WD Caviar Blue 500 GB HDD
    Processing Units: i5-3570K @ 3.8 GHz cooled by a 212 Evo, and MSI Twin Frozr III R7850 @ 900/1200
    Mobo, RAM, PSU: Gigabyte Z77-D3H and G.Skill Ripjaws 2x 4 GB with an XFX 550W
    If I'm unreadable, it's not because I hate grammar, it's because I'm French-Canadian

  17. #17
    Immortal Zandalarian Paladin's Avatar
    10+ Year Old Account
    Join Date
    Aug 2009
    Location
    Saurfang is the True Horde.
    Posts
    7,936
    Quote Originally Posted by Drudgery View Post
    GW2 has massive issues which both AMD and Nvidia are trying to resolve. If you read the countless posts in other forums about GW2, you'll understand it's the game, not your hardware, that is the issue. I'd certainly not use GW2 as any sensible GPU benchmark. It's an MMO anyway, and MMOs are notorious for running poorly even on high-end hardware.

    The Witcher 2 is a very demanding DX9 game and will cripple any system with Uber Sampling enabled. I hope you haven't enabled that option? Even without Uber Sampling it's a ridiculously demanding game, but with your card you should be seeing above 60 FPS, provided you're playing at a resolution no higher than 1920x1080 or 1920x1200.

    In terms of SLI, you'll need to buy the same card. Even with SLI you'll have a massive headache if you're playing recently released games. However, Nvidia's 'GeForce Experience' beta looks promising. It's a utility that should have been implemented years ago: it checks your configuration, compares it to others with the same hardware, then downloads the optimal drivers and recommends optimal settings for the games you play. Pretty amazing.
    Your post was incredibly helpful, thanks! So I guess the best option would be to wait until the 700 series is out and go for a GTX 780 for next-gen games - or should I just stick with my 670?

  18. #18
    Bloodsail Admiral Killora's Avatar
    10+ Year Old Account
    Join Date
    Oct 2010
    Location
    BFE, Montana
    Posts
    1,105
    Quote Originally Posted by Freedom4u2 View Post
    Your post was incredibly helpful, thanks! So I guess the best option would be to wait until the 700 series is out and go for a GTX 780 for next-gen games - or should I just stick with my 670?
    Your 670 should power on for a while yet.

  19. #19
    Quote Originally Posted by Freedom4u2 View Post
    Your post was incredibly helpful, thanks! So I guess the best option would be to wait until the 700 series is out and go for a GTX 780 for next-gen games - or should I just stick with my 670?
    Definitely stick with it. If you insist on staying with nVidia, you cannot comfortably improve at this point in time.
     

  20. #20
    Deleted
    Yeah, keep your 670. It's a great card. You could wait for the new generation of cards to come out, but that won't be any time soon, since Nvidia is most likely focusing on the mobile market - that's where the money is. Kepler's power efficiency and performance were a step in the right direction in that regard.

    On Monday we'll see the paper launch of Nvidia's new flagship card, codenamed Titan. That should cause quite a stir; I personally can't wait. On Tuesday we'll probably see some reviews hitting the web, and some retailers may get cards to sell. I know Ebuyer received around 10-20 690 GPUs when they were first launched, and their stock hovered around 3-5 units for about two weeks. In the US, however, they pretty much sold out straight away. Our American cousins are numerous and richer than us, it seems.
    Last edited by mmoc7f933b7749; 2013-02-17 at 12:49 AM.
