  1. #441
    Deleted
    Quote Originally Posted by Zeara View Post
Prob, ah well :P Decided to check the idle clocks on my GPU, it goes down to 50 MHz xD So the "3DMark isn't ready yet" theory sounds most plausible.
That's also a stage of the low-power-consumption mode. When the computer is used even less, it drops all the way down to 50 MHz.

  2. #442
    3Dmark indeed. I'm pretty sure you won't achieve that score using anything less than ~1050-1100 MHz.

  3. #443
    Deleted
Futuremark has a habit of showing whatever clocks it feels like. I have 1050/1250 @ 1.268v on my HD 6870, but it always shows the core clock @ 300. I don't actually recall it ever showing the right clocks, other than on some old CPUs.

  4. #444
    Quote Originally Posted by Xuvial View Post
    ...what? Are you implying that nVidia have a more powerful single GPU up their sleeve, i.e. the actual GTX680?

    What's going on here :S
    They are very unlikely to release it.

    They just rebranded the GTX 670 Ti to GTX 680 when they realised it beat the HD7970 in gaming, and focused resources elsewhere.
     

  5. #445
    Brewmaster Fierae's Avatar
    10+ Year Old Account
    Join Date
    Jul 2010
    Location
    Northampton, UK
    Posts
    1,331
    Quote Originally Posted by Drunkenvalley View Post
    705 MHz core clock? O_o
    Quote Originally Posted by verba View Post
Why the fuck would you downclock your GTX680 to 700 MHz?
    Quote Originally Posted by Drunkenvalley View Post
    I'm pretty sure it's wildly incorrect, but that's still a weird clock number it's showing.
    ^ This.

It's currently at stock volts, but at 1200 MHz.
    Digital Rumination
    Plays: Sylvanas EU - Fierae (Druid) | HotS | EVE | PUBG
    Played: Rift | Guild Wars 2 | SW:TOR | BF4 | Smite | LoL | Skyrim
    Ryzen 1920X - 32GB - 980Ti SLI - PCIE NVMe 1GB SSD - Enthoo Primo - Full WC - 4K

  6. #446
    Deleted
    Quote Originally Posted by tetrisGOAT View Post
    They are very unlikely to release it.

    They just rebranded the GTX 670 Ti to GTX 680 when they realised it beat the HD7970 in gaming, and focused resources elsewhere.
AFAIK they have/had production issues with the original 680 (GK100, I think, or GK110); either way, they can't release anything else right now. And looking at the specs of the 680, it isn't a high-end card (regarding bus width etc.).

Although it should be regarded as a high-end card now, because of the position it's been put in.

  7. #447
    Quote Originally Posted by tetrisGOAT View Post
    They just rebranded the GTX 670 Ti to GTX 680 when they realised it beat the HD7970 in gaming, and focused resources elsewhere.
    "Beat" is quite generous really. Most gameplay have them floating within margin-of-error of each other, particularly when both are overclocked. (As the HD 7970 is both easier and better at overclocking so far, it seems.)

  8. #448
    Scarab Lord Wries's Avatar
    10+ Year Old Account
    Join Date
    Jul 2009
    Location
    Stockholm, Sweden
    Posts
    4,127
    Quote Originally Posted by Drunkenvalley View Post
    "Beat" is quite generous really. Most gameplay have them floating within margin-of-error of each other, particularly when both are overclocked. (As the HD 7970 is both easier and better at overclocking so far, it seems.)
It isn't really within margin of error when all the hardware sites' independent tests show it edging out the 7970.

Sweclockers commented that they felt the 4-phase VRM on the reference GTX 680 could've been the thing holding it back. We might see some great OC results with a better PCB design in the future.

  9. #449
    Quote Originally Posted by Drunkenvalley View Post
    "Beat" is quite generous really. Most gameplay have them floating within margin-of-error of each other, particularly when both are overclocked. (As the HD 7970 is both easier and better at overclocking so far, it seems.)
    Generous? Perhaps. Still not untrue.
It is consistently a top contender, and in many cases it produces a bit more oomph than the HD 7970, even at 2560x1600.
The only thing that keeps CrossFire HD 7970s ahead of SLI GTX 680s at those resolutions is that the cards are close enough and CrossFire scales better.
And this is with very, very poor and flaky drivers from the nVidia camp.

    I also think both companies have stronger single-GPU solutions on their workbenches that we might see, within the same generations.
    (my speculation: HD7990, and the dual GPU solution being the HD7970x2)
     

  10. #450
    Quote Originally Posted by Wries View Post
all the hardware sites' independent tests show it edging out the 7970.
Edges out (in a variety of things, anyway) at stock, no less, sure. And they both scale quite nicely with overclocks, it appears, but once you've overclocked them to their utmost they're really close to one another, both in MHz and in performance. With some varied results there, of course: neither the HD 7970 nor the GTX 680 is guaranteed to overclock well, so I'm sure we'll see varied results depending on the OC each site could push.

My point is that I don't find the GTX 680 getting ahead of the HD 7970 to the point where there's a contest. Drawing a winner for top performer is, to me, a rather moot idea this generation. They're quite blatantly equal in that regard, in my eyes. But then you begin adding price tags and various driver issues, etc., and things change.

Personally, I'm piss-tired of the random driver crashes the newer nVidia drivers brought me on the GTX 470 (i.e. the card I had before the HD 7970). And the new power connectors honestly quite weird me out. Unattractive as all hell. Surprisingly, I find that a real buzzkill, particularly since I hope to jump over to a Z77 Sabertooth and a watercooled DCII card in the near future, then hopefully get cables sleeved, etc. But that's a bit irrelevant.

  11. #451
Honestly, they are both so close it really doesn't matter which you pick, imo. From all I have seen so far, both cards are just above the GTX 580 and roughly equal to each other.

  12. #452
    Quote Originally Posted by Gsara View Post
Honestly, they are both so close it really doesn't matter which you pick, imo. From all I have seen so far, both cards are just above the GTX 580 and roughly equal to each other.
They're a fair bit ahead of the GTX 580, although they're not running circles around it.

  13. #453
    The Unstoppable Force DeltrusDisc's Avatar
    10+ Year Old Account
    Join Date
    Aug 2009
    Location
    Illinois, USA
    Posts
    20,098
    Quote Originally Posted by Drunkenvalley View Post
They're a fair bit ahead of the GTX 580, although they're not running circles around it.
They may not be running circles around it, but for someone like me, who is looking on and pondering an upgrade at some point, perhaps to a pair of 670 Tis in SLI depending on how they turn out, it is nice to see how low their power consumption has been tending to be. My 660 watt PSU should be enough by the looks of it if I wanted to make this upgrade. ^_^
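For what it's worth, the 660 W claim can be sanity-checked with a back-of-the-envelope sketch. The figures below are assumptions: ~195 W per card (the GTX 680's published TDP; a hypothetical 670 Ti would presumably draw less) and ~150 W for the CPU, board, and drives.

```python
# Rough PSU headroom check for a two-card SLI build.
# Assumed figures, not measurements: ~195 W per card (GTX 680 TDP;
# a 670 Ti would presumably be lower) and ~150 W for the rest.
CARD_TDP_W = 195
NUM_CARDS = 2
REST_OF_SYSTEM_W = 150
PSU_W = 660

estimated_draw = CARD_TDP_W * NUM_CARDS + REST_OF_SYSTEM_W
headroom = PSU_W - estimated_draw
print(f"Estimated draw: {estimated_draw} W, headroom: {headroom} W")
```

On those assumptions the build draws ~540 W, leaving ~120 W of headroom, so the 660 W unit does look plausible, though with little margin for heavy overclocking.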
    "A flower.
    Yes. Upon your return, I will gift you a beautiful flower."

    "Remember. Remember... that we once lived..."

    Quote Originally Posted by mmocd061d7bab8 View Post
    yeh but lava is just very hot water

  14. #454
    Warchief sizzlinsauce's Avatar
    10+ Year Old Account
    Join Date
    May 2010
    Location
    Bellforest, Tower state
    Posts
    2,188
    Quote Originally Posted by tetrisGOAT View Post
    Generous? Perhaps. Still not untrue.
It is consistently a top contender, and in many cases it produces a bit more oomph than the HD 7970, even at 2560x1600.
The only thing that keeps CrossFire HD 7970s ahead of SLI GTX 680s at those resolutions is that the cards are close enough and CrossFire scales better.
And this is with very, very poor and flaky drivers from the nVidia camp.

    I also think both companies have stronger single-GPU solutions on their workbenches that we might see, within the same generations.
    (my speculation: HD7990, and the dual GPU solution being the HD7970x2)
dwadafuck, a faster single card than the 680 or 7970? Yea... 99% of MMO-Champ will be skipping those cards, seeing as they would be near $750-$1k.

And more for those in AU/EU.

  15. #455
    Epic!
    15+ Year Old Account
    Join Date
    Mar 2009
    Location
    Hillsborough, CA
    Posts
    1,745
    I think it's very likely the HD 7990 and/or the rumored GTX 690 will be in the $849 range given the cost of 28nm production being passed down to TSMC customers.
    Last edited by kidsafe; 2012-03-27 at 05:19 AM.

  16. #456
    Quote Originally Posted by kidsafe View Post
I think it's very likely the HD 7990 and/or the rumored GTX 690 will be in the $849 range given the cost of 28nm production being passed down to TSMC customers.
    The jump to 28nm just got more disappointing.

  17. #457
    Quote Originally Posted by kidsafe View Post
I think it's very likely the HD 7990 and/or the rumored GTX 690 will be in the $849 range given the cost of 28nm production being passed down to TSMC customers.
And 6 months ago AMD said the cost went down with the die shrink... go figure why it costs 500 bucks for something that should be around 300-350, and then they lower the cost of the 6xxx series *shrug*

  18. #458
    Epic!
    15+ Year Old Account
    Join Date
    Mar 2009
    Location
    Hillsborough, CA
    Posts
    1,745
TSMC's 28nm process is, for the most part, still pretty unreliable, so a lot of silicon from each 300mm wafer ends up scrapped. In addition, ATI and Nvidia now have to compete for wafers with major ARM SoC designers like Qualcomm.
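The die-size angle is easy to illustrate with the standard dies-per-wafer approximation and a simple Poisson yield model. The defect density below is purely illustrative, not TSMC's actual figure; the ~294 mm^2 GK104 area is the published number, and 550 mm^2 stands in for a hypothetical ">500mm^2" big chip.

```python
import math

def gross_dies(wafer_d_mm=300, die_area_mm2=550):
    # Standard approximation: wafer area / die area, minus an
    # edge-loss correction for partial dies around the rim.
    r = wafer_d_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_d_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_cm2=0.5):
    # Poisson model: fraction of dies with zero fatal defects.
    # 0.5 defects/cm^2 is an assumed, illustrative defect density.
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

big = gross_dies(die_area_mm2=550)    # hypothetical >500mm^2 "BigK"-class die
small = gross_dies(die_area_mm2=294)  # GK104 is ~294 mm^2
print(big, small)                     # gross dies per 300mm wafer
print(f"{poisson_yield(550):.0%} vs {poisson_yield(294):.0%}")
```

Even before yield, the big die gets roughly half as many candidates per wafer, and at any fixed defect density its yield fraction is much worse, which is the whole economic argument against a huge consumer chip on an immature process.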

  19. #459
    Deleted
Lol, did anyone see the Colorful 680?
Two 8-pin power connectors, lol.

    Last edited by mmoc2be3b3a67c; 2012-03-27 at 12:21 PM.

  20. #460
    Epic!
    15+ Year Old Account
    Join Date
    Mar 2009
    Location
    Hillsborough, CA
    Posts
    1,745
What is that, a BIOS selector or a reset button? 8+8 PEG and a billion power phases are entirely unnecessary for the GTX 680 unless you're the LN2/benchmarking type.

Also, I wonder if it's even certain BigK will make it into a consumer product. If Nvidia feels GK104 is good enough, then GK110 could potentially be a Tesla-only product. Right now the situation with TSMC stinks, and there's no way a >500mm^2 chip will contribute significantly to Nvidia's bottom line.
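For scale, the in-spec power budgets work out like this, using the PCI-SIG per-connector limits (75 W from the slot, 75 W per 6-pin, 150 W per 8-pin):

```python
# In-spec PCIe board power budget: 75 W from the slot,
# 75 W per 6-pin plug, 150 W per 8-pin plug.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def board_power_limit(six_pins=0, eight_pins=0):
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

reference = board_power_limit(six_pins=2)   # reference GTX 680: 6+6 pin
colorful = board_power_limit(eight_pins=2)  # the Colorful board: 8+8 pin
print(reference, colorful)
```

That's 225 W for the reference 6+6 layout versus 375 W for 8+8, well beyond what a ~195 W TDP card needs outside of extreme overclocking.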
    Last edited by kidsafe; 2012-03-27 at 12:37 PM.
