  1. #21
    Epic!
    15+ Year Old Account
    Join Date
    Mar 2009
    Location
    Hillsborough, CA
    Posts
    1,745
    Quote Originally Posted by DeltrusDisc View Post
    Your stuff certainly looks maxed via MSI Afterburner. ;p
    I'm just saying someone else could probably post a 1300MHz HD 7970 score with ease.

  2. #22
    The Unstoppable Force DeltrusDisc's Avatar
    10+ Year Old Account
    Join Date
    Aug 2009
    Location
    Illinois, USA
    Posts
    20,086
    Oh I know. ^_< I saw the overclocks some people got when the NDA lifted. ;p
    "A flower.
    Yes. Upon your return, I will gift you a beautiful flower."

    "Remember. Remember... that we once lived..."

    Quote Originally Posted by mmocd061d7bab8 View Post
    yeh but lava is just very hot water

  3. #23
    I am Murloc! Fuzzykins's Avatar
    10+ Year Old Account
    Join Date
    Feb 2011
    Location
    South Korea
    Posts
    5,222
    So... why do we double the memory clock? It's listed as 2200MHz, why not just enter that? I realize it's DDR2, so in reality it is doubled, but if it's listed on EVERY source as 2200MHz, why not just enter that?

    I'm not asking why the clock is doubled, I'm asking why this is a necessary step. (And the reason I said it was GPU-killing is that I was under the assumption you listed it from Afterburner.)

  4. #24
    The Unstoppable Force DeltrusDisc's Avatar
    10+ Year Old Account
    Join Date
    Aug 2009
    Location
    Illinois, USA
    Posts
    20,086
    If it honestly becomes an issue I will put numbers like 2200 in the future.

  5. #25
    I am Murloc! Fuzzykins's Avatar
    10+ Year Old Account
    Join Date
    Feb 2011
    Location
    South Korea
    Posts
    5,222
    Quote Originally Posted by Ghâzh View Post
    Memory Clock OC: 2200MHz
    Either way, I'll be snagging first in 560ti when I devote some more time to it.

  6. #26
    The Unstoppable Force DeltrusDisc's Avatar
    10+ Year Old Account
    Join Date
    Aug 2009
    Location
    Illinois, USA
    Posts
    20,086
    Quote Originally Posted by Fuzzykins View Post
    Either way, I'll be snagging first in 560ti when I devote some more time to it.
    I will destroy you!

  7. #27
    Scarab Lord Wries's Avatar
    10+ Year Old Account
    Join Date
    Jul 2009
    Location
    Stockholm, Sweden
    Posts
    4,127
    Quote Originally Posted by Fuzzykins View Post
    So... why do we double the memory clock? It's listed as 2200MHz, why not just enter that? I realize it's DDR2, so in reality it is doubled, but if it's listed on EVERY source as 2200MHz, why not just enter that?
    It's my understanding that the actual clock of, say, "4000MHz" GDDR5 is 1000MHz; it's quad data rate. The reason it reads 2000 in MSI AB etc. is a lingering imprint from when graphics cards had DDR3, which was dual data rate.
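    To put the data-rate point above in concrete terms, here's a small sketch of the effective-vs-actual clock arithmetic. The multipliers per memory type are an assumption based on the double/quad data rates described in this thread, not an authoritative table:

```python
# Effective ("advertised") clock vs. actual clock, per memory type.
# The multiplier is the number of transfers per clock cycle.
DATA_RATE = {
    "DDR3": 2,   # double data rate: 2 transfers per clock
    "GDDR5": 4,  # quad data rate: 4 transfers per clock
}

def actual_clock(effective_mhz: float, mem_type: str) -> float:
    """Actual memory clock given the advertised effective clock."""
    return effective_mhz / DATA_RATE[mem_type]

def effective_clock(actual_mhz: float, mem_type: str) -> float:
    """Advertised effective clock given the actual memory clock."""
    return actual_mhz * DATA_RATE[mem_type]

# A GDDR5 card advertised at 4008MHz actually clocks its memory at 1002MHz;
# tools that report the doubled (DDR-style) figure would show 2004MHz.
print(actual_clock(4008, "GDDR5"))      # 1002.0
print(actual_clock(4008, "GDDR5") * 2)  # 2004.0
```

    This also explains the 560 Ti numbers elsewhere in the thread: 2200MHz entered as a "doubled" figure corresponds to a 1100MHz actual clock, or 4400MHz effective on GDDR5.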

  8. #28
    Pit Lord Ghâzh's Avatar
    15+ Year Old Account
    Join Date
    Mar 2009
    Location
    Helsinki, Finland
    Posts
    2,329
    Quote Originally Posted by Fuzzykins View Post
    Either way, I'll be snagging first in 560ti when I devote some more time to it.
    I didn't really put much thought into the memory frequency. Nor did I milk the absolute best core clock either, as I could still comfortably raise the voltage to 1.15 if needed. Weirdly though, the GPU was behaving quite strangely at different voltage points.

    About the time investment required for the benchmarks: sure, you can run them all in under 15 minutes, but to seriously get the best out of your card takes hours of fiddling and fine tuning. Especially when GPUs are so sensitive to even slight changes in voltage and frequency. I think I spent a good 2-3 hours on mine.

    Quote Originally Posted by Fuzzykins
    So... why do we double the memory clock? It's listed as 2200MHz, why not just enter that? I realize it's DDR2, so in reality it is doubled, but if it's listed on EVERY source as 2200MHz, why not just enter that?
    The original system specs of the reference 560 Ti list the memory clock at 4008MHz. Like Wries said, it's GDDR5 memory, so the "real" clock would be 1002MHz, which is quadrupled.
    Last edited by Ghâzh; 2012-02-14 at 01:27 PM.

  9. #29
    Herald of the Titans Will's Avatar
    10+ Year Old Account
    Join Date
    May 2010
    Posts
    2,675
    Great thread for those of you who wish to achieve three things in one sitting:

    a) subscribe to the "e-peenish" 'bigger numbers are better' attitude.
    b) shorten the life of your card, and possibly void any associated warranties, too.
    c) gain performance that you probably do not need. Some of you benefit from an overclock, but most people don't actually need to bother, yet do it anyway because bigger = better, right?

    Course, if we're talking about an older card, then fine, fine. Cheaper than buying a new one. Never have been a fan of the additional ruddy noise, though.

    * I'd like to take the time to mention that one does best not interfering with how others choose to use their parts.
    To use the horrid, tired car analogies again: it's like not wanting a BMW/Mercedes/Audi just because a KIA will take you there as well, for less.
    I'd also like to point out that the ePeen card is close to being considered trolling.
    Last edited by BicycleMafioso; 2012-02-14 at 02:05 PM.

  10. #30
    The Unstoppable Force DeltrusDisc's Avatar
    10+ Year Old Account
    Join Date
    Aug 2009
    Location
    Illinois, USA
    Posts
    20,086
    Quote Originally Posted by Ghâzh View Post
    I didn't really put much thought into the memory frequency. Nor did I milk the absolute best core clock either, as I could still comfortably raise the voltage to 1.15 if needed. Weirdly though, the GPU was behaving quite strangely at different voltage points.

    About the time investment required for the benchmarks: sure, you can run them all in under 15 minutes, but to seriously get the best out of your card takes hours of fiddling and fine tuning. Especially when GPUs are so sensitive to even slight changes in voltage and frequency. I think I spent a good 2-3 hours on mine.
    Need I remind you how long it takes to successfully tweak a CPU's OC not on auto? And then the testing of it is considerably longer, typically, than the testing of a GPU. :P
    Last edited by BicycleMafioso; 2012-02-14 at 02:01 PM.

  11. #31
    Pit Lord Ghâzh's Avatar
    15+ Year Old Account
    Join Date
    Mar 2009
    Location
    Helsinki, Finland
    Posts
    2,329
    It can be time consuming if you make it so. It's just that with CPUs you're working with bigger numbers and a wider margin. A 10MHz difference in GPU clocks is a bigger deal than with a CPU, where you usually take it to the next even hundred, like from 4700 to 4800MHz. And GPUs just are more sensitive to overclocking; sometimes it feels like a wind from the wrong direction can make a perfectly stable overclock unstable.

  12. #32
    Quote Originally Posted by Will View Post
    Great thread for those of you who wish to achieve three things in one sitting:
    I don't see the problem with that. I mean, I purchased the HD 7970. At that point I'm being ridiculous anyway.

    That said, I'm doing some touchups on my overclock. I can't max out even ol' AMD Overdrive, which was a bit of a buzzkill. I'd almost be considering returning the card for a more ridiculous one if it wasn't still a pretty darn beastly overclock imo.

  13. #33
    Quote Originally Posted by Drunkenvalley View Post
    I can't max out even ol' AMD Overdrive, which was a bit of a buzzkill. I'd almost be considering returning the card for a more ridiculous one if it wasn't still a pretty darn beastly overclock imo.
    Hmm, that's weird; there really shouldn't be a problem with any card hitting 1125/1575 when you max the power slider to +20% in Overdrive.

    Maybe you just got one of the ones that has a really high ASIC % but a low stock voltage like 1.12v, since mine is at 1.17v.
    3930K . RIVE . 2x 7970 . Cosmos II . AX1200 . 3x Dell U2312hm
    Computer Setup / DH / huntard

  14. #34
    My "ASIC quality" reads a whopping 58.7% on GPU-Z.

  15. #35
    Oh then you probably got RNG'd =/

    I thought mine was low at 69% but it's been OC'ing pretty well so far.

  16. #36
    Deleted
    GPU OC, eh... I'm not gonna be lazy and enter my pre-OC'd Hawk, that'd be lame. Good luck to the OCers, may the best win ^^

  17. #37
    Pit Lord Ghâzh's Avatar
    15+ Year Old Account
    Join Date
    Mar 2009
    Location
    Helsinki, Finland
    Posts
    2,329
    The Hawk is only 950 core and 2100 memory. There's plenty of room! Should be a good platform to start with, as they're supposed to be better binned.

  18. #38
    Epic!
    15+ Year Old Account
    Join Date
    Mar 2009
    Location
    Hillsborough, CA
    Posts
    1,745
    Quote Originally Posted by Kristie View Post
    Oh then you probably got RNG'd =/

    I thought mine was low at 69% but it's been OC'ing pretty well so far.
    I have one of those 1.112V cards. It just seems to be luck of the draw whether they OC well or not... at least on air.

    It's rumored the 7970s have OVP set around VID+0.125V. I went past that on the Afterburner slider, but unless you have the ASUS 7970 BIOS, you don't get the actual voltage reading (w/ droop) of your 7970. I used the ASUS BIOS for a while, but it has a bug with PowerPlay...
    Last edited by kidsafe; 2012-02-15 at 12:31 AM.
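    For illustration only: taking the rumored VID+0.125V OVP figure above at face value, the ceiling works out like this. This is forum hearsay rather than an AMD spec, and the VID values below are examples, not measurements:

```python
# Rumored 7970 over-voltage protection: trips around VID + 0.125V.
# Hearsay from the thread, not a documented spec; VIDs here are examples.
OVP_MARGIN = 0.125

def ovp_ceiling(vid_volts: float) -> float:
    """Approximate voltage at which OVP would kick in for a given VID."""
    return round(vid_volts + OVP_MARGIN, 3)

print(ovp_ceiling(1.112))  # 1.237 -- a low-VID card like the one above
print(ovp_ceiling(1.175))  # 1.3
```

    If the rumor holds, a low-VID card hits its protection ceiling at a lower absolute voltage, which would be one more reason low-VID samples behave differently when pushed.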

  19. #39
    Here are my settings currently; I'll update with a benchmark if you guys would like.



    setup
    i7 2600k 4.4 atm
    7970 max overclock
    asus maximus iv extreme z
    8 gigs gskill 1600mhz
    custom water on cpu only for now till new project is completed
    seasonic 1250 gold psu
    haf x case again for now till new project is completed


    *edit* My apologies, just re-read the first post. Will update with screenshots of benchmarks etc. tomorrow.
    Last edited by Flcrewpolo; 2012-02-15 at 04:33 AM.

  20. #40
    Deleted
    Name: Notarget
    GPU Brand Name: AMD
    GPU: Radeon HD 6850
    Core Clock OC: 940MHz
    Memory Clock OC: 4600MHz (1150MHz)
    Voltage: 1.172
    Cooler: ASUS DirectCU
    Link to 3DMark score: http://3dmark.com/3dm11/2763820
    Link to SS of Unigine Heaven 2.5 or later results: http://i.imgur.com/pnT8l.jpg
    Link to SS of Desktop: http://i.imgur.com/RTIDk.jpg

    ---------- Post added 2012-02-15 at 08:39 AM ----------

    Quote Originally Posted by Ghâzh View Post
    Name: Gházh
    Link to SS of Desktop: http://i.imgur.com/wvUTH.jpg
    Time to upgrade from the 1680x1050? I'm just teasing...
