1. #1

    Real-world effects of overclocking?

    Say you overclock your CPU by 20%. You don't gain 20% more FPS in actual real-world scenarios like gaming, do you? Sure, your scores in benchmark programs go up a lot.

    But say a 2500k @ stock settings coupled with a 580 @ stock settings, or something similar, plays all games at max settings @ 1080p and always maintains at least 60 FPS. What's the point of stressing the CPU by overclocking it, other than to see your results in those benchmarking programs jump up a few hundred or thousand marks?

    Has overclocking ever actually been the solution to an FPS issue? In reality, how much are you actually gaining from the OC for gaming? And is it really worth adding the stress to the CPU for those few extra frames (if your frames even go higher)?

    Are there any benchmarks people have done with, say, a 2500k or 2600k at stock settings, recording the average FPS in a game, then overclocking the CPU and benchmarking again to compare the average FPS?

    I'd really like to see real results for gaming when it comes to overclocking. If the difference in FPS for a 2500k @ 3.3 GHz compared to 4.3 GHz is something like 2-3 or 5 frames... isn't that slightly pointless?
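    For what it's worth, the comparison is easy to script once you have per-frame timing logs. Here's a minimal sketch, assuming you've recorded cumulative frame timestamps (in milliseconds, one row per frame, FRAPS-style) into CSV files for a stock run and an overclocked run; the file names and column layout are hypothetical, so adjust them to whatever your capture tool actually writes:

    Code:
    import csv

    def fps_stats(path):
        """Average and minimum FPS from a CSV of cumulative frame timestamps (ms)."""
        with open(path, newline="") as f:
            reader = csv.reader(f)
            next(reader)  # assume a single header row
            times = [float(row[1]) for row in reader]  # assume column 1 = timestamp (ms)
        deltas = [b - a for a, b in zip(times, times[1:])]  # time between frames
        fps = [1000.0 / d for d in deltas if d > 0]         # per-frame FPS
        return sum(fps) / len(fps), min(fps)

    # Hypothetical logs from two otherwise identical benchmark passes.
    for label, path in [("2500k @ 3.3GHz", "frametimes_stock.csv"),
                        ("2500k @ 4.3GHz", "frametimes_oc.csv")]:
        avg, low = fps_stats(path)
        print(f"{label}: avg {avg:.1f} fps, min {low:.1f} fps")

    Run the same benchmark pass at both clock speeds and the two printed lines give you exactly the comparison I'm after.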

  2. #2
    I bought an i7 2600k not too long ago, but I'm running it at stock clock speeds, simply because the CPU is not the bottleneck. Both the 2500k and 2600k are more than enough to run the programs out now; overclocking will only improve performance once the CPU becomes a bottleneck. At the moment it mainly produces more heat, a bigger electric bill, and something to brag about. Practical applications like video encoding can definitely make use of the extra speed, so there are workloads that benefit; you just won't see spectacular improvements in games.

  3. #3
    It depends on the game. Some games (such as WoW) are more dependent on CPU power than GPU power, while others, like the recent Battlefield games, scale well with both CPU and GPU power. And if you capture video of your gameplay, that puts a decent load on your CPU on top of whatever the game uses.

    There's also hardware longevity to consider. Not "you're running it hotter/faster than it should, so you're shortening its lifespan" but "will it still run games in X years?" As long as you don't go crazy with voltages, the reduction in hardware lifespan shouldn't be significant enough to force a replacement before the part is completely obsolete anyway.

  4. #4
    Has anyone done any benchmarking tests in actual games (not just synthetic scoring programs) for average FPS at stock and at overclocked speeds on the 2500k/2600k?

    I'd really like to see the real increase in performance in real tests (other than some guy on a forum saying 'I CAN SEE THE HUGE INCREASE HURRRRRRRRRRRRR').

  5. #5
    Djinni
    Quote Originally Posted by Muni View Post
    Has anyone done any benchmarking tests in actual games (not just synthetic scoring programs) for average FPS at stock and at overclocked speeds on the 2500k/2600k?

    I'd really like to see the real increase in performance in real tests (other than some guy on a forum saying 'I CAN SEE THE HUGE INCREASE HURRRRRRRRRRRRR').
    There's very little point, because given a decent GPU (an Nvidia GTX 560 Ti/570/580, say) you're going to hit your screen's maximum framerate (its refresh rate) in most games anyway, regardless of your OC.

    Assuming you want to measure the real-world effect... there is none, because if your screen can only show 60/120 frames per second, anything over that is wasted (i.e. has no real-world effect).
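    A toy sketch of that point, just to make the arithmetic concrete (the numbers are purely illustrative):

    Code:
    # Frames rendered beyond the screen's refresh rate are never displayed,
    # so the FPS you actually see is capped at the refresh rate.
    def displayed_fps(rendered_fps, refresh_hz=60):
        return min(rendered_fps, refresh_hz)

    for rendered in (45, 60, 90, 140):
        print(rendered, "->", displayed_fps(rendered))  # 45, 60, 60, 60

    Past that cap, the extra CPU headroom from an OC buys you nothing on screen.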

    I can run some tests when I get back (maybe in 4 or 5 hours) if you really want... but like I said above, stock settings on both of those CPUs are plenty for most games.

    As pak52b said:
    Quote Originally Posted by pak52b View Post
    Practical applications like video encoding can definitely make use of the extra speed, so there are workloads that benefit; you just won't see spectacular improvements in games.
    And your comparison should include more than games alone... a higher clock speed means you can run more operations in a shorter time, effectively increasing your multitasking ability. (Meaning your whole system becomes more responsive, not just your games.)

  6. #6
    Quote Originally Posted by Muni View Post
    Has anyone done any benchmarking tests in actual games (not just synthetic scoring programs) for average FPS at stock and at overclocked speeds on the 2500k/2600k?

    I'd really like to see the real increase in performance in real tests (other than some guy on a forum saying 'I CAN SEE THE HUGE INCREASE HURRRRRRRRRRRRR').
    Hurrrr all you want, still does not mean you're right.

    Minimum FPS in WoW depends on the CPU, and when the game is CPU-bound, overclocking by 20% raises your minimum FPS by roughly 20%. Maximum and average FPS are interesting only for people staring at benchmark charts; minimum FPS makes all the difference between playable and unplayable when the game lags the most.
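    Taking that scaling at face value, the back-of-the-envelope maths for Muni's 3.3 GHz vs 4.3 GHz example looks like this (the starting minimum FPS is made up, and the linear scaling only holds while the game is fully CPU-bound):

    Code:
    stock_clock, oc_clock = 3.3, 4.3   # GHz
    min_fps_stock = 40                 # illustrative CPU-bound low point
    scale = oc_clock / stock_clock     # ~1.30, i.e. a ~30% clock increase
    print(f"expected min fps after OC: {min_fps_stock * scale:.0f}")  # ~52

    So in a CPU-bound raid or capital city, 40 fps lows could become roughly 52 fps lows, which is a difference you'd actually feel.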

  7. #7
    Sephiracle
    I don't think you'll find anyone who tests exactly what you want. Very few people overclock a processor just for games; they do it for an overall performance increase, which is why those synthetic benchmarks are still important.

    WoW and other MMOs happen to lean heavily on a solid, fast processor, so overclocking helps remove the CPU as a possible bottleneck, whereas games like BF3 or Rage mainly demand a solid graphics card.

  8. #8
    Quote Originally Posted by Djinni View Post
    I can run some tests when I get back (maybe in 4 or 5 hours) if you really want... but like I said above, stock settings on both of those CPUs are plenty for most games.
    I'd really like to see your tests if you end up doing them.
