  1. #1

    Does overclocking turbo mode do anything?

    I built a secondary gaming PC for the girlfriend/friends/guests to use when they're over at my place, and also as a backup for me if I need one.

    So I found a 2700K on a half-price sale at a local PC store and bought it ASAP. I've now installed it on an ASUS motherboard. It's still at the default clock of 3.50 GHz at the moment, but the ASUS board has the turbo boost multiplier set to 43 (4.30 GHz boost), and from the tests I've done it is hitting 4.3 GHz; the monitoring tools show that during tests. I installed my old cooler onto it (an H100), and at the 4.30 GHz boost it gets to about 25-35°C max across all cores.

    I then put the turbo boost multiplier up to 47 (4.70 GHz turbo boost) and it ran fine; temps were now about 40-55°C at max load while turbo boosting at 4.7 GHz.

    I've left it at the 4.70 GHz turbo boost OC. Is there actually any point to this? The monitoring tools say it's boosting to 4.70 GHz under load... but is this actually doing anything compared to, say, disabling turbo boost entirely and running a normal fixed overclock?

    With temps of 40-55°C under load while turbo boosting at 4.70 GHz on an H100... I assume I could pretty easily push this to 4.90-5.00 GHz?
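
    For reference, Sandy Bridge derives its core clock from a 100 MHz base clock (BCLK) times the turbo multiplier, so the BIOS settings above map directly to frequencies. A minimal sketch of that arithmetic (the multiplier values are the ones from this post):

    Code:
    # Sandy Bridge: core clock = turbo multiplier x 100 MHz base clock (BCLK).
    BCLK_MHZ = 100

    # Stock, the 43 setting, and the 47 setting described above.
    for multiplier in (35, 43, 47):
        print(f"x{multiplier} -> {multiplier * BCLK_MHZ / 1000:.2f} GHz")
    # x35 -> 3.50 GHz, x43 -> 4.30 GHz, x47 -> 4.70 GHz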

  2. #2
    Deleted
    BIOS overclocking yields the best results.
    Software overclocking yields decent results, but not as good as BIOS clocking.
    Intel Turbo Boost yields decent results with lower power consumption.


    No form of processor overclocking will give any benefit to FPS whatsoever in any game.

    A stock i7 at 3.4 GHz will produce the exact same FPS as an i7 at 4.5 GHz with the same graphics card.

    You will, however, see a large increase in performance in tasks such as zipping files, video compression, etc.

  3. #3
    Quote Originally Posted by gamingmonitor View Post
    No form of processor overclocking will give any benefit to FPS whatsoever in any game.

    A stock i7 at 3.4 GHz will produce the exact same FPS as an i7 at 4.5 GHz with the same graphics card.
    Isn't this incredibly false? And I mean, extremely-totally-wrong type of false?

    In several games (WoW included) you can get capped by your CPU, and your FPS will drop in both PvP and raid situations, especially when a lot of entities are on screen. It is foolish to say you won't notice a difference. Foolish and greatly incorrect.

  4. #4
    Deleted
    Quote Originally Posted by Snorkle View Post
    Isn't this incredibly false? And I mean, extremely-totally-wrong type of false?

    In several games (WoW included) you can get capped by your CPU, and your FPS will drop in both PvP and raid situations, especially when a lot of entities are on screen. It is foolish to say you won't notice a difference. Foolish and greatly incorrect.
    While Synthaxx is correct, the simple fact of the matter is that overclocking your CPU will bring almost zero benefit in most games.

    It's one of the most common misconceptions in PC gaming that overclocking your CPU will yield any better FPS whatsoever. It won't (in 99% of situations).

    overclock.net/t/665579/core-i7-overclocking-and-gaming-performance

  5. #5
    So by testing games that are not CPU-intensive, but GPU-intensive, he proves that there is no such thing as a CPU-intensive game?

    FASCINATING. TELL ME MORE!

  6. #6
    Uggorthaholy (The Lightbringer), joined Feb 2009, Weatherford, TX, 3,169 posts
    I can't even begin to explain how wrong you are, gamingmonitor.

    The fact of the matter is, if your game has its FPS limited by the CPU, OC'ing provides direct benefit.

    Take, for example, WoW.

    Let's say the rig in question has a 680. The GPU will have 0 issues and handle everything you throw at it.

    Now, let's put a 2500K in the system. At stock clocks, 25-man raiding, full ultra, you WILL have frame drops. I promise. Let's say your frames dip to 40 under heavy AoE.

    Now, you take your 3.7 GHz turbo clock and OC it to 4.5 GHz. This is a ~21.6% increase.

    Your frame dips should (theoretically; not 100% accurate) now only fall to about 48.6 FPS.

    And even if you are already getting 100 FPS, let's say you are on a 120 Hz monitor.

    If you pick up a 20% FPS increase, you've now taken it to 120 FPS. This is of course assuming you are solely CPU-limited. OCing your processor yields direct results.
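
    As a quick sanity check on that arithmetic, here is a minimal sketch, assuming the idealized case where FPS scales linearly with core clock in a fully CPU-bound scene:

    Code:
    # Idealized model: in a fully CPU-bound scene, FPS scales linearly with clock.
    def scaled_fps(fps_before, clock_before_ghz, clock_after_ghz):
        return fps_before * (clock_after_ghz / clock_before_ghz)

    # Numbers from the post above: 2500K at its 3.7 GHz turbo, OC'd to 4.5 GHz.
    print(scaled_fps(40, 3.7, 4.5))   # ~48.6 FPS during the heavy-AoE dips
    print(scaled_fps(100, 3.7, 4.5))  # ~121.6 FPS for the 120 Hz monitor example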
    Last edited by Uggorthaholy; 2012-07-22 at 11:43 AM.

  7. #7
    Evolixe (Immortal), joined Nov 2009, In the Shadows, 7,364 posts
    It's just an example, right? I have no idea how WoW is ever going to eat 100% out of a 4500 MHz core.

  8. #8
    Uggorthaholy (The Lightbringer), joined Feb 2009, Weatherford, TX, 3,169 posts
    Quote Originally Posted by Holo View Post
    It's just an example, right? I have no idea how WoW is ever going to eat 100% out of a 4500 MHz core.
    It's 100% accurate. Raid 25-man Ultraxion on full ultra at 1080p. Watch the frames drop.

    WoW is poorly coded and extremely CPU-dependent for rendering particle effects.

  9. #9
    Evolixe (Immortal), joined Nov 2009, In the Shadows, 7,364 posts
    I'm scared now.

    Thankfully I'm not so much into PvE as I'm into PvP, which is usually less of a clusterfuck.

  10. #10
    Uggorthaholy (The Lightbringer), joined Feb 2009, Weatherford, TX, 3,169 posts
    Quote Originally Posted by Holo View Post
    I'm scared now.

    Thankfully I'm not so much into PvE as I'm into PvP, which is usually less of a clusterfuck.
    PvP is fine until you get involved with large-scale world stuff. Even some major battles in AV will cause it, but you typically won't see it. If you only do arenas, instead of having 25 people casting spells plus a boss with massive spell effects, you have 10 people using abilities. That's a lot less to render.
    Last edited by Uggorthaholy; 2012-07-22 at 10:22 PM.

  11. #11
    Deleted
    Quote Originally Posted by uggorthaholy View Post
    I can't even begin to explain how wrong you are, gamingmonitor.

    The fact of the matter is, if your game has its FPS limited by the CPU, OC'ing provides direct benefit.

    Take, for example, WoW.

    Let's say the rig in question has a 680. The GPU will have 0 issues and handle everything you throw at it.

    Now, let's put a 2500K in the system. At stock clocks, 25-man raiding, full ultra, you WILL have frame drops. I promise. Let's say your frames dip to 40 under heavy AoE.

    Now, you take your 3.7 GHz turbo clock and OC it to 4.5 GHz. This is a ~21.6% increase.

    Your frame dips should (theoretically; not 100% accurate) now only fall to about 48.6 FPS.

    And even if you are already getting 100 FPS, let's say you are on a 120 Hz monitor.

    If you pick up a 20% FPS increase, you've now taken it to 120 FPS. This is of course assuming you are solely CPU-limited. OCing your processor yields direct results.

    I will eat my hat if you can get a 20 FPS increase anywhere in WoW solely by overclocking your CPU.

    Sorry, but I call baloney on your theorycraft.


    My i7 2600K gives me almost identical FPS in raids whether I overclock it or not. At the 3.4 GHz stock speed I raid at 40-80 FPS on ultra, and when I clock it to 4.0 GHz I raid at 40-80 FPS on ultra.

    It makes very little difference, at least for me. The real-world benchmarks I've already linked concur with my own results, too.

    In fact, the only time I can see a 5-10 FPS increase in WoW is when I overclock my GPU by another 100 MHz.
    Last edited by mmocc74cd461dd; 2012-07-22 at 11:55 AM.

  12. #12
    Evolixe (Immortal), joined Nov 2009, In the Shadows, 7,364 posts
    Quote Originally Posted by uggorthaholy View Post
    PvP is fine until you get involved with large-scale world stuff. Even some major battles in AV will cause it, but you typically won't see it. If you only do arenas, instead of having 25 people casting spells plus a boss with massive spell effects, you have 10 people using abilities. That's a lot less to render.
    It's just registering the spells going in and out at the same time, and apparently bigger numbers make it harder on the CPU, so that would be why it's becoming a bigger strain every expansion.

    However, I don't think the CPU should ever have anything to do with the actual rendering of the image... it's just that it can hold back the GPU from doing its job when the CPU is under full load?

    At least that is how I always thought it worked... but I'm still to be called a rookie on these things.

  13. #13
    Deleted
    Quote Originally Posted by Holo View Post
    It's just registering the spells going in and out at the same time, and apparently bigger numbers make it harder on the CPU, so that would be why it's becoming a bigger strain every expansion.

    However, I don't think the CPU should ever have anything to do with the actual rendering of the image... it's just that it can hold back the GPU from doing its job when the CPU is under full load?

    At least that is how I always thought it worked... but I'm still to be called a rookie on these things.

    Pretty much correct.

    Loss of FPS will happen when a part of the system bottlenecks. You even get FPS dips when your hard drive is bottlenecking while loading a lot of assets.

    The fact of the matter with CPUs is that if you have a decent CPU, such as an i5 or i7, there are very few moments where it will ever be a bottleneck in games.

    Overclocking poor CPUs will benefit a game's FPS, sure, but poor CPUs tend not to overclock very well either.

    Overclocking good CPUs won't benefit a game's FPS, even though they can be overclocked very well, because even at stock speeds they aren't close to bottlenecking your system.

  14. #14
    Deleted
    Quote Originally Posted by Holo View Post
    However, I don't think the CPU should ever have anything to do with the actual rendering of the image...
    Actually, it does. The GPU needs instructions on what it's supposed to render, and the CPU provides them. In fact, I fired up The Witcher 2 and grabbed its thread data.

    You'll note there are three threads under noticeable use: two for the game and one for the GPU driver. That last one is responsible for feeding the GPU, and as you can see, it's not exactly under light load.

    Overall, between a quarter and a third of the game's CPU usage goes to communication with the GPU.

    Reducing the screen resolution (in my case from 2560x1440) increases the CPU load of all threads (with the framerate of course improving considerably), pushing usages over 12%. To put that in perspective, the GPU needs more than half of a stock 3570K/2500K core just for instructions.
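
    If you want to reproduce this kind of per-thread measurement yourself, here is a minimal sketch using Python's psutil library; the process name is an assumption, so substitute whatever the game's executable is actually called:

    Code:
    # Sample per-thread CPU time of a running game process twice and diff it.
    import time
    import psutil

    # Hypothetical executable name; adjust to the game you are measuring.
    proc = next(p for p in psutil.process_iter(["name"])
                if p.info["name"] == "witcher2.exe")

    before = {t.id: t.user_time + t.system_time for t in proc.threads()}
    time.sleep(5)  # measurement window
    after = {t.id: t.user_time + t.system_time for t in proc.threads()}

    # Busiest threads over the window; expect a couple of game threads
    # plus the GPU-driver feeding thread near the top.
    deltas = {tid: after[tid] - before.get(tid, 0.0) for tid in after}
    for tid, secs in sorted(deltas.items(), key=lambda kv: kv[1], reverse=True)[:5]:
        print(f"thread {tid}: {secs:.2f} s CPU over 5 s ({secs / 5:.0%} of one core)")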
    Last edited by mmoca371db5304; 2012-07-22 at 02:28 PM.

  15. #15
    Deleted
    When I used to play WoW just after I built my system, I was actually somewhat displeased with my framerates at stock. A 2500K and a Radeon 6870 were getting around 60-120 FPS depending on the area, but dropping like mad to around 20-30 during my 25-man raids. I overclocked my CPU to 4.8 GHz and was shocked by the results: it never dropped below 60, even on intense stacking phases with a lot of effects.

    It might not work for every game, but there definitely are cases where it helps you get a more stable frame rate in high-action scenarios. I've never had a bluescreen, overheating, or any other problem in the past year of running overclocked, so I can't see why you wouldn't do it if you have an aftermarket cooler.

  16. #16
    Mechagnome, joined Nov 2011, Finland, 691 posts
    50°C is a regular temp under load, IMO. I'm running very hot (only 3 case fans and no CPU fan) and I'm getting 72°C @ 3.8 GHz turbo mode while playing BF3, but that's the only game where it gets that hot in turbo mode, though...

  17. #17
    Evolixe (Immortal), joined Nov 2009, In the Shadows, 7,364 posts
    Quote Originally Posted by DarkXale View Post
    Actually, it does. The GPU needs instructions on what it's supposed to render, and the CPU provides them. In fact, I fired up The Witcher 2 and grabbed its thread data.
    *image*
    You'll note there are three threads under noticeable use: two for the game and one for the GPU driver. That last one is responsible for feeding the GPU, and as you can see, it's not exactly under light load.

    Overall, between a quarter and a third of the game's CPU usage goes to communication with the GPU.

    Reducing the screen resolution (in my case from 2560x1440) increases the CPU load of all threads (with the framerate of course improving considerably), pushing usages over 12%. To put that in perspective, the GPU needs more than half of a stock 3570K/2500K core just for instructions.
    That is interesting. Though can you explain how my 2-year-old notebook i5-400M runs WoW just fine at 900p, then? It only clocks at ~2.4 GHz.

    ---------- Post added 2012-07-22 at 08:57 PM ----------

    Quote Originally Posted by gamingmonitor View Post
    Overclocking poor CPUs will benefit a game's FPS, sure, but poor CPUs tend not to overclock very well either.
    Honestly, the biggest reason I want to overclock my CPU is that I want to run heavy programs ALONGSIDE games without any impact on the game itself.
    So I guess that could be a reason, for anyone.
    Last edited by Evolixe; 2012-07-22 at 08:58 PM.

  18. #18
    inux94 (The Lightbringer), joined Feb 2011, Nuuk, Greenland, 3,352 posts
    900p? At what settings? Medium?


    Quote Originally Posted by gamingmonitor View Post
    Pretty much correct.

    Loss of FPS will happen when a part of the system bottlenecks. You even get FPS dips when your hard drive is bottlenecking while loading a lot of assets.

    The fact of the matter with CPUs is that if you have a decent CPU, such as an i5 or i7, there are very few moments where it will ever be a bottleneck in games.

    Overclocking poor CPUs will benefit a game's FPS, sure, but poor CPUs tend not to overclock very well either.

    Overclocking good CPUs won't benefit a game's FPS, even though they can be overclocked very well, because even at stock speeds they aren't close to bottlenecking your system.
    There's no difference between an i5 and an i7 in terms of gaming, unless you go up to LGA 2011, but that's a waste of money if you only game, since the performance gain is small.
    Last edited by inux94; 2012-07-22 at 10:32 PM.

  19. #19
    Evolixe (Immortal), joined Nov 2009, In the Shadows, 7,364 posts
    Quote Originally Posted by inux94 View Post
    900p? At what settings? Medium?
    Nah, ultra.

    Granted, it buckles under the load of a heavily crowded area. In cities I dip to 25-30 FPS, and in TB/40-man BGs I can go as low as 15.

    However, in a quiet environment (arena or 10-15-man BGs, for example) I easily keep a steady 60 FPS.
    Last edited by Evolixe; 2012-07-23 at 12:26 AM.

  20. #20
    Deleted
    Quote Originally Posted by Holo View Post
    That is interesting. Though can you explain how my 2-year-old notebook i5-400M runs WoW just fine at 900p, then? It only clocks at ~2.4 GHz.
    Because outside of raid conditions, WoW simply isn't that demanding. There aren't many calculations the CPU needs to perform, and the GPU-feeding thread is not heavily loaded due to the limited game graphics, which also means the GPU is not under stress either.

    In a raid, the CPU is more often choked. Typically it can't compute events in the game fast enough (meaning there is nothing to feed to the GPU), or it lacks the resources to do the feeding itself, and therefore it is incapable of drawing a new frame; hence the 'FPS loss'.

    Remember, games often "update" every time a new frame is drawn, and the longer this update takes, the longer until the GPU can be given new rendering instructions, which themselves must also first be prepared by the CPU.

    -> Game Update -> CPU-to-GPU -> GPU load ->

    In GPU-intensive games (arena shooters, for example), there just isn't much to do during the 'game update' phase, so it's over extremely quickly. But the GPU is given very complex instructions, so the computer spends the vast majority of its time working out the frame there. Hence a good graphics card improves FPS more than upgrading the CPU would.

    But if little time is spent in the GPU load stage, as in WoW and many RTS games, then a good CPU is required for a high framerate.

    This is also why you'll typically see reduced CPU utilization under increased GPU workloads, and vice versa.
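
    As a toy model of that three-stage pipeline (all the stage timings below are invented for illustration, not measured): whichever side of the pipeline dominates sets the frame time, which is why a CPU overclock only helps when the CPU side is the slow one.

    Code:
    # Toy model of: -> Game Update -> CPU-to-GPU -> GPU load ->
    # The CPU handles update + submission; the GPU renders. With the two
    # sides pipelined, the slower side dictates the frame rate.
    def fps(update_ms, submit_ms, gpu_ms):
        cpu_ms = update_ms + submit_ms
        return 1000.0 / max(cpu_ms, gpu_ms)

    print(fps(update_ms=2, submit_ms=3, gpu_ms=14))  # GPU-bound shooter: ~71 FPS
    print(fps(update_ms=18, submit_ms=5, gpu_ms=6))  # CPU-bound raid: ~43 FPS
    # Overclocking the CPU ~20% shrinks only the CPU side:
    print(fps(update_ms=15, submit_ms=4.2, gpu_ms=6))  # raid now ~52 FPS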
    Last edited by mmoca371db5304; 2012-07-23 at 01:59 AM.
