Any apparent framerate increase is likely due to one of the following:
- Your system having a bottleneck
- The game being CPU-intensive
- The CPU being multi-core while the game is single-threaded
Ultimately, if you make the right processor choice, it isn't an issue - at least for gaming. And for the average person/gamer, limited in their knowledge of such things, voiding their warranty out of FPS greed is not something I would advise. If we look at a typical scenario where an i5 would struggle under load (is there one? O_O) - say you are getting 40 FPS on average (playable) and the grunt work is done on the GPU - then only about 30% of the actual FPS is affected by the CPU being under pressure (i.e. there is much more concern about GPU lag than CPU lag, as the CPU does a lot less of the processing). When I say lag I of course don't mean the computer being fully maxed out; that would be a complete system crash in a literal sense, and the system grinding to a halt in a more practical sense (when the OS realises it has bitten off more than it can chew and starts queuing tasks).
So if we take 30% as a realistic share of the frame work handled by the CPU, then a 40-60% improvement applied to that 30% of an average 40 FPS works out to about 6 FPS, taking the middle of the range (50% of 30% of 40).
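Purely as a back-of-envelope sketch of that arithmetic (the 40 FPS baseline, 30% CPU share and 50% clock gain are just the assumed numbers from above, not measurements):

    // Back-of-envelope estimate: only the CPU-bound share of the frame
    // scales with the CPU clock, so the FPS gain is small.
    #include <cstdio>

    int main() {
        double avg_fps   = 40.0;   // assumed baseline FPS from the scenario above
        double cpu_share = 0.30;   // assumed share of frame work done on the CPU
        double oc_gain   = 0.50;   // middle of the claimed 40-60% improvement

        double fps_gain = avg_fps * cpu_share * oc_gain;   // 40 * 0.3 * 0.5 = 6
        std::printf("estimated FPS gain from the CPU overclock: ~%.0f FPS\n", fps_gain);
        return 0;
    }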
You state a 40-60% outright increase in FPS, but I would argue that isn't possible without a serious system bottleneck, as I know it could only happen in a game where the CPU is handling most of the processing. I am not completely informed on CPU architecture, but I do know that there are distinct layers in how the code runs:
While I have never investigated it personally, I would expect there to be a hierarchical, tree-like priority structure in place, controlled by the OS, the graphics API and the application.
The operating system has the highest priority and that's where the core process runs; consider that the "engine platform" for the game. It acts like a library of functions that can be accessed by the CPU and by the game itself. So the engine is something like CryEngine, Unreal Engine or WoW's in-house engine - it's not the game code itself, but the platform the game runs on. This game platform communicates with the operating system and with the graphics API inside that operating system (DirectX or OpenGL would be the candidates in today's world, unless it is some commercial CAD/CAM/rendering package). The graphics API is loaded into the game platform so that its library of specific functions is available, and those functions make calls to the GPU, which renders straight to the monitor. All of the graphics will be sent this way - and sometimes even the physics too, on CUDA or PhysX cards.
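Purely to illustrate that layering (the names below are hypothetical and don't belong to any real engine or API; think of GraphicsAPI as standing in for DirectX/OpenGL and Engine as the CryEngine/Unreal-style platform):

    // Hypothetical sketch of the layering described above - these names are
    // made up, they just show who calls whom.
    #include <cstdio>
    #include <string>

    // The graphics API layer (think DirectX/OpenGL): a library of functions
    // the engine loads; its calls end up as commands the GPU executes.
    struct GraphicsAPI {
        void draw_mesh(const std::string& mesh) {
            // In reality this builds GPU commands and submits them to the driver.
            std::printf("[GPU] drawing %s\n", mesh.c_str());
        }
        void present() {
            std::printf("[GPU] frame presented to the monitor\n");
        }
    };

    // The "engine platform" (CryEngine/Unreal-style layer): it owns the API
    // and exposes higher-level services to the game code built on top of it.
    struct Engine {
        GraphicsAPI api;   // graphics API loaded into the engine
        void render_scene() {
            api.draw_mesh("terrain");
            api.draw_mesh("player_character");
            api.present();
        }
    };

    int main() {
        Engine engine;           // game code sits on top of the engine...
        engine.render_scene();   // ...which funnels all graphics through the API
        return 0;
    }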
On the same level as the graphics API is the actual game code - the physics, the events, the procedures and scripted happenings of the game. It is likely hard for a regular gamer to visualise: it's an NPC, your game character and its items (as pure code), it's you falling down to earth (but not visually). In a game like WoW that can be a lot of code; in your OS it is a lot of code; in a game like Crysis or HL2 there is a lot of physics code. But in comparison to what has to be sent through the graphics API, it is the minority of the work.
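To make that "pure code" point concrete, here is a made-up fragment (not from any real game): your character falling is just numbers being updated each tick, and nothing visual happens until the renderer reads that state.

    // Made-up fragment: the game-logic side is just state being updated.
    // Nothing here touches the graphics API; the renderer reads this later.
    #include <cstdio>

    struct Character {
        double height   = 100.0;   // metres above the ground
        double velocity = 0.0;     // metres per second (negative = falling)
    };

    // One simulation step of "falling down to earth" - pure code, no visuals.
    void apply_gravity(Character& c, double dt) {
        const double g = -9.81;    // gravity in m/s^2
        c.velocity += g * dt;
        c.height   += c.velocity * dt;
        if (c.height < 0.0) {      // landed
            c.height   = 0.0;
            c.velocity = 0.0;
        }
    }

    int main() {
        Character player;
        for (int tick = 0; tick < 60; ++tick)      // one second at 60 ticks/s
            apply_gravity(player, 1.0 / 60.0);
        std::printf("height after 1s of falling: %.1f m\n", player.height);
        return 0;
    }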
There will also likely be something sitting between the graphics side of the game platform and the game-logic side: a "tick", something to keep them both in sync so that you don't fall while your character appears to be flying. Anyway, with this information you can see that unless your CPU is unable to keep up with your GPU (i.e. to stay in sync), it is unlikely that overclocking your CPU will provide a great increase in FPS (compared to a GPU overclock).
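A minimal sketch of that "tick" idea, assuming a common fixed-timestep loop (the structure is generic, not taken from any particular engine): the simulation advances in fixed ticks on the CPU, rendering happens once per pass, and the two stay in sync.

    // Minimal fixed-timestep loop: game logic advances in fixed "ticks" while
    // rendering runs once per pass, keeping simulation and visuals in sync.
    #include <chrono>

    // Hypothetical stand-ins for the CPU-side simulation and GPU-side rendering.
    void update_simulation(double dt) { /* physics, events, scripted logic */ }
    void render_frame()               { /* draw calls through the graphics API */ }

    int main() {
        using clock = std::chrono::steady_clock;
        const double tick = 1.0 / 60.0;      // fixed simulation step (60 ticks/s)
        double accumulator = 0.0;
        auto previous = clock::now();

        for (int frame = 0; frame < 1000; ++frame) {   // stand-in for "while the game runs"
            auto now = clock::now();
            accumulator += std::chrono::duration<double>(now - previous).count();
            previous = now;

            // Catch the simulation up in whole ticks; only if the CPU cannot
            // keep up with this inner loop does it become the FPS limit.
            while (accumulator >= tick) {
                update_simulation(tick);
                accumulator -= tick;
            }

            // Otherwise the GPU-side work behind this call decides your FPS.
            render_frame();
        }
        return 0;
    }

The point being: the CPU only has to keep that inner tick loop fed, and once it can do that comfortably, extra CPU clock speed has nothing left to buy you.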
Essentially the issue with having an i5 is that its per-core clock is not great for single-threaded applications. In a truly multi-threaded application there would be no such bottleneck and no reason to overclock, as the gain would likely not even be visible.
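As a hypothetical sketch of that difference (not how any particular game is written): if all of the per-frame work runs on one thread, only that one core's clock matters; split across worker threads, the same work finishes in roughly a fraction of the time, so a single core's clock stops being the bottleneck.

    // Hypothetical sketch: the same per-frame CPU workload done single-threaded
    // versus split across hardware threads.
    #include <cstdio>
    #include <thread>
    #include <vector>

    // Stand-in for one chunk of CPU-side frame work (physics, AI, etc.).
    double do_work(int chunk) {
        double x = 0.0;
        for (int i = 0; i < 5000000; ++i) x += (chunk + i) % 7;
        return x;
    }

    int main() {
        const int chunks = 8;
        double total = 0.0;

        // Single-threaded: every chunk runs back to back on one core,
        // so the total time scales directly with that core's clock speed.
        for (int c = 0; c < chunks; ++c) total += do_work(c);

        // Multi-threaded: chunks run on separate cores in parallel, so a
        // single core's clock speed is no longer the limit on frame time.
        std::vector<std::thread> workers;
        for (int c = 0; c < chunks; ++c)
            workers.emplace_back([c] { do_work(c); });
        for (auto& t : workers) t.join();

        std::printf("single-threaded result: %.0f\n", total);
        return 0;
    }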
I don't know which i5 you have, but I assume it was a <3.0GHz stock version. If it was faster than 3.0GHz, then there was a serious system bottleneck, because a 3.0GHz i5 can handle practically every game on the market without showing a substantial gain from overclocking, unless it's a seriously CPU-intensive game like Cryostasis.
((( There are other variables, such as your background applications, what you are multi-tasking, what OS you are on, etc., but essentially what I said above holds true )))
I don't disagree that you got a performance increase, but the increase would have been negligible past the point where the CPU can keep up with the work coming at it. So unless your CPU was originally under strain, it will see no benefit from the overclock. The higher your FPS in the game, the less likely the CPU overclock will help (except perhaps where it is a massive overclock of 1GHz or so and your FSB is linked and synced with your RAM). So there was an inherent problem with your system that made your overclock so significant.
tl;dr:
What I posted above is not strictly true in all situations, but it is what I have encountered while overclocking many generations of CPUs over the past 8 years. CPU overclocks are beneficial, but they are not as significant as they are made out to be nowadays. Yesteryear they could be the difference between running Counter-Strike and not running it at all; nowadays it's more like running L4D2 at 120 FPS versus 125 FPS.
(GPU overclocks are still VERY significant - but I would always advise buying a pre-overclocked GPU unless you are a dedicated overclocker or have money to burn, as GPU overclocks seriously shorten the life of the GPU, make it more prone to failure, and a personal overclock voids the warranty.)