Reading this thread just made me realize a lot of you have no idea how CPUs/GPUs work when it comes to gaming.
1. Cores matter, but not a lot; the speed of those cores is the direct multiplier of your performance, and the higher the clock, the better it will run a game.
(A 12-core CPU will always run games worse than a 6-core CPU; the main reason is that the 6-core has higher baseline clocks before OC.)
2. Resolution matters. Like a guy here said, he gets worse performance at 1440p than at 1080p; well, duh, it's 2,073,600 vs 3,686,400 pixels, so it's fairly logical that an increase in pixels costs performance. It's the reason a lot of video cards can't deliver 144 FPS on a 4K monitor; a hint: it's mainly because of the pixel count.
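The pixel-count arithmetic above is easy to verify in a few lines (the resolution dimensions are the standard 16:9 ones; the scaling implication is the poster's claim, not mine):

```python
# Pixel counts for common 16:9 resolutions.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
print(pixels)  # {'1080p': 2073600, '1440p': 3686400, '4K': 8294400}

# 1440p pushes 1.78x the pixels of 1080p; 4K pushes 4x.
print(round(pixels["1440p"] / pixels["1080p"], 2))  # 1.78
print(pixels["4K"] / pixels["1080p"])               # 4.0
```

The 4x jump from 1080p to 4K is why a card that manages 144 FPS at 1080p can fall so far short at 4K, even though performance doesn't scale perfectly linearly with pixel count.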
Please learn the basics.
Sorry dude, but you are way off base here; an overclocked 9600k will beat an overclocked 2700x in pretty much every game on the market. We are sooo far from games making actual use of CPUs past 4c/8t or 6c/6t that by the time we reach that point, a whole new range of CPUs will be out from both companies. In fact, what you are doing is a far greater blunder: you are on a WoW forum suggesting people not buy a CPU that would play that game much better than the competition.
I've edited my original post some. For the last PTR build as well as the live game, I can't replicate my findings. Which is great, as setting manual CPU affinity sucks. I still do it to split cores between encoding and game rendering when I stream, hence why I discovered the anomaly. But I don't want the general recommendation for every player to be fiddling with a CPU-thread CVar or something similar.
I even saw a tiny bit of a performance uplift when allowing 6 cores instead of 4 now. And toggling SMT ("hyperthreaded cores") on/off seemed to have no effect, thankfully.
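For anyone curious how the core-splitting trick works, here's a minimal sketch using Python's standard library. This is Linux-only (`os.sched_setaffinity`); on Windows you'd use psutil's `Process.cpu_affinity()` or Task Manager instead, and the half-and-half split is an arbitrary choice for illustration:

```python
import os

# Cores currently available to this process.
all_cores = sorted(os.sched_getaffinity(0))

# Pin this process (PID 0 = self) to the first half of the cores,
# leaving the rest free for an encoder such as OBS. Where you draw
# the line is up to you; half-and-half is just an example.
game_cores = set(all_cores[: max(1, len(all_cores) // 2)])
os.sched_setaffinity(0, game_cores)

# Confirm which cores the scheduler may now use for this process.
print(sorted(os.sched_getaffinity(0)))
```

Note that on an SMT CPU the OS numbers logical threads, not physical cores, so "cores 0-5" may actually be three physical cores plus their siblings; check your topology before deciding on a mask.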
Those guys exist to cater to the weirdos you described earlier in this thread, the people who buy computer parts based on brand loyalty or the competition's "deceptive practices". This is why I don't watch channels like that; they will do anything in their power to make Intel or Nvidia look like the bad guys, because it gets CLICKS.
What you need to understand, my guy, is that for the vast majority of normal desktop PC users, a 9600k or 8600k is a better choice than a 2700x. I list the i5s because they are more price-comparable to a 2700x than an i7 is. We are just now STARTING to see games use up to 6 cores, and hardly any use cores past that; the overwhelming majority of games on the market (you know, good games that people actually play) are still limited to 4c/8t at most. If we ever get to a point where 8c/16t CPUs are being taken full advantage of by a majority of new releases on the market, both Intel and AMD will have CPUs out in the 8 GHz range; that is how long it is going to take.
I know who they are and why they exist; look no further than this video to see the bias:
https://www.youtube.com/watch?v=GDggr3kt96Q
Now go look at the streaming settings they are using and you will start to understand why these guys exist: to make Intel + Nvidia look bad to unaware customers. First, they are using YouTube streaming as an example, and let's be honest, no one watches YouTube streams. Second, they are using a preset that is not only unrealistic, but one that 99% of users shouldn't be touching, no matter their hardware. veryfast on OBS is the most optimized preset and is the one the vast majority of people should use; Gamers Nexus ONLY chose the faster setting to show a difference between a 12t CPU and a 6t one, regardless of whether it was realistic or not.
My entire point here, dude, is that for normal PC users, Intel is going to be the better choice in 9/10 scenarios, and that 1/10 is most likely going to be someone who renders a lot of videos to upload to YouTube. It's good that we finally got a bump in core counts for mainstream CPUs, and if you can afford the i7s I definitely recommend them, but don't try to convince people that i5s are somehow bad when the landscape of consumer-oriented software is still optimized around 4-core CPUs.
Well, I checked my raid FPS last night. It was oddly the same as pre-patch. I later realized it's because my GPU OC software had closed and my GPU had reverted to base clocks. So, I picked up some data from tonight's raid on my Rogue. I ran a 90-second capture during fairly intensive periods of Zek'voz (p2 with adds up) and Zul (at the start, using the burn strat). Tests were run with a preset of 7/10, 2xMSAA (with MFAA enabled), and at 1080p resolution.
Average FPS looks good on both fights, especially Zek'voz, but the frame times were super variable. It didn't feel as bad as it looks, though.
Zek'voz:
Average FPS - 80.2
Average frame time - 12.47ms
Low 1% FPS - 44.5
Low 1% frame time - 22.45ms
Zul:
Average FPS - 58.66
Average frame time - 17.05ms
Low 1% FPS - 33.49
Low 1% frame time - 29.86ms
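For anyone sanity-checking these numbers: average frame time is just 1000 ms divided by average FPS, so the two columns should roughly invert each other. A quick check against the figures above (the tiny mismatch on the Zek'voz low is likely because the tool computes the 1% figures from raw samples rather than by inverting):

```python
def frame_time_ms(fps):
    """Convert frames per second to milliseconds per frame."""
    return 1000.0 / fps

# Zek'voz: 80.2 avg FPS, 44.5 low-1% FPS
print(round(frame_time_ms(80.2), 2))   # 12.47  (reported: 12.47)
print(round(frame_time_ms(44.5), 2))   # 22.47  (reported: 22.45)

# Zul: 58.66 avg FPS, 33.49 low-1% FPS
print(round(frame_time_ms(58.66), 2))  # 17.05  (reported: 17.05)
print(round(frame_time_ms(33.49), 2))  # 29.86  (reported: 29.86)
```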
I was running MSI AB with the OSD during raid tonight. The capture was being done via OCAT, though. When I tested OCAT for CPU impact, it never even registered CPU usage in Task Manager. I'll pull an OCAT capture with MSI AB closed tomorrow night. We'll see if that improves anything.
Has anyone tried WoW with an older i5? Still waiting on the ram for my spare pc to try it out.
... I am an idiot. Last night I was thinking about whether MSI AB could have been impacting frame times. I was freakin' streaming at the time of those captures. I have a feeling OBS using 20-25% CPU might have a bit more impact than the Afterburner/RTSS OSD. I feel dumb, lol.
Well, I finally got around to grabbing more data. I made sure MSI AB wasn't tracking power this time. I only thought to grab data from the first 4 bosses tonight, though.
FPS listings are, obviously, in frames per second. Frame times are in milliseconds.
Taloc:
Avg FPS - 69.29
Avg Frame Time - 14.43
Low 1% FPS - 21.39
Low 1% Frame Time - 46.74
Mother:
Avg FPS - 58.39
Avg Frame Time - 17.13
Low 1% FPS - 20.02
Low 1% Frame Time - 49.95
Fetid:
Avg FPS - 77.53
Avg Frame Time - 12.90
Low 1% FPS - 25.49
Low 1% Frame Time - 39.31
Vectis:
Avg FPS - 74.31
Avg Frame Time - 13.46
Low 1% FPS - 25.09
Low 1% Frame Time - 39.85
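As an aside on how a "low 1%" figure comes out of a capture like this: one common approach is to take the slowest 1% of frame-time samples and report the FPS implied by their average. Tools differ on the exact definition, and the sample data here is made up, so treat this as a sketch rather than OCAT's actual method:

```python
def low_1_percent_fps(frame_times_ms):
    """Approximate a 'low 1%' FPS figure: average the slowest 1% of
    frame times, then convert that back to frames per second.
    (Definitions vary between tools; this is one common variant.)"""
    ordered = sorted(frame_times_ms, reverse=True)  # slowest first
    worst = ordered[: max(1, len(ordered) // 100)]
    avg_worst_ms = sum(worst) / len(worst)
    return 1000.0 / avg_worst_ms

# Made-up capture: mostly ~14 ms frames with occasional 45 ms spikes.
samples = [14.0] * 990 + [45.0] * 10
print(round(low_1_percent_fps(samples), 2))  # 22.22
```

This is why a capture can show a healthy average FPS while still feeling rough: a handful of long frames drags the low-1% figure way down without moving the average much.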
The lack of hyper-threading makes the 9600k a no-go, honestly.
We are in an 8-core world now; consoles will be using Zen 2 8-core chiplets, and you can get desktop 8-cores super cheap.
If it had HT, then it would be OK for a good number of years, but 1 or 2 years down the line the lack of cores is going to show on that 9600k... hell, it's showing now. 8-core is the new quad-core; 6-core with no HT will go down in history like the old triple-core AMDs.
Power corrupts, unlimited power... is even more fun!