  1. #41
    Quote Originally Posted by Vegas82 View Post
    The resolution difference is probably impacting it a bit more than the gpu.
    I agree, was just posting results is all!

  2. #42
    Deleted
    Reading this thread, just made me realize a lot of you have no idea how CPU/GPU's work when it comes to gaming.

    1. Cores matter, but not a lot; the speed of those cores is the direct multiplier of your performance. The higher the clock, the better a game will run.
    (A 12-core CPU will generally run games worse than a 6-core CPU, mainly because the 6-core has higher baseline clocks before OC.)

    2. Resolution matters. Like a guy here said, performance is worse at 1440p than at 1080p; well, duh, it's 2,073,600 vs 3,686,400 pixels, and it's fairly logical that an increase in pixels costs performance. It's the reason a lot of video cards can't deliver 144 FPS on a 4K monitor; here's a hint, it's mainly the pixel count.
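
    A quick arithmetic check (a minimal Python sketch, not from the thread, just illustrating the pixel counts above):

        # 1080p vs 1440p pixel counts
        pixels_1080p = 1920 * 1080   # 2,073,600
        pixels_1440p = 2560 * 1440   # 3,686,400
        print(pixels_1440p / pixels_1080p)  # ~1.78x more pixels to render per frame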

    Please learn the basics.

  3. #43
    Quote Originally Posted by CryotriX View Post
    People are just too emotionally invested in what they like these days: PC hardware included, consoles, phones, it never ends. Personally, I deeply despise all big business; they are all immoral, as they only exist to create more profit, which is an immoral thing in itself. I find it hilarious that yesterday in another forum I was jumped on because I recommended a 2700X instead to a dude looking to upgrade from a 4790K to an 8700K, especially as he was planning to delid and buy expensive cooling, while you can use the 2700X out of the box with the stock cooler and save lots and lots of money. He was not a WoW gamer, though.

    In short, everyone should let go of this fanboyism and adulation for huge companies that don't give a rat's ass about you or me. Learn to put emotions aside and just buy whatever is the best deal for you. Ditch all forms of "brand loyalty"; it's rotting your brains.

    Each and every customer should be loyal to their MONEY, not to a brand.

    EDIT:

    Don't recommend the 9600K unless you're recommending it to an enemy. Gamers Nexus showed how flawed the frametimes are on the 6-core i5s in games like Far Cry. You want more cores. I wouldn't buy any CPU from Intel other than the 8700K, if you can find it reasonably priced. AMD has better offers for pretty much 80% of use cases. You want those 12-16 threads; yeah, WoW is dumb and can't use them, but other games can, and why in the world would you limit yourself to just WoW? You might as well go with an 8350K or 8600K if WoW is all you play, but I'd really avoid CPUs without SMT/HT.


    Sorry dude, but you are way off base here; an overclocked 9600K will beat an overclocked 2700X in pretty much every game on the market. We are so far from games actually using CPUs past 4c/8t or 6c/6t that by the time we reach that point, a whole new range of CPUs will be out from both companies. In fact, what you are doing is a far greater blunder: you are on a WoW forum suggesting people not buy a CPU that would play that game much better than the competition.

  4. #44
    Scarab Lord Wries's Avatar
    10+ Year Old Account
    Join Date
    Jul 2009
    Location
    Stockholm, Sweden
    Posts
    4,127
    I've edited my original post some. On the last PTR build, as well as the live game, I can't replicate my findings. Which is great, as setting manual CPU affinity sucks. I still do it to split cores between encoding and game rendering when I stream, which is how I discovered the anomaly. But I don't want the general recommendation for everyone playing to be messing with a CPU-thread cvar or something similar.

    I even saw a tiny bit of a performance uplift when allowing 6 cores instead of 4 now. And toggling SMT ("hyperthreaded cores") on/off seemed to have no effect, thankfully.
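
    For anyone wondering what "setting manual CPU affinity" looks like outside the Task Manager UI, here's a minimal Python sketch using the third-party psutil library; the process name "Wow.exe" and the core list are illustrative assumptions, not my actual setup:

        import psutil

        # Pin the game to the first 6 logical cores, leaving the rest
        # free for the stream encoder.
        for proc in psutil.process_iter(["name"]):
            if proc.info["name"] == "Wow.exe":
                proc.cpu_affinity([0, 1, 2, 3, 4, 5])
                print("Affinity set for PID", proc.pid)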

  5. #45
    Those guys exist to cater to the weirdos you described earlier in this thread, the people who buy computer parts based on brand loyalty or the competition's "deceptive practices". This is why I don't watch channels like that; they will do anything in their power to make Intel or Nvidia look like the bad guys, because it gets CLICKS.

    What you need to understand, my guy, is that for the vast majority of normal desktop PC users, a 9600K or 8600K is a better choice than a 2700X. I list the i5s because they are more price-comparable to a 2700X than an i7 is. We are just now STARTING to see games use up to 6 cores, and hardly any use cores past that; the overwhelming majority of games on the market (you know, good games that people actually play) are still limited to 4c/8t at most. If we ever get to a point where 8c/16t CPUs are being taken full advantage of by a majority of new releases, both Intel and AMD will have CPUs out in the 8GHz range; that is how long it is going to take.

  6. #46
    Quote Originally Posted by CryotriX View Post
    GN's channel is really not that type; you should consider giving it a chance. He even refuses advertisers he deems shady, like that SCDKeys crap a lot of tech YouTubers advertise. The channel trashes AMD, Intel, and Nvidia without any preference. Pretty sure it has even affected his business: other tech YouTubers like Linus were sent a Titan RTX, for example, while GN was not. He has also had to buy hardware for testing, as companies he trashed would not send samples.

    I know it's tempting to consider everyone a shill, but my suggestion is treat each case separately.
    I know who they are and why they exist. Look no further than this video to see the bias:
    https://www.youtube.com/watch?v=GDggr3kt96Q

    Now go look at the streaming settings they are using and you will start to understand why these guys exist: to make Intel and Nvidia look bad to unaware customers. First, they are using YouTube streaming as the example, and let's be honest, no one watches YouTube streams. Second, they are using a preset that is not only unrealistic, but one that 99% of users shouldn't be touching, no matter their hardware. Veryfast is the best-optimized x264 preset in OBS and is the one the vast majority of people should use; Gamers Nexus chose the Faster preset ONLY to show a difference between a 12-thread CPU and a 6-thread one, regardless of whether it was realistic.
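
    If you want to check the preset gap yourself, here's a rough Python sketch that times x264 encodes through ffmpeg; it assumes ffmpeg is on your PATH and that a local clip named test.mp4 exists (both are assumptions for illustration):

        import subprocess, time

        def encode_seconds(preset):
            # Encode the clip with libx264 at a streaming-like bitrate,
            # discard the output, and return the wall-clock time taken.
            start = time.perf_counter()
            subprocess.run(
                ["ffmpeg", "-y", "-i", "test.mp4",
                 "-c:v", "libx264", "-preset", preset, "-b:v", "6000k",
                 "-f", "null", "-"],
                check=True, capture_output=True)
            return time.perf_counter() - start

        for preset in ("veryfast", "faster"):
            print(preset, round(encode_seconds(preset), 1), "s")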

    My entire point here, dude, is that for normal PC users Intel is going to be the better choice in 9/10 scenarios, and that 1/10 is most likely someone who renders a lot of videos to upload to YouTube. It's good that we finally got a jump in cores for mainstream CPUs, and if you can afford the i7s I definitely recommend them, but don't try to convince people that i5s are somehow bad when the landscape of consumer software is still optimized around 4-core CPUs.

  7. #47
    Moderator Cilraaz's Avatar
    15+ Year Old Account
    Join Date
    Feb 2009
    Location
    PA, USA
    Posts
    10,139
    Well, I checked my raid FPS last night. It was oddly the same as pre-patch. I later realized it's because my GPU OC software had closed and my GPU had reverted to base clocks. So, I picked up some data from tonight's raid on my Rogue. I ran a 90-second capture during fairly intensive periods of Zek'voz (p2 with adds up) and Zul (at the start, using the burn strat). Tests were run with a preset of 7/10, 2xMSAA (with MFAA enabled), and at 1080p resolution.

    Average FPS looks good on both fights, especially Zek'voz, but the frame times were super variable. It didn't feel as bad as it looks, though.

    Zek'voz:
    Average FPS - 80.2
    Average frame time - 12.47ms
    Low 1% FPS - 44.5
    Low 1% frame time - 22.45ms



    Zul:
    Average FPS - 58.66
    Average frame time - 17.05ms
    Low 1% FPS - 33.49
    Low 1% frame time - 29.86ms
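
    For reference, this is roughly how numbers like these are derived from a frame-time capture; a minimal Python sketch with made-up sample data, not my actual OCAT output:

        # Frame times in milliseconds; "low 1%" covers the slowest 1% of frames.
        frame_times_ms = [12.1, 13.0, 11.8, 22.4, 12.5, 48.9, 12.2, 12.9]

        avg_ft = sum(frame_times_ms) / len(frame_times_ms)
        worst = sorted(frame_times_ms, reverse=True)
        low_count = max(1, len(worst) // 100)
        low_ft = sum(worst[:low_count]) / low_count

        print(f"avg: {1000 / avg_ft:.1f} FPS ({avg_ft:.2f} ms)")
        print(f"low 1%: {1000 / low_ft:.1f} FPS ({low_ft:.2f} ms)")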


  8. #48
    Moderator Cilraaz's Avatar
    15+ Year Old Account
    Join Date
    Feb 2009
    Location
    PA, USA
    Posts
    10,139
    Quote Originally Posted by CryotriX View Post
    Considering this is WoW, it's not that bad at all, although the 50ms spikes should be easily noticeable. If you ran monitoring apps (like MSI AB), they could have made things worse. The power sensor typically takes a lot of CPU time; it was a known bug in previous drivers ("Using power monitoring in GPU monitor tools causes micro stutter. [2110289/2049879]") and was supposedly fixed, but my MSI AB still shows the power monitoring taking just as much CPU time as it used to. On my PC it never resulted in stutters unless I played with the MSI AB refresh intervals, but it did affect other users. To check, you can right-click the detached monitoring pane in MSI AB and select "Show profiler panel".

    Other game engines also microstutter; WoW is doing honorably in those graphs. Very few games have actually optimized engines. Overwatch is one, and its engine is pretty much flawless. Shadow of the Tomb Raider and Hitman 2 also do quite well. Destiny 2 does too, but there you cannot monitor it, so you're just left with feelings and an FPS counter.
    I was running MSI AB with the OSD during raid tonight. The capture was being done via OCAT, though. When I tested OCAT for CPU impact, it never even registered CPU usage in Task Manager. I'll pull an OCAT capture with MSI AB closed tomorrow night. We'll see if that improves anything.

  9. #49
    Has anyone tried WoW with an older i5? Still waiting on the ram for my spare pc to try it out.

  10. #50
    Moderator Cilraaz's Avatar
    15+ Year Old Account
    Join Date
    Feb 2009
    Location
    PA, USA
    Posts
    10,139
    Quote Originally Posted by CryotriX View Post
    Considering this is WoW, it's not that bad at all, although the 50ms spikes should be easily noticeable. If you ran monitoring apps (like MSI AB), it could have made things worse.
    ... I am an idiot. Last night I was thinking about whether MSI AB could have been impacting frame times. I was freakin' streaming at the time of those captures. I have a feeling OBS using 20-25% CPU might have a bit more impact than the Afterburner/RTSS OSD. I feel dumb, lol.

  11. #51
    Moderator Cilraaz's Avatar
    15+ Year Old Account
    Join Date
    Feb 2009
    Location
    PA, USA
    Posts
    10,139
    Well, I finally got around to grabbing more data. I made sure MSI AB wasn't tracking power this time. I only thought to grab data from the first 4 bosses tonight, though.

    FPS listings are, obviously, in frames per second. Frame times are in milliseconds.

    Taloc:
    Avg FPS - 69.29
    Avg Frame Time - 14.43
    Low 1% FPS - 21.39
    Low 1% Frame Time - 46.74



    Mother:
    Avg FPS - 58.39
    Avg Frame Time - 17.13
    Low 1% FPS - 20.02
    Low 1% Frame Time - 49.95



    Fetid:
    Avg FPS - 77.53
    Avg Frame Time - 12.90
    Low 1% FPS - 25.49
    Low 1% Frame Time - 39.31



    Vectis:
    Avg FPS - 74.31
    Avg Frame Time - 13.46
    Low 1% FPS - 25.09
    Low 1% Frame Time - 39.85


  12. #52
    Quote Originally Posted by CryotriX View Post
    I quit WoW more than a year ago, and I'm proudly not coming back. I'm testing and looking into things like MT optimization purely out of technical curiosity and, well, because I take great pleasure in trashing Blizzard for their bad engine. For me, WoW is but one (badly unoptimized) game in a sea of other games.

    However, I am not wrong on the 9600K. Data is actually on my side.

    Look at the insanely horrible frametime performance below. You really want to spend on that? If your answer is yes, then go for it, but you're spending on trash. It's not just this game, either.

    Also, I didn't say not to buy the CPU just for WoW. But I do think you're making the wrong choice if you buy a PC specifically for an old MMO that looks to be on its last breath.

    By the way, the CPU is OK in most games. But I'd say FC5 is showing how future games that eat those 6 threads will perform. I tend to judge components on their ability to last at least 3-5 years. With Ryzen putting up a good fight and core counts increasing, I don't think the 6-core i5s will enjoy the same lifetime that the 4-cores did.



    I honestly wonder if they got a bad chip, because even they'll tell you that the 9600K is just a soldered 8600K, so the two should perform similarly, with better temps on the 9600K (not surprisingly).

  13. #53
    The Lightbringer Shakadam's Avatar
    10+ Year Old Account
    Join Date
    Oct 2009
    Location
    Finland
    Posts
    3,300
    Quote Originally Posted by Onikaroshi View Post
    I honestly wonder if they got a bad chip, because even they'll tell you that the 9600K is just a soldered 8600K, so the two should perform similarly, with better temps on the 9600K (not surprisingly).
    The 8600K is just as bad, though, if you look at the 1% and 0.1% lows.

  14. #54
    The lack of hyperthreading makes the 9600K a no-go, honestly.

    We are in an 8-core world now: consoles will be using Zen 2 8-core chiplets, and you can get desktop 8-cores super cheap.

    If it had HT it would be OK for a good number of years, but 1 or 2 years down the line the lack of threads is going to show on that 9600K... hell, it's showing now. 8-core is the new quad-core; 6 cores without HT will go down in history like the old 3-core AMDs.
    Power corrupts, unlimited power... is even more fun!

  15. #55
    Quote Originally Posted by Shakadam View Post
    The 8600K is just as bad though if you look at the 1% and 0.1% lows.
    True. Personally I love my 9600K, but like the guy said, right now it's only showing up in FC5 for the most part, AND I plan on moving out of 1080p soon anyway; 1080p puts more stress on the CPU than 1440p does. I've overkilled 1080p at this point...
