  1. #1

    Upgrade from 5930k/gtx 1070 worth it or nah?

    Basically title.

    Ever since I upgraded to 1440p 144Hz I've had to lower the settings CONSIDERABLY, a lot of low/medium settings, just so I can get a decent 60+ fps in FPS-heavy content like raiding/BGs, etc.
    My 5930k is OC'd to 4.5GHz.
    I'm not talking about world quests or any number of other things that you do solo; you can easily get good FPS on those.

    So I'm wondering: is it worth upgrading to a 9700k/9900k and/or something like a 2080 Super? CPU upgrade or GPU upgrade, or both? Would it make any difference in raids/BGs, etc.?

    I know what everyone says: WoW is old, it's CPU-heavy bla bla, not properly optimised yada yada yada...

    Will I see any difference from upgrading my GPU? Or CPU? Or both?



    P.S.
    I only play WoW, I don't really care about performance in other games.
    Not planning on playing other games any time soon.

  2. #2
    You definitely don't need a 2080S to get 144 fps in raids/BGs at 1440p, but you do need something better than a 1070. A 2070 or RX 5700 XT should do the job. I would consider upgrading your CPU as well; 3rd-gen AMD CPUs were recently released, and a 3700x would future-proof you for a while.

  3. #3
    I have a 6700k @ 4.6 / GTX 1070 and it runs just fine at 4K: raids, M+, usually averaging 60-70 fps in raids.

    What brand of GPU do you have? And what kind of FPS?

  4. #4
    Temp name
    Quote Originally Posted by itsasharkdude View Post
    Basically title.

    Ever since I upgraded to 1440p 144Hz I've had to lower the settings CONSIDERABLY, a lot of low/medium settings, just so I can get a decent 60+ fps in FPS-heavy content like raiding/BGs, etc.
    That's WoW, not your hardware.

  5. #5
    Strawberry
    Terrible time to upgrade.
    Wait for AMD's next-gen CPUs and Nvidia/AMD's next-gen GPUs; it shouldn't take more than 6 months for them to release. Don't support the overpriced RTX crap.

  6. #6
    Your CPU is fine. I have a 5820k with a 1080 Ti and can run 1440p at 144Hz with no problems. Even with the 1070 you should be fine. I would look at other optimizations before considering any hardware upgrades.

  7. #7
    I was reading GPU reviews the other day and I came across this; it's not worth the upgrade from a 1070 to the 2xxx series.
    Maybe a 2080 Ti, but not a 1070 to a 2080 or less.


    https://www.gamingscan.com/are-nvidi...ards-worth-it/

  8. #8
    Quote Originally Posted by Phuongvi View Post
    I was reading GPU reviews the other day and I came across this; it's not worth the upgrade from a 1070 to the 2xxx series.
    Maybe a 2080 Ti, but not a 1070 to a 2080 or less.

    That's what I kept thinking.
    It was nagging at me in the back of my head: the 2080 seems OK but kinda pointless compared to the 2080 Ti.
    It's just not worth it money-wise.

    Quote Originally Posted by Strawberry View Post
    Terrible time to upgrade.
    Wait for AMD's next-gen CPUs and Nvidia/AMD's next-gen GPUs; it shouldn't take more than 6 months for them to release. Don't support the overpriced RTX crap.

    I'll probably wait for the 10900k, upgrade the CPU/get faster RAM and take it from there; hopefully that will be enough and I won't have to touch the GPU.

  9. #9
    Strawberry
    I think you'll gain more performance by upgrading the GPU. For the amount of money you'd spend on a 10900k, I'd buy a 1080 Ti or a 2070S.
    I have a 9900k @ 5GHz and a 1080 Ti and I'm getting around 100-120 fps at 4K with everything maxed out.

  10. #10
    Temp name
    Quote Originally Posted by Strawberry View Post
    I think you'll gain more performance by upgrading the GPU. For the amount of money you'd spend on a 10900k, I'd buy a 1080 Ti or a 2070S.
    I have a 9900k @ 5GHz and a 1080 Ti and I'm getting around 100-120 fps at 4K with everything maxed out.
    But WoW is among the least GPU-demanding and most CPU-demanding games... And that's ignoring that, no, you aren't getting that amount of FPS in raids. You might be getting it out in the overworld, but I was getting that same FPS at 1440p with the same hardware (a 9700k instead of a 9900k, but still).

    WoW's engine is fucking dogshit. Like, absolutely awful. Hardware isn't the problem here. The engine is.

    - - - Updated - - -

    Quote Originally Posted by itsasharkdude View Post
    That's what I kept thinking.
    It was nagging at me in the back of my head: the 2080 seems OK but kinda pointless compared to the 2080 Ti.
    It's just not worth it money-wise.
    A 2080 Super is maybe, like, 5% worse? And costs 60% of the price.

    The 2080ti is super overpriced.
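
    Quick napkin math on that (the ~5% and ~60% figures above are rough estimates, not benchmarks); just a sketch of the value argument:

    ```python
    # Napkin math using the ~5% / 60% figures above (estimates, not benchmarks).
    perf_ratio = 0.95    # 2080 Super vs 2080 Ti performance
    price_ratio = 0.60   # 2080 Super vs 2080 Ti price

    value = perf_ratio / price_ratio
    print(f"2080 Super delivers ~{value:.2f}x the performance per dollar of the Ti")
    ```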

  11. #11
    I'd love to see some evidence for some of these claims; I see no way in hell that you are running 4K with 60+ fps in WoW raids, especially 100-120 fps at 4K max settings.

    I've recently played WoW @ 1440p 144Hz on a 9700k + 2070 and the FPS is still very bad in raiding scenarios (this was on graphics preset 7). On Mythic Ashvane I would get around 50 fps during intense parts (the first bubbles bursting = 50 fps, and the highest was around 75 fps during P2 with the boss in the middle); the 9700k was at 4.8GHz.
    Sure, it's possible to get 100+ fps outside of actual combat... I get 140 fps when we're setting up pulls, but in actual raid combat you are lucky to be anywhere near 100 fps at decent settings.

  12. #12
    I would wait for next-gen CPUs/GPUs if you want value for money. The 1070 is still a very capable card (in regards to WoW) and your processor is still very decent, and if you sit tight you will get to see Intel respond to being completely shat on by AMD, thus reaping the rewards. If you're desperate to bump your performance at 1440p for some settings then it's your money, but personally I would wait. When you boil it down, the settings you sacrifice to maintain good FPS with a 1070 at 1440p don't make a lot of difference aesthetically, and due to WoW being shit and old it's a lost cause trying to get really high FPS while raiding regardless of your system.

    The 2000-series GPUs only just introduced ray tracing too; the next gen will likely offer a big leap in that area combined with traditional performance gains. I'd wait (and am waiting).

  13. #13
    Strawberry
    Quote Originally Posted by Temp name View Post
    But WoW is among the least GPU-demanding and most CPU-demanding games... And that's ignoring that, no, you aren't getting that amount of FPS in raids. You might be getting it out in the overworld, but I was getting that same FPS at 1440p with the same hardware (a 9700k instead of a 9900k, but still).

    WoW's engine is fucking dogshit. Like, absolutely awful. Hardware isn't the problem here. The engine is.

    You're right, I don't raid. But I was getting 80-110 fps in Boralus. I don't have WoW installed right now, so I can't back up those claims, but yeah, I'm still fairly sure he'll get more performance out of a 1080 Ti than a 9900k (or the next gen).

    And we're also talking 4K vs 1440p, and there's a huge jump in performance demand going from 1440p to 4K.

  14. #14
    Quote Originally Posted by Temp name View Post
    WoW's engine is fucking dogshit. Like, absolutely awful. Hardware isn't the problem here. The engine is.
    I realize that you and I are normally on the same side of a topic here, but can you please, for the sake of Baby Yoda, stop fucking spreading this insane fucking bullshit?

    The engine is not "bad".

    It is a limitation of ANY secure client-server game.

    ANY.

    Not just WoW.

    PUBG has shit performance when tons of people are in the same area (the Starting area with 60+ people often drops to low double digits on even the most powerful rigs).

    FF14 has tiny party sizes and heavy phasing to compensate. The big raids (when they finally ditched PS3 support) have performance similar to WoW, for the same reasons WoW has that performance.

    ESO uses MASSIVE phasing. And in the few situations where there are lots of players on the screen? Bad performance.

    GW2... go try RvRvR just once, let me know how that works out for you.

    It's not the engine. It's the limitations of the type of game it is.

    If you don't force client/server parity, you open yourself up to rampant cheating that really can't even be policed (see: Fortnite, which performs a lot better than PUBG (in part due to art style > complexity/realism choices) but is FAR easier to hack or cheat).

    Pick your poison.
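
    To make that concrete, here's a toy sketch with made-up numbers (this is nobody's actual engine code, just the shape of the problem): every player on screen adds server-state processing that the main thread has to finish before it can issue that player's draw calls, so frame time grows with player count no matter how fast the GPU is.

    ```python
    # Toy model of a server-authoritative client (all numbers invented).
    # Each nearby player costs main-thread time: their position/state must be
    # received from the server and processed before their draw calls go out.

    def frame_time_ms(nearby_players,
                      base_ms=4.0,         # terrain, UI, your own character
                      per_update_ms=0.08,  # parse/validate one server entity update
                      per_draw_ms=0.05):   # issue draw calls for one player model + gear
        return base_ms + nearby_players * (per_update_ms + per_draw_ms)

    for n in (5, 20, 40, 100):
        t = frame_time_ms(n)
        print(f"{n:>3} players nearby: ~{t:.1f} ms/frame, ~{1000 / t:.0f} fps")
    ```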

  15. #15
    Temp name
    Quote Originally Posted by Strawberry View Post
    And we're also talking 4K vs 1440p, and there's a huge jump in performance demand going from 1440p to 4K.
    So... then why am I getting the same performance as you?

    - - - Updated - - -

    Quote Originally Posted by Kagthul View Post
    The engine is not "bad".

    It is a limitation of ANY secure client-server game.

    [...]

    It's not the engine. It's the limitations of the type of game it is.

    [...]

    Pick your poison.
    WoW's engine is also really old. It's 15 years old. While yes, forcing parity is part of it, a very large part of it, there's also a lot of bad stuff in the engine. Hell, I can get double the FPS in Classic as I do on Live.

  16. #16
    to the OP:

    the thing limiting your FPS the most is your CPU.

    However... you're not going to get a sizeable upgrade here without a sizeable investment of cash.

    The 5930K @ 4.5GHz isn't a bad CPU; clock-for-clock, it's only about ~15% slower than a modern 9000-series chip, and even then those gains don't translate linearly into performance.

    So while you could get a 9700K and most likely OC it to 5.1GHz... you're going to get about ~25% more performance, but that won't translate directly into 25% more frames.

    It'll give you ~10-15% higher minimum frames (WoW's max framerates aren't an issue at all even on much slower rigs; it's the minimums/consistency that improve with faster clocks).
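
    Rough napkin math behind those figures (the IPC and scaling numbers are estimates, not benchmark results):

    ```python
    # Napkin math for the 5930K -> 9700K jump described above (estimates only).
    old_clock, new_clock = 4.5, 5.1   # GHz, both overclocked
    ipc_gain = 1.15                   # 9000-series vs Haswell-E, clock-for-clock (rough)
    min_fps_scaling = 0.5             # assumed share of extra CPU throughput that
                                      # actually shows up as better raid minimums

    throughput = (new_clock / old_clock) * ipc_gain
    min_fps = 1 + (throughput - 1) * min_fps_scaling

    print(f"single-thread throughput: +{(throughput - 1) * 100:.0f}%")  # ~+30%, the '~25%' ballpark above
    print(f"expected raid minimums:   +{(min_fps - 1) * 100:.0f}%")     # ~+15%
    print(f"e.g. a 50 fps raid dip becomes ~{50 * min_fps:.0f} fps")
    ```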

    Whether that's worth it to you financially is your call. (Honestly, even on my rig, an 8600K @ 4.8-5.0GHz (depending on time of year and ambient temps) and a 1080 Ti, I'll still drop below 60fps sometimes in highly populated raid fights; World Bosses sometimes dip into the 40s during 'Lust, but there are sometimes 60+ players there.) Just the nature of the engine.

    If you're after 100-fps-at-all-times-even-during-raids...

    You're not going to find it no matter what you do. You could run an LN2-cooled upcoming 10-series CPU @ 6+GHz and it'd still gutter.

    Unless you're dropping into "unplayable" framerates (which varies from person to person, but personally, for an MMO, 30+ fps is playable), I don't think it's worth upgrading until you can get Zen 3 (Ryzen 4000) desktop parts or 10nm Intel desktop parts (so early 2021 on that).

    As for the GPU - a 1070 should be able to handle 4k/60 in WoW (not other games, but WoW) with reasonable settings.

    The first thing you can do is cut Draw Distance, etc. down to setting 7. You will see virtually no quality difference in the visuals (if you see one at all; on my rig it just makes some VERY distant mountains visible as shadows/outlines, and by very distant I mean stuff in Vol'dun seen from Dazar'alor) and performance will go up immediately.

    Turn Shadows down one notch too. While there is a noticeable difference, I don't really need to see the extra shadows cast by multiple point sources (which is all it adds), especially not for the massive performance hit it brings (each shadow is an additional set of draw calls!).
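
    To illustrate why that one notch matters (a deliberately simplified shadow-mapping cost model, not WoW's actual renderer): each extra shadow-casting light effectively re-draws a chunk of the scene to build its shadow map, so draw calls grow quickly with the number of casters.

    ```python
    # Simplified shadow-mapping cost model (illustration only, all numbers made up).
    # Each shadow-casting light re-renders the shadow-relevant geometry once to
    # build its shadow map, adding another batch of draw calls.

    def total_draw_calls(scene_calls, shadow_casting_lights, caster_fraction=0.7):
        # caster_fraction: rough share of scene objects that cast shadows
        shadow_pass_calls = shadow_casting_lights * int(scene_calls * caster_fraction)
        return scene_calls + shadow_pass_calls

    base = 3000  # invented draw-call count for a busy raid frame
    for lights in (0, 1, 3):
        print(f"{lights} shadow-casting lights -> ~{total_draw_calls(base, lights)} draw calls")
    ```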

    I'd wait to upgrade your GPU for either Nvidia's upcoming 3000/1700 series or Big Navi (cards above the 5700 XT). Those should both be out by Q2 this year (even if just barely).

    Quote Originally Posted by Temp name

    WoW's engine is also really old. It's 15 years old. While yes, forcing parity is part of it, a very large part of it, there's also a lot of bad stuff in the engine.
    This has been utterly debunked several times. Every part of the engine has been completely re-written at least twice since WoW launched. Former devs (who have actually seen and worked on the code) have confirmed it numerous times. It also doesn't change the fact that other games that are FAR newer (ESO, FF14, et al.) have the -exact same- issues.


    Hell, I can get double the FPS in Classic as I do on Live
    I can go quite a bit higher than that, even. If I uncap the framerate (which I don't do because it disables G-Sync) I can get well over 400fps at 4K with everything other than Draw Distance maxed (and that at 8) when plugged into my 4K TV; when I use DSR/scaling (as I play on a 1440p monitor), at 5K it's still over 350fps.

    It has to do with far less complexity in the models and far fewer draw calls (on the order of 75% fewer). As a warlock, Rain of Fire has about ~12 particles in Classic. On Live, if I respec so I can pick up Rain, it's got about 90 particles. It's a world of difference. (I've been playing Live the last 7 days so I can get the Frostwolf limited-time mount and the Deathwing mount.)

    But it (Classic) still gutters to a slideshow in large wPvP battles (~100+ players in the same area). In the TM/SS scrum, when it happens, I'm lucky to maintain ~40fps, and even then I get up to 2-3 seconds of "lag" (actually waiting on the network threads to process, as opposed to "real" latency).

  17. #17
    OP, if you have money to burn, then go wild with upgrading your rig. But if your wallet would feel the extra $1.5k pulled out of it, I would really suggest waiting for the next development cycle of CPUs and GPUs. The way AMD and Intel are at each other's throats, you might get some pretty decent performance bumps and monetary savings in the near future.

    Also, it is not your rig that is giving you trouble in WoW. I might understand the need to upgrade if you wanted to play other titles like RDR 2 at max settings and 4K, but WoW will chew through anything you throw at it and leave you lagging no matter what hardware you have, simply because each player on your screen seems to increase the load not linearly but exponentially. There is no sane amount of hardware you can throw at WoW if you want stable 144Hz+ at 1440p. You will have FPS drops below 144 even with multiple 2080 Tis in SLI, coupled with a 9700/9900 OCed to 6GHz.

    In fact, a switch from a 1070 to a 2080 Ti will likely result in about a 20% gain in average FPS, all else being equal. A CPU upgrade will net similar results. Basically, it is up to you whether you want to spend the better part of 2 grand for an overall ~50% boost in FPS. For me personally that is a bad deal. I would prefer to lower my settings a bit, especially if going down one notch in graphics quality would net similar results for free, without noticeably changing anything.
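
    Sketching out that last bit of math (the ~20% per-component gains are the rough estimates above, not benchmarks):

    ```python
    # Napkin math behind the "~50% overall" estimate (rough figures, not benchmarks).
    gpu_gain = 1.20   # 1070 -> 2080 Ti, average fps gain in WoW
    cpu_gain = 1.20   # 5930k -> OC'd 9700k/9900k, roughly similar

    combined = gpu_gain * cpu_gain
    print(f"combined uplift: +{(combined - 1) * 100:.0f}%")  # ~44%, i.e. 'about 50%'

    # Even that optimistic case falls well short of a locked 144 fps in raids:
    for raid_fps in (50, 70, 90):
        print(f"{raid_fps} fps today -> ~{raid_fps * combined:.0f} fps after ~$2k of upgrades")
    ```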
    PS: BTW, I am running a 3700x, a 1070 and 16GB of 3200MHz RAM. I've never had any problems in 25-man raids, but world boss fights with 40+ people turn everything into a lag fest pretty consistently, and there was no major correlation between graphics settings and lag that I could determine. As soon as the number of people exceeds a full raid (40), you start lagging even if your settings are on the lower side.
    Last edited by Gaaz; 2020-01-07 at 09:56 PM.

  18. #18
    Quote Originally Posted by Sconners88 View Post
    Absolutely baffling that a game that looks this shit can run this badly.
    Absolutely baffling that people still don't understand that it has nothing to do with how the game looks.

    It's about draw calls, nothing else, and in a secure client-server situation those are almost entirely bound to one thread (because in order for an object to have a draw call issued for it, the client/engine first has to receive information from the server about where it is). The recent multi-core enhancements revolved almost entirely around de-coupling spawned particles from the server (so when you cast Rain of Fire, the client only needs to receive the info for the center of the effect, and the 90+ spawned particles are then handed off to a separate thread for their draw calls to be processed), but any object whose location is server-dependent (which still includes lighting, since it can change with phasing) is still coupled to the main thread.
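
    A toy sketch of that split (this is not WoW's engine code, just the shape of the idea): objects whose position is server state have to be processed on the main thread, while locally spawned particles only need the effect's centre and can be handed to worker threads.

    ```python
    # Toy illustration of the main-thread / worker-thread split described above.
    # Not WoW's code: just server-bound objects handled serially, cosmetic
    # particles farmed out once the effect's centre is known.

    from concurrent.futures import ThreadPoolExecutor

    def draw_server_object(obj):
        # Needs this frame's authoritative position from the server,
        # so it stays on the main thread.
        return f"draw {obj['name']} at {obj['pos']}"

    def draw_particles(center, count):
        # Purely cosmetic children of an effect; only the centre is server state.
        return [f"particle {i} near {center}" for i in range(count)]

    def render_frame(server_objects, effects, pool):
        # Kick off particle work in parallel...
        jobs = [pool.submit(draw_particles, e["center"], e["particles"]) for e in effects]
        # ...while the main thread grinds through server-bound draw calls serially.
        calls = [draw_server_object(o) for o in server_objects]
        for job in jobs:
            calls.extend(job.result())
        return calls

    with ThreadPoolExecutor() as pool:
        frame = render_frame(
            server_objects=[{"name": f"player{i}", "pos": (i, 0)} for i in range(40)],
            effects=[{"center": (10, 10), "particles": 90}],
            pool=pool,
        )
        print(len(frame), "draw calls this frame")  # 40 + 90 = 130
    ```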

    It's almost like this topic has been discussed to death, here, on these forums, with actual developers weighing in.

    It's why FF14, which looks "better" (arguably; hyper-realism and 'high fidelity' age like shit), performs the same as WoW in the same situations. In the very few situations in FF14 where the game even allows you to see 30-40 players, it gutters into the 40s and 50s too.

    Because it isn't about the GPU.

    It's almost like different game types are different and bottlenecked in different ways.

  19. #19
    MORGATH99
    If you have the money, DO IT.

  20. #20
    Thanks for all the replies, guys.
    Pretty much what I was suspecting.
    Basically, I see no point in spending 1k-1.5k if I can't even get close to a sustainable 100 fps in raids/BGs.

    I'll wait for the next gen of CPUs/GPUs and reassess then. But knowing WoW, it probably won't change much.
