  1. #1

    Looking to increase my FPS in raids on a budget.

    Anything I can do here?
    I'm a bit disappointed with 20-35 FPS in raids. My graphics slider is at 7 (the recommended setting). I could bump this down, but I would really prefer playing at a decent graphics level. I'm playing at a resolution of 3440x1440 (ultrawide).
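
    For scale, here's a quick pixel-count comparison (raw pixel counts only; it says nothing about CPU load):

    # Back-of-the-envelope: how many pixels the GPU has to push per frame
    # at each resolution, relative to plain 1080p.
    resolutions = {
        "1080p (1920x1080)": 1920 * 1080,
        "1440p (2560x1440)": 2560 * 1440,
        "ultrawide (3440x1440)": 3440 * 1440,
    }
    baseline = resolutions["1080p (1920x1080)"]
    for name, pixels in resolutions.items():
        print(f"{name}: {pixels:,} pixels ({pixels / baseline:.2f}x 1080p)")
    # The ultrawide works out to ~4.95 million pixels, about 2.39x 1080p.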

    PC Specs:
    i5-2500K @ 4.2GHz
    8GB RAM
    GTX 970

    I only have about $250-$300 to spend right now, so that really limits me.

    Should I upgrade to an i7 or just tough it out and save for a new rig in a couple years?

    Someone on Craigslist in my area has another 970 for sale for $150, if it's worth going SLI (my mobo has three PCIe slots that run at x16, if that even means anything anymore).

  2. #2
    Moderator chazus | Join Date: Nov 2011 | Location: Las Vegas | Posts: 17,222
    At that resolution there's honestly not much you can do. The 2500K is starting to get long in the tooth; upgrading to an overclocked 7600K will at least give you some boost, but nothing massive. It ought to bump things up to 35-45 FPS instead of 20-35, I imagine.

    An i7 or a better GPU probably isn't going to benefit you at all.
    Gaming: Dual Intel Pentium III Coppermine @ 1400mhz + Blue Orb | Asus CUV266-D | GeForce 2 Ti + ZF700-Cu | 1024mb Crucial PC-133 | Whistler Build 2267
    Media: Dual Intel Drake Xeon @ 600mhz | Intel Marlinspike MS440GX | Matrox G440 | 1024mb Crucial PC-133 @ 166mhz | Windows 2000 Pro

    IT'S ALWAYS BEEN WANKERSHIM | Did you mean: Fhqwhgads
    "Three days on a tree. Hardly enough time for a prelude. When it came to visiting agony, the Romans were hobbyists." -Mab

  3. #3
    You will gain 5 FPS at most by upgrading, since you are already at 4.2GHz.

    Lower the goddamn graphics settings.

    The graphics sliders in EVERY GAME are a noob trap.

    Nothing beyond the Projected Textures and Particle Density settings is needed when you are raiding. Just keep Texture Resolution at High so the game doesn't look like dog shit, and Anisotropic Filtering at 16x for smoother lines. With CMAA the game looks just fine, and you raid at 40 FPS instead of 25.
    Last edited by potis; 2017-07-14 at 01:43 PM.

  4. #4
    Deleted
    Quote Originally Posted by mashanu View Post
    Anything I can do here?
    I'm a bit disappointed with 20-35 FPS in raids. [...] Should I upgrade to an i7 or just tough it out and save for a new rig in a couple years?
    I know what you mean.

    I switched to a GTX 1080 before buying my X34A.

    I'm now upgrading to a 7700K from my 3570K. I'm bottlenecked by the CPU in games like The Witcher 3, ME:A, Ghost Recon Wildlands, etc.

  5. #5
    Quote Originally Posted by Atirador View Post
    I know what you mean.

    I switched to a GTX 1080 before buying my X34A.

    I'm now upgrading to a 7700K from my 3570K. I'm bottlenecked by the CPU in games like The Witcher 3, ME:A, Ghost Recon Wildlands, etc.
    The CPU is not making much of a difference in those games. Look at the benchmarks:
    https://www.techspot.com/review/1006...rks/page5.html

    Going from a 2500K to a 4690K makes a whopping 4 FPS difference. Big whoopdedoo. The CPU does not have as much of an effect on gaming as people like to think. The chart below that one shows that clock speed makes no difference at all: an i7-4790K performs the same at 2.5GHz as at 4.5GHz, so it's not even clock speed making the difference. I guess IPC makes some difference, but the gain in IPC from generation to generation is so small it's not worth it. I still say that if you have a 2500K or newer CPU, upgrading will not make a noticeable difference at all.
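
    If it helps, here's a toy model of why clock speed can show zero difference; the millisecond figures are made-up illustrative numbers, not benchmark data:

    def fps(cpu_ms, gpu_ms):
        """Toy model: CPU and GPU work overlap, so the slower side sets the frame time."""
        return 1000.0 / max(cpu_ms, gpu_ms)

    GPU_MS = 16.0  # hypothetical GPU-bound scenario: the GPU needs 16 ms per frame

    # Halving the CPU's share of the frame (e.g. a much faster clock) changes
    # nothing while the GPU is still the limiting factor:
    print(fps(cpu_ms=10.0, gpu_ms=GPU_MS))  # 62.5 FPS
    print(fps(cpu_ms=5.0, gpu_ms=GPU_MS))   # still 62.5 FPS

    # Only once the CPU takes longer than the GPU does it start to matter:
    print(fps(cpu_ms=25.0, gpu_ms=GPU_MS))  # 40.0 FPS -- now CPU-bound, like a WoW raid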

  6. #6
    Moderator chazus | Join Date: Nov 2011 | Location: Las Vegas | Posts: 17,222
    Quote Originally Posted by Atirador View Post
    I switched to a GTX 1080 before buying my X34A.

    I'm now upgrading to a 7700K from my 3570K. I'm bottlenecked by the CPU in games like The Witcher 3, ME:A, Ghost Recon Wildlands, etc.
    However that's a completely different situation.

    A 970 or 1080 won't bottleneck on a 2500K with WoW. If it were another game, that would be different.
    Gaming: Dual Intel Pentium III Coppermine @ 1400mhz + Blue Orb | Asus CUV266-D | GeForce 2 Ti + ZF700-Cu | 1024mb Crucial PC-133 | Whistler Build 2267
    Media: Dual Intel Drake Xeon @ 600mhz | Intel Marlinspike MS440GX | Matrox G440 | 1024mb Crucial PC-133 @ 166mhz | Windows 2000 Pro

    IT'S ALWAYS BEEN WANKERSHIM | Did you mean: Fhqwhgads
    "Three days on a tree. Hardly enough time for a prelude. When it came to visiting agony, the Romans were hobbyists." -Mab

  7. #7
    Deleted
    Quote Originally Posted by chazus View Post
    However that's a completely different situation.

    A 970 or 1080 won't bottleneck on a 2500K with WoW. If it were another game, that would be different.
    Yeah, I didn't mean WoW. I upgraded for other main games.

  8. #8
    Quote Originally Posted by Atirador View Post
    Yeah, I didn't mean WoW. I upgraded for other main games.
    And I linked benchmarks showing that in one of those games, the CPU still doesn't make much of a difference. In that particular game, overclocking also makes zero difference.

    Most of the time, though, when people make an upgrade like their CPU, they do notice a difference. In my experience, this comes from one of two things.

    One: usually, with a new CPU/mobo/RAM you do a clean install of Windows, oftentimes moving to a newer version of Windows. A clean install by itself can give an FPS boost, and if it's a newer, more efficient version of Windows, the boost is even bigger (e.g. going from Vista to 10).

    Two: people expect an increase, so they see an increase, even if it's not there. I had a friend who built a new computer for better performance in a game. I explained to him that the upgrade he had planned (CPU/mobo/RAM) would not have any drastic effect on his FPS. He did not believe me, purchased the parts anyway, built his system, moved his GPU over, installed Windows and the game, and started playing. He saw a difference. He called me to say "I told you so." I went over, hooked up his old machine, moved his GPU over, watched him play for a bit, and actually monitored FPS. Then I moved the GPU back to the new machine and did the same thing. Unsurprisingly to me, there was no difference. After that he admitted that, yeah, he saw no difference anymore. He had imagined the difference. It was simply his subconscious helping him justify his purchase.
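
    For anyone who wants to run that test themselves instead of eyeballing it, a rough sketch (the log file names are placeholders; use whatever your FPS logger writes, one sample per line):

    import statistics

    def summarize(path):
        """Return (average FPS, 1% low FPS) from a log with one FPS sample per line."""
        with open(path) as f:
            samples = sorted(float(line) for line in f if line.strip())
        worst = samples[: max(1, len(samples) // 100)]  # the lowest 1% of samples
        return statistics.fmean(samples), statistics.fmean(worst)

    for label, path in (("old machine", "old_rig.txt"), ("new machine", "new_rig.txt")):
        avg, low = summarize(path)
        print(f"{label}: {avg:.1f} avg FPS, {low:.1f} 1% low")

    That turns "I saw a difference" into two numbers you can actually compare across machines.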

  9. #9
    Deleted
    Quote Originally Posted by Lathais View Post
    And I linked benchmarks showing that in one of those games, the CPU still doesn't make much of a difference. In that particular game, overclocking also makes zero difference. [...]
    I was talking about a GTX 1080 upgrade. I went from a 970 to a 1080, monitored FPS before and after the upgrade, and it's a totally different world, especially at 3440x1440 (not in WoW).



  10. #10
    Turn off your shadows!
    Lower your draw distance!

    What's your AA set to?

  11. #11
    You can't afford a worthwhile upgrade at the moment, since the CPU, motherboard, and RAM are the only things you can change to boost performance in WoW. Wait until you can afford a new K-series i5, or an i3 (if you only play WoW), and then you'll have to get a new mobo + DDR4 RAM too.

  12. #12
    The Lightbringer MrPaladinGuy | Join Date: Feb 2012 | Location: Wherever the pizza is | Posts: 3,278
    Quote Originally Posted by Atirador View Post
    I know what you mean.

    I switched to a GTX 1080 before buying my X34A.

    I'm now upgrading to a 7700K from my 3570K. I'm bottlenecked by the CPU in games like The Witcher 3, ME:A, Ghost Recon Wildlands, etc.
    Towns in TW3 are notorious for showing CPU bottlenecks provided you have a sufficient GPU.

    Edit - Even if your max framerate doesn't drastically increase, your minimum will.
    Last edited by MrPaladinGuy; 2017-07-15 at 12:05 AM.
    10850k (10c 20t) @ all-core 5GHz @ 1.250v | EVGA 3080 FTW3 Ultra Gaming | 32GB DDR4 3200 | 1TB M.2 OS/Game SSD | 4TB 7200RPM Game HDD | 10TB 7200 RPM Storage HDD | ViewSonic XG2703-GS - 27" IPS 1440p 165Hz Native G-Sync | HP Reverb G2 VR Headset

  13. #13
    Quote Originally Posted by Atirador View Post
    I was talking about a GTX 1080 upgrade.
    Umm, really now.

    Quote Originally Posted by Atirador View Post
    I'm now upgrading to a 7700K from my 3570K. I'm bottlenecked by the CPU in games like The Witcher 3, ME:A, Ghost Recon Wildlands, etc.

    Ok.

  14. #14
    idk, my 1070 is bottlenecking my 6700K in WoW, but the main FPS culprits for me are draw distance, resolution scale, and stupid AA.

  15. #15
    Stood in the Fire mojo6912 | Join Date: Mar 2011 | Location: USA | Posts: 433
    overclock moar!~

    ... and turn settings down.

  16. #16
    I may be wrong, but it seems it might be the motherboard cockblocking him; I'm betting he's on slow DDR3.

  17. #17
    Quote Originally Posted by Zeta333 View Post
    I may be wrong, but it seems it might be the motherboard cockblocking him; I'm betting he's on slow DDR3.
    Which has zero impact on performance. The difference from DDR3-1333 to DDR3-3000 in most tests is within the margin of error. Even with Skylake and Kaby Lake (which are more sensitive to RAM speed), the difference between DDR4-2133 and 3600+ is single-digit framerates, and it only shows up -at all- in a very few games that really stress the RAM (GTA 5, for instance).

    Lathais is doing a fine job leading the charge here, but I'll throw my weight behind him - games are NOT as intensive on a lot of components as people think they are. VERY few games are CPU-bound unless you're running some 1st-gen Core chip or FX-series clunker. IPC gains have been minimal, and don't always translate into ANY real-world performance gains in games. RAM speed, until Ryzen, had almost no effect on games, and even with Ryzen you're not exactly seeing double-digit framerate increases (OCing the CPU will do you a lot better). Dual-channel RAM (or quad-channel) is largely meaningless outside of VERY specific use cases and has no impact on gaming. Loads of RAM don't really help you (with gaming).

    People OFTEN believe they're going to see some miracle gains, but in most cases... nope. It's pure placebo effect. For purely gaming, there is little reason to consider upgrading if you have a 2500/2600K, or any 3-, 4- or 6-series chip... and there's not likely to be one with Coffee Lake either.

    There may be OTHER reasons to upgrade (USB 3/3.1/Type-C, Thunderbolt, M.2 support, or what have you, or maybe you need more cores for video/audio work, etc.) - but for purely gaming? Nope.
    Last edited by Kagthul; 2017-07-15 at 05:29 AM.

  18. #18
    Quote Originally Posted by Kagthul View Post
    Which has zero impact on performance. The difference from DDR3-1333 to DDR3-3000 in most tests is within the margin of error. [...] For purely gaming, there is little reason to consider upgrading if you have a 2500/2600K, or any 3-, 4- or 6-series chip. [...]
    So then it's just a case of 4K really needing top-end equipment to max out FPS? I would think he would do fine at 2K or a normal res.

  19. #19
    Quote Originally Posted by Zeta333 View Post
    So then it's just a case of 4K really needing top-end equipment to max out FPS? I would think he would do fine at 2K or a normal res.
    By and large, hardware is JUST getting to where real 4K gaming is viable.

    The 1080 Ti was the first single-card solution for 4K/Ultra/60 FPS in most games. Before that you had to rely on SLI or CrossFire, and a lot of the time games don't even support those (which is getting more and more common, not less; multi-GPU is really on its way out).

    Games with a lot of things going on on the screen at once (lots of objects) will be more CPU-bound (like WoW), but 4K isn't really the issue.

    What's really hurting him on that 3440x1440 screen is that (because WoW supports ultrawides) he's actually seeing more stuff on screen.

    Ironically, if he were on a traditional 16:9 screen, even a 4K one, he most likely wouldn't be having these issues. Having the extra width means that his CPU has to issue draw calls for more objects, because he can literally see more than someone playing at 16:9.

    That's likely what is dragging his framerate down.
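
    To put rough numbers on "literally see more": WoW widens the horizontal FOV on ultrawides while the vertical FOV stays put. A quick sketch; the 60-degree vertical FOV is just an assumed round figure, not WoW's actual value:

    import math

    def horizontal_fov(vertical_fov_deg, aspect):
        """Horizontal FOV for a given vertical FOV and aspect ratio (hor+ scaling)."""
        half_v = math.radians(vertical_fov_deg) / 2
        return math.degrees(2 * math.atan(math.tan(half_v) * aspect))

    v_fov = 60.0  # assumed vertical FOV for illustration only
    print(horizontal_fov(v_fov, 16 / 9))       # ~91.5 degrees at 16:9
    print(horizontal_fov(v_fov, 3440 / 1440))  # ~108.1 degrees at 3440x1440
    # The view plane is (3440/1440) / (16/9) - 1 = ~34% wider, and the CPU has to
    # issue draw calls for everything visible in that extra slice of the world.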

  20. #20
    Quote Originally Posted by Kagthul View Post
    By and large, hardware is JUST getting to where real 4K gaming is viable. [...] Games with a lot of things going on on the screen at once (lots of objects) will be more CPU-bound (like WoW), but 4K isn't really the issue. [...]
    You'd almost be correct, except WoW's FPS drops drastically when there are more PLAYERS and SPELLS in action, rather than just more "objects". This is also NOT true for other games, since they aren't coded the same way; in some games you'll find your GPU stretched to its limits simply by particle effects in your face and hundreds, if not thousands, of time-keyed events. Typical FPS games, even ones with multitudes of players in combat, are GPU-bound because everything else is fairly fixed: damage output has a very specific scaling, movement speeds are set, etc. However, there are still FPS games that tax your CPU (*cough* CS:GO) for every extra frame they can gobble down. So generalizing from one game's world to another is a terrible basis for a theory.
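
    If you want to see how that plays out, here's a toy model in the same spirit as the frame-time sketch earlier in the thread; every per-unit cost below is an invented illustrative number, not a measurement:

    def frame_ms(players, spells, cpu_base=4.0, per_player=0.5, per_spell=0.1, gpu_ms=14.0):
        """Toy raid model: CPU cost grows with players/spells; GPU cost is roughly fixed."""
        cpu_ms = cpu_base + players * per_player + spells * per_spell
        return max(cpu_ms, gpu_ms)  # the slower side sets the frame time

    for players in (5, 10, 20, 40):
        ms = frame_ms(players, spells=players * 5)
        print(f"{players:2d} players: {1000 / ms:5.1f} FPS")
    # 5-10 players sit at the GPU limit (~71 FPS); by 20-40 players the CPU dominates
    # (~42 and ~23 FPS), which is why raid FPS tanks while the open world runs fine.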
