  1. #1

    How good is my PC, actually?

    So I built my PC about 2 years ago. I feel like I should be upgrading it over time, I just don't know where to start.
    It has an ASRock Z370 Killer SLI/ac with an i5-8600K on it, cooled by a Corsair water cooling kit from Best Buy. I have two 8GB sticks of RAM along with a 250GB SSD and a 2TB HDD. Lastly is my GTX 970 GPU.
    I know it's a decent build, but with how fast PC gaming advances I would like a push on which direction to upgrade first. I'm assuming the GPU, but I wanted to ask first. Thanks in advance for any input.

  2. #2
    Quote Originally Posted by activewheat View Post
    but with how fast pc gaming advances
    ... Glacially slow?

    You can still game acceptably at high framerates on a nearly 10 year old CPU.

    The only weak part of your rig is maybe the GPU (which is still fine for 1080p).

    Everything else is fine.

  3. #3
    Moderator chazus's Avatar
    10+ Year Old Account
    Join Date
    Nov 2011
    Location
    Las Vegas
    Posts
    17,222
    Are you having any issues with the system?

    Why do you feel a need to upgrade for any reason?
    Gaming: Dual Intel Pentium III Coppermine @ 1400mhz + Blue Orb | Asus CUV266-D | GeForce 2 Ti + ZF700-Cu | 1024mb Crucial PC-133 | Whistler Build 2267
    Media: Dual Intel Drake Xeon @ 600mhz | Intel Marlinspike MS440GX | Matrox G440 | 1024mb Crucial PC-133 @ 166mhz | Windows 2000 Pro

    IT'S ALWAYS BEEN WANKERSHIM | Did you mean: Fhqwhgads
    "Three days on a tree. Hardly enough time for a prelude. When it came to visiting agony, the Romans were hobbyists." -Mab

  4. #4
    Quote Originally Posted by Kagthul View Post
    ... Glacially slow?

    You can still game acceptably at high framerates on a nearly 10 year old CPU.

    The only weak part of your rig is maybe the GPU (which is still fine for 1080p).

    Everything else is fine.
    You can't even play WoW with all settings on 1 with the BiS CPU from 10 years ago. Yes, you can technically run the game, but the performance won't be anywhere close to acceptable.
    They're (short for They are) describes a group of people. "They're/They are a nice bunch of guys." Their indicates that something belongs/is related to a group of people. "Their car was all out of fuel." There refers to a location. "Let's set up camp over there." There is also no such thing as "could/should OF". The correct way is: Could/should'VE, or could/should HAVE.
    Holyfury armory

  5. #5
    Quote Originally Posted by ThrashMetalFtw View Post
    You can't even play WoW with all settings on 1 with the BiS CPU from 10 years ago. Yes, you can technically run the game, but the performance won't be anywhere close to acceptable.
    The first generation of quad-core i7 processors came out between Nov 2008 and Feb 2010. Those would still yield acceptable results if you turned some settings down. Not amazing, but acceptable.

  6. #6
    Quote Originally Posted by ThrashMetalFtw View Post
    You can't even play WoW with all settings on 1 with the BiS CPU from 10 years ago. Yes, you can technically run the game, but the performance won't be anywhere close to acceptable.
    On an i7-960? Ofc you can.
    R5 5600X | Thermalright Silver Arrow IB-E Extreme | MSI MAG B550 Tomahawk | 16GB Crucial Ballistix DDR4-3600/CL16 | MSI GTX 1070 Gaming X | Corsair RM650x | Cooler Master HAF X | Logitech G400s | DREVO Excalibur 84 | Kingston HyperX Cloud II | BenQ XL2411T + LG 24MK430H-B

  7. #7
    Quote Originally Posted by ThrashMetalFtw View Post
    You can't even play WoW with all settings on 1 with the BiS CPU from 10 years ago. Yes, you can technically run the game, but the performance won't be anywhere close to acceptable.
    And yet I have a Core 2 Quad sitting here with a newer GPU (GTX 660) and it runs WoW just fine at 1080p.

    I will admit to being slightly off, though, as I was speaking about the i5-2500K or i7-2600K, which, my bad, are only going on nine years old, not ten.

    Also, JayzTwoCents literally just did a video on a 10-year-old machine. A Core 2 Quad, I do believe, that they added a boot SSD to and threw in a 1050 Ti, and it runs a lot of games at 1080p/60, and newer games at 1080p/30 (which is playable, whether you like the way it looks or not).

    CPU performance hasn't been evolving very quickly; even Ryzen isn't (currently) a big leap forward. It's just AMD finally catching up to (and now just SLIGHTLY surpassing) where Intel has been for the last ~7 years.

    PC gaming doesn't evolve quickly. GPU technology does... but you don't need a high-end GPU to game at 1080p/60 at good settings. And you can, in a lot of cases, get by with even a 6+ year old rig just by upgrading to something like an RX 580 or 1660.

  8. #8
    Herald of the Titans Maruka's Avatar
    10+ Year Old Account
    Join Date
    Mar 2011
    Location
    Alberta
    Posts
    2,554
    At 1080p you are fine, especially if you only want 60Hz.

  9. #9
    Old God Vash The Stampede's Avatar
    10+ Year Old Account
    Join Date
    Sep 2010
    Location
    Better part of NJ
    Posts
    10,939
    I would consider that a badass PC to have today. The only thing holding it back is the GTX 970, but that's great for 1080p gaming, so unless you run games at 1440p or 4K, why upgrade? Also, isn't an i5-8600K fairly new? You won't get much faster with newer CPUs. Is it overclocked?

    The problem with upgrading a PC is there's no limit on how fast you can go. How fast you can go depends on how much money you have to spend. The CPU is fine, honestly, but the GPU is... complicated. The way I see it, either get an RTX 2060 or wait until next year when Nvidia/AMD/Intel release their new GPUs. The reason for an RTX 2060 is that it can do Ray-Tracing, and I know I've talked bad about Ray-Tracing in the past, but since the PS5 and Xbox Series X are going to have Ray-Tracing, that means all future games will probably require a Ray-Tracing capable GPU. I wouldn't get an RTX 2070 or 2080 because they do Ray-Tracing poorly and that money would be better spent on Nvidia's RTX 3000 series or whatever AMD is going to release next year. The RTX 2060 would just satisfy a requirement in future games and nothing more.

    I personally would be happy with that GTX 970 and wait and see what happens next year when mid-range GPUs aren't priced like high-end GPUs. As it is right now, the AMD RX 5700 is dropping rapidly in price because few people are buying them. You can find a 5700 for around $300; it was originally $380, which AMD dropped to $350 before release. The 5700 XT can be found below $400; it was supposed to be $500 and then dropped to $450 before release. Even the RTX 2060 can be found for $330 because demand isn't that high. Everything is just overpriced for no reason and selling poorly. Prices will inevitably drop.

  10. #10
    Please wait Temp name's Avatar
    10+ Year Old Account
    Join Date
    Mar 2012
    Location
    Under construction
    Posts
    14,631
    Your SSD is small and your GPU is mediocre. Everything else looks fine.

    I wouldn't upgrade unless you actually have issues with something

    - - - Updated - - -

    Quote Originally Posted by Vash The Stampede View Post
    The reason for an RTX 2060 is that it can do Ray-Tracing, and I know I've talked bad about Ray-Tracing in the past, but since the PS5 and Xbox Series X are going to have Ray-Tracing, that means all future games will probably require a Ray-Tracing capable GPU.
    No they won't. Thinking they will is like thinking all games in the past 6 years have required an 8-core CPU since the PS4 and Xbox 1 had those.

    Besides, since they won't release for another year, there's no reason to buy a 2060 now when in the next year we'll get the 3000 series launch, and AMD's next generation of GPU, which will also be featuring hardware ray-tracing (Since they're supplying the SOCs for the new consoles).
    Or you can do ray-tracing with software, like how Microsoft or Crytek are doing. Hell, Crytek's solution works quite well and has for half a year or so, if you've forgotten https://youtu.be/kGxqiw8UWns

    Or, since the games that release at the start of a console generation are either exclusives meant to ship hardware, or also able to be played on the last-gen consoles, none of the games he'd be able to play on PC for the first year or so of the consoles being out would require it either, meaning he could probably wait until the Nvidia 4000 series and just be a few months behind in his gaming library.

  11. #11
    Blizzard is making Shadowlands require DX12 to play the expansion. Intel is about $100 USD more to buy than AMD. AMD has rock-solid hardware for sale now. That pretty much sums up the experience, though I've seen AMD drivers better in the past.
    Last edited by Naiattavain; 2019-12-14 at 06:27 AM.
    “Choose a job you love and you'll never have to work a day in your life” “Logic will get you from A to Z; Imagination will get you everywhere.”

  12. #12
    My rig does fine at 1080p and 60Hz. When the time comes that I can't play something I want to, then I'll upgrade. It will be all AMD again.

    - - - Updated - - -

    Quote Originally Posted by Proper Ninja View Post
    Blizzard is making Shadowlands require DX12 to play the expansion. Intel is about $100 USD more to buy than AMD. AMD has rock-solid hardware for sale now. That pretty much sums up the experience, though I've seen AMD drivers better in the past.
    The last I heard DX12 will not be required. Recommended but not required. Has that changed?
    Desktop ------------------------------- Laptop- Asus ROG Zephyrus G14
    AMD Ryzen 5 5600X CPU ---------------AMD Ryzen 9 6900HS with Radeon 680M graphics
    AMD RX 6600XT GPU -------------------AMD Radeon RX 6800S discrete graphics
    16 GB DDR4-3200 RAM ----------------16 GB DDR5-4800 RAM
    1 TB WD Black SN770 NVMe SSD ------1 TB WD Black SN850 NVMe SSD

  13. #13
    A 9700K is about as good as you can get for gaming right now. I have a PC with an 8700K and an 8086K, and they both basically match the 9700K.

    Overclock that 8600K to a solid 4.8GHz at least and you'll have almost no reason to upgrade during the next console cycle, CPU-wise. I'd be surprised if the CPU in the new consoles would be able to outpace your 8600K. Your weak link is the 970, but it's still fine for 1080p. If it does what you want it to do, then keep it and wait until you actually need one.

  14. #14
    Quote Originally Posted by Dch48 View Post
    My rig does fine at 1080p and 60Hz. When the time comes that I can't play something I want to, then I'll upgrade. It will be all AMD again.

    - - - Updated - - -



    The last I heard DX12 will not be required. Recommended but not required. Has that changed?
    DX12 is required to run the game. This is for Shadowlands only, so I guess they are cutting everyone out from Windows 7.
    “Choose a job you love and you'll never have to work a day in your life” “Logic will get you from A to Z; Imagination will get you everywhere.”

  15. #15
    Quote Originally Posted by Proper Ninja View Post
    DX12 is required to run the game. This is for Shadowlands only, so I guess they are cutting everyone out from Windows 7.
    Don't think they need to cut off Win 7. Win 7 already got DirectX 12 support in World of Warcraft with the 8.2 patch, I think?

  16. #16
    Old God Vash The Stampede's Avatar
    10+ Year Old Account
    Join Date
    Sep 2010
    Location
    Better part of NJ
    Posts
    10,939
    Quote Originally Posted by Temp name View Post
    No they won't. Thinking they will is like thinking all games in the past 6 years have required an 8-core CPU since the PS4 and Xbox 1 had those.
    Games don't require 8 cores because the consoles' CPUs are so weak that a dual-core Intel Pentium would outperform them. Games recently are getting more into needing more threads to perform better, but it's not something that's essential. GPU features are a different story, as developers aren't going to put in a Ray-Tracing feature and then allow you to disable it just because. AMD and Nvidia will back this up because it'll sell GPUs, as they've done in the past with DX8/DX9/DX10/DX11, etc.
    Besides, since they won't release for another year, there's no reason to buy a 2060 now when in the next year we'll get the 3000 series launch, and AMD's next generation of GPU, which will also be featuring hardware ray-tracing (Since they're supplying the SOCs for the new consoles).
    I'm just saying, if you need something better than a... R9 290/290X, GTX 970/980, R9 390/390X, GTX 1060 6GB/3GB, RX 470/480/570/580/590... "Breathes in heavily"... GTX 1660/1650/1660 Ti/1660 Super, and now the RX 5500/5500 XT. Anyone here notice a problem in that the industry has been pumping out 1080p graphics cards since 2013/2014 for around $200-$350? Anyway, anything better than these cards is a waste of money if it isn't doing Ray-Tracing, and Nvidia's RTX cards are horrible at Ray-Tracing and overpriced to oligopoly levels. Stick to the plethora of 1080p cards that exist in the market or go RTX 2060, and I recommend sticking to those 1080p cards until next year, until after Nvidia launches their RTX 3000 series as well as AMD's and Intel's new GPUs.
    Or you can do ray-tracing with software, like how Microsoft or Crytek are doing. Hell, Crytek's solution works quite well and has for half a year or so, if you've forgotten https://youtu.be/kGxqiw8UWns
    I personally like Crytek's solution, as I've run it on my machine, which currently has an R9 Fury, and it runs fantastically. It doesn't make use of DX12 or Vulkan, which suggests there's room for improvement as well. But without knowing what direction the industry will take for Ray-Tracing, we won't know what's needed to play future titles.
    Or, since the games that release at the start of a console generation are either exclusives meant to ship hardware, or also able to be played on the last-gen consoles, none of the games he'd be able to play on PC for the first year or so of the consoles being out would require it either, meaning he could probably wait until the Nvidia 4000 series and just be a few months behind in his gaming library.
    Maybe, but you know there's going to be a few games that will require Ray-Tracing.

    Quote Originally Posted by acphydro View Post
    I'd be surprised if the CPU in the new consoles would be able to outpace your 8600K. Your weak link is the 970, but it's still fine for 1080p. If it does what you want it to do, then keep it and wait until you actually need one.
    My prediction is that the PS5 and Xbox Series X will probably cut the L3 cache out to save money. No L3 cache means nowhere near the IPC of today's CPUs. I'd even go as far as to say they'll be slower than an i5-2500K in terms of IPC.

  17. #17
    Please wait Temp name's Avatar
    10+ Year Old Account
    Join Date
    Mar 2012
    Location
    Under construction
    Posts
    14,631
    Quote Originally Posted by Vash The Stampede View Post
    Games don't require 8 cores because the consoles' CPUs are so weak that a dual-core Intel Pentium would outperform them. Games recently are getting more into needing more threads to perform better, but it's not something that's essential. GPU features are a different story, as developers aren't going to put in a Ray-Tracing feature and then allow you to disable it just because. AMD and Nvidia will back this up because it'll sell GPUs, as they've done in the past with DX8/DX9/DX10/DX11, etc.
    And they're all backwards compatible on PC a couple generations at least. We're still getting games released now that work on DX10/11 when DX12 has been out for years.

    I'm just saying, if you need something better than a... R9 290/290X, GTX 970/980, R9 390/390X, GTX 1060 6GB/3GB, RX 470/480/570/580/590... "Breathes in heavily"... GTX 1660/1650/1660 Ti/1660 Super, and now the RX 5500/5500 XT. Anyone here notice a problem in that the industry has been pumping out 1080p graphics cards since 2013/2014 for around $200-$350? Anyway, anything better than these cards is a waste of money if it isn't doing Ray-Tracing, and Nvidia's RTX cards are horrible at Ray-Tracing and overpriced to oligopoly levels. Stick to the plethora of 1080p cards that exist in the market or go RTX 2060, and I recommend sticking to those 1080p cards until next year, until after Nvidia launches their RTX 3000 series as well as AMD's and Intel's new GPUs.
    Nvidia's cards aren't exactly bad at raytracing; it's just that raytracing is REALLY hard to do in real time (Seriously, do you know how much math goes into it?). Movies and similar have used ray-tracing for decades, and they take minutes to hours per frame. Nvidia managed to get some consumer-grade cards out that can do 50-60 frames a second. That's still impressive. If you really think they're bad at raytracing, you don't know what you're talking about.
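    A rough back-of-the-envelope count, assuming just one primary ray per pixel (real scenes need several rays per pixel for shadows, reflections and bounce lighting, so this is a floor, not a ceiling):
        1920 x 1080 pixels ≈ 2.07 million pixels per frame
        2.07 million pixels x 60 fps ≈ 124 million primary rays per second
        add a few secondary rays per pixel and you're quickly into the billions of ray-triangle intersection tests per second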

    I personally like Crytek's solution, as I've run it on my machine, which currently has an R9 Fury, and it runs fantastically. It doesn't make use of DX12 or Vulkan, which suggests there's room for improvement as well. But without knowing what direction the industry will take for Ray-Tracing, we won't know what's needed to play future titles.
    Probably the direction that doesn't include raytracing. Or, for games that come to PC, with the option to turn it off. With how much of the general game-playing public still doesn't have anything nearly powerful enough to do it, releasing a game that doesn't have the option to turn it off is suicide for your game.

    Maybe, but you know there's going to be a few games that will require Ray-Tracing.
    Yes, exclusives that are designed to showcase how pretty the games can look and how powerful the hardware is. Because, again, requiring such a high-end feature would be suicide for your game in the PC market. Crysis, when it launched, still had decent enough options that you could get it playable on a shitter computer, even if it was the high-end benchmark game for years.

  18. #18
    Old God Vash The Stampede's Avatar
    10+ Year Old Account
    Join Date
    Sep 2010
    Location
    Better part of NJ
    Posts
    10,939
    Quote Originally Posted by Temp name View Post
    Nvidia's cards aren't exactly bad at raytracing; it's just that raytracing is REALLY hard to do in real time (Seriously, do you know how much math goes into it?). Movies and similar have used ray-tracing for decades, and they take minutes to hours per frame. Nvidia managed to get some consumer-grade cards out that can do 50-60 frames a second. That's still impressive. If you really think they're bad at raytracing, you don't know what you're talking about.
    They're bad because once you turn on Ray-Tracing the frame rate drops dramatically, which suggests that the RTX cards may not have as much dedicated hardware for Ray-Tracing as we thought. The Crytek demo makes sense since you're not using dedicated hardware but the entire GPU for Ray-Tracing. This sort of thing happened before with Nvidia and DX10 where turning on shadows would destroy your frame rate, but then AMD comes out with DX10.1 cards and their version of shadows has minimal impact on performance. I'm kinda thinking that's what will happen with Ray-Tracing.
    Probably the direction that doesn't include raytracing. Or, for games that come to PC, with the option to turn it off. With how much of the general game-playing public still doesn't have anything nearly powerful enough to do it, releasing a game that doesn't have the option to turn it off is suicide for your game.
    Once the PS5 and Xbox Series X are released, a lot of us aren't going to have a GPU as fast as these consoles, if the Steam hardware survey is anything to go by. It depends on the games released, but most games are going to demand more than what a GTX 970 or RX 580 can produce.
    Yes, exclusives that are designed to showcase how pretty the games can look and how powerful the hardware is. Because, again, requiring such a high-end feature would be suicide for your game in the PC market. Crysis, when it launched, still had decent enough options that you could get it playable on a shitter computer, even if it was the high-end benchmark game for years.
    The PC market has always had games that pushed the line in hardware requirements. Doom, Quake, Quake 3, and even Doom 3 all pushed people to buy better hardware than most people had at the time. Half-Life 2 was responsible for the death of the GeForce FX cards, as they couldn't do proper DX9, though it did offer DX8 and even DX6 modes to play the game. Crysis failed because at the time of that game's release the cost of a good graphics card was easily $500+. A GeForce 8800 GTX was $570 while the Radeon HD 2900 XT was $400. The Xbox 360 and PS3 looked pretty enticing at that time. I had a Radeon X1950 Pro, which was cheap at the time, and had no problem playing Crysis, but like today, most people aim at Nvidia hardware. Today we're used to the idea that the best graphics cards are $600+. The MSRP of an ATI Radeon 9700 Pro was $400 back in 2002, while today for that much you get an RTX 2060, which is not considered midrange, or so says Nvidia.

  19. #19
    It's fine. Upgrade the GPU next gen. If you want to spend now, get a 1TB SSD for about £100.

  20. #20
    Please wait Temp name's Avatar
    10+ Year Old Account
    Join Date
    Mar 2012
    Location
    Under construction
    Posts
    14,631
    Quote Originally Posted by Vash The Stampede View Post
    They're bad because once you turn on Ray-Tracing the frame rate drops dramatically, which suggests that the RTX cards may not have as much dedicated hardware for Ray-Tracing as we thought. The Crytek demo makes sense since you're not using dedicated hardware but the entire GPU for Ray-Tracing. This sort of thing happened before with Nvidia and DX10 where turning on shadows would destroy your frame rate, but then AMD comes out with DX10.1 cards and their version of shadows has minimal impact on performance. I'm kinda thinking that's what will happen with Ray-Tracing.
    Maybe. We don't know.

    Once the PS5 and Xbox Series X are released, a lot of us aren't going to have a GPU as fast as these consoles, if the Steam hardware survey is anything to go by. It depends on the games released, but most games are going to demand more than what a GTX 970 or RX 580 can produce.
    We don't know what kind of hardware is in them yet though.

    The PC market has always had games that pushed the line in hardware requirements. Doom, Quake, Quake 3, and even Doom 3 all pushed people to buy better hardware than most people had at the time. Half-Life 2 was responsible for the death of the GeForce FX cards, as they couldn't do proper DX9, though it did offer DX8 and even DX6 modes to play the game. Crysis failed because at the time of that game's release the cost of a good graphics card was easily $500+. A GeForce 8800 GTX was $570 while the Radeon HD 2900 XT was $400. The Xbox 360 and PS3 looked pretty enticing at that time. I had a Radeon X1950 Pro, which was cheap at the time, and had no problem playing Crysis, but like today, most people aim at Nvidia hardware. Today we're used to the idea that the best graphics cards are $600+. The MSRP of an ATI Radeon 9700 Pro was $400 back in 2002, while today for that much you get an RTX 2060, which is not considered midrange, or so says Nvidia.
    Just saying, you're starting to sound a lot like the guy in the Stadia thread saying that Stadia is the future.

    Raytracing might be the future, but the Xbox Series X and PS5 won't bring it. They'd need a GPU that can ray-trace better than a 2080 Ti. Which, nah, ain't fucking happening. Even ignoring the jacked-up prices, the x80 Ti series goes for 700 dollars, and with them, we're looking at 1,200. You're expecting that kind of performance, or better, in a box that costs 400?
    Or hell, let's just say that AMD can magically bring out something that'll perform 4x better at the same cost (which seems unlikely given their current offerings of standalone PC GPUs, but let's assume). Congrats, now you have a card at 700-1,200 dollars that can do raytracing at 4K/60fps (a 2080 Ti can do RTX at 1080p/60, so 4x the performance means you can do it at 4K/60). And you still need to cram it into a box that sells for 400. In order for it to make sense, they'd need to beat the 2080 Ti's performance per dollar in ray-tracing applications about 12 times over, at minimum, otherwise they won't be able to get it cheap enough to slap it in a console.
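    For anyone who wants the arithmetic behind that "about 12 times" figure, here's a rough sketch using the jacked-up $1,200 price from above (with the $700 MSRP it works out to roughly 7x instead):
        2080 Ti: 1x ray-tracing performance for ~$1,200 -> perf per dollar ≈ 1/1200
        console: needs ~4x that performance (4K/60 vs 1080p/60) in a ~$400 box -> perf per dollar ≥ 4/400 = 1/100
        required improvement ≈ (1/100) / (1/1200) = 12x, and that's before the CPU, RAM, storage and margins eat into the $400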
