1. #1

    Worthwhile to upgrade to a 3060 or just wait to buy new PC?

    I have an ancient PC (built in 2011 and somehow still kicking). I am considering upgrading the GPU to a 3060 or 3060 Ti before I get a new PC. Here are the specs:

    Motherboard: Gigabyte Z68X-UD7-B3
    CPU: i7-2600K @ 3.4 GHz
    RAM: 32 GB
    GPU: Radeon RX 580 (originally had a GeForce GTX 580, but it died a few years ago)
    PSU: 800W

    If I get it, I figure I'll just put it in the new PC eventually. From what I've seen it looks like it would be compatible, but I'm not sure how much of its power I'll get out of it since the motherboard is so old.

  2. #2
    Quote Originally Posted by Ghier View Post
    I have an ancient PC (built in 2011 and somehow still kicking). I am considering upgrading the GPU to a 3060 or 3060 Ti before I get a new PC. Here are the specs:

    Motherboard: Gigabyte Z68X-UD7-B3
    CPU: i7-2600K @ 3.4 GHz
    RAM: 32 GB
    GPU: Radeon RX 580 (originally had a GeForce GTX 580, but it died a few years ago)
    PSU: 800W

    If I get it, I figure I'll just put it in the new PC eventually. From what I've seen it looks like it would be compatible, but I'm not sure how much of its power I'll get out of it since the motherboard is so old.
    I would hold off. AMD is releasing its new Zen 4 CPUs and its new Radeon 7000 GPUs in the 3rd/4th quarter of this year. NVIDIA is also releasing its new RTX 4000-series GPUs, probably in the 4th quarter, and Intel's new 13th-gen Raptor Lake CPUs were just pushed back to early 2023, I believe.

    Either way, especially for GPUs, the new AMD 7000 and NVIDIA 4000 series are looking to be monsters compared to even the top-end GPUs out now.

    I'm in the same boat as you, still running an Intel Sandy Bridge i7-2600K, 16 GB of memory, and an AMD Vega 64, but I'm holding out until the new stuff comes out. Your 3060 would likely be bottlenecked by your 2600K anyway.

  3. #3
    If you want to play newer games or games at higher fidelity (especially 1440p+), then it will be a sizeable upgrade. If you can get it (3060 or Ti) for MSRP or near MSRP, great. You'll be able to utilize it quite often, even if your CPU can and will bottleneck you. The gains will still be vastly better than the RX 580. As for the GPUs coming out later this year: don't worry about it.

    You either get lucky and get a reference card, or overpay to the scalpers who hoard them and then have to wait until third-party cards release (and hope those don't get scalped like the last two generations were). So: do what you want, but I'd suggest getting it now and selling it later if you really want to get a newer GPU that soon.

    Edit: Driver support is also horrendous at launch and for the first 3-6 months on any platform, from either vendor - especially if you don't play mainstream games.

  4. #4
    Strawberry
    I sold my RTX 3080 in April and thought I'd live with the integrated GPU until next gen.
    It didn't last.
    So I dug around and found a used RTX 3060 for €400. I could've paid much less (around €300), but I wanted a white one for my white PC.
    I'm impressed with that card! It gives me an acceptable framerate at 4K and max details in all the games I play.

    TL;DR:
    If you can find it cheap from someone who is offloading because of next gen, I'd go for it. If you want it new, I'd wait; prices are still high, at least in Europe.

  5. #5
    Quote Originally Posted by PenguinChan View Post
    If you want to play newer games or games at higher fidelity (especially 1440p+), then it will be a sizeable upgrade. If you can get it (3060 or Ti) for MSRP or near MSRP, great. You'll be able to utilize it quite often, even if your CPU can and will bottleneck you. The gains will still be vastly better than the RX 580. As for the GPUs coming out later this year: don't worry about it.

    You either get lucky and get a reference card, or overpay to the scalpers who hoard them and then have to wait until third-party cards release (and hope those don't get scalped like the last two generations were). So: do what you want, but I'd suggest getting it now and selling it later if you really want to get a newer GPU that soon.
    It really depends on what your detail and resolution requirements are. If you're still a 1080p/60 gamer, then a 3060 is a good fit, and you can carry it forward into a new build when you decide to pull that trigger (I'd wait till later this year on that part). If you want to do 1440p or better, you're going to need more grunt than a 3060. A 3060 Ti can probably hack 1440p/60 consistently, but you'd be better off with a 3070. If you want 4K... your only options are a 3080 or better (or a 6800 XT or 6900 XT from AMD, or their new 6750 XT/6950 XT refreshes).

    Or you can wait for the next series to launch, but it's looking like the mid-range cards, from NVIDIA at least, won't show up until early 2023, just like with the 3000-series launch. They seem to be doing the same staggered launch they did with the 3000 series - so it'll be the 4090 and 4080 first, probably a few weeks or a month apart, and the 4070 at the end of the year. An actual 4060 won't show up until spring.

    If you're OK waiting that long, then sure, I guess.

    Edit: Driver support is also horrendous at launch and for the first 3-6 months on any platform, from either vendor - especially if you don't play mainstream games.
    lolno.

    There were no driver issues of any kind for NVIDIA's 3000-series cards. They worked just fine day one, out of the box.

    AMD... maybe? They've always had driver issues, new card or not. But it's generally not enough to make the cards unusable.

    And... I'm not going to quote Strawberry...

    But his claim of 4K max settings on a 3060 is utter, 1000% bullshit. I've got a 3080 FE, and 4K/60 is BARELY possible at a mix of high/ultra settings. And in some really demanding AAA games, it won't maintain 60 fps at all times. No way a 3060 is going to do 4K at high settings and not be a slideshow.

  6. #6
    Your CPU is ancient, and that will hold back your system to the point that I don't think your GPU matters much.
    And I wouldn't be shocked if your CPU couldn't handle new games (not remasters) anyway, at which point you might as well save up and get a new system.

    - - - Updated - - -

    Quote Originally Posted by Kagthul View Post
    It really depends on what your detail and resolution requirements are. If you're still a 1080p/60 gamer, then a 3060 is a good fit, and you can carry it forward into a new build when you decide to pull that trigger (I'd wait till later this year on that part). If you want to do 1440p or better, you're going to need more grunt than a 3060. A 3060 Ti can probably hack 1440p/60 consistently, but you'd be better off with a 3070. If you want 4K... your only options are a 3080 or better (or a 6800 XT or 6900 XT from AMD, or their new 6750 XT/6950 XT refreshes).

    Or you can wait for the next series to launch, but it's looking like the mid-range cards, from NVIDIA at least, won't show up until early 2023, just like with the 3000-series launch. They seem to be doing the same staggered launch they did with the 3000 series - so it'll be the 4090 and 4080 first, probably a few weeks or a month apart, and the 4070 at the end of the year. An actual 4060 won't show up until spring.

    If you're OK waiting that long, then sure, I guess.

    lolno.

    There were no driver issues of any kind for NVIDIA's 3000-series cards. They worked just fine day one, out of the box.

    AMD... maybe? They've always had driver issues, new card or not. But it's generally not enough to make the cards unusable.

    And... I'm not going to quote Strawberry...

    But his claim of 4K max settings on a 3060 is utter, 1000% bullshit. I've got a 3080 FE, and 4K/60 is BARELY possible at a mix of high/ultra settings. And in some really demanding AAA games, it won't maintain 60 fps at all times. No way a 3060 is going to do 4K at high settings and not be a slideshow.
    He could get a 3090 and his games still wouldn't be able to run at a stable 60 FPS because of his CPU.

  7. #7
    Quote Originally Posted by ati87 View Post
    He could get a 3090 and his games still wouldn't be able to run at a stable 60 FPS because of his CPU.
    Insert Dr. Cox "wrong wrong wrong" meme here.

    The 2600K is actually still a decent gaming CPU - it's about equivalent to a modern i3. I mean, it's not top end, but it's not going to meaningfully choke a GPU at 1080p, and if he scales up to 1440p or higher, the CPU is almost never an issue.

    Most modern games, outside of secure client-server games like MMOs or battle-royale shooters, are not remotely CPU-bound.

    JayzTwoCents did a "what will it take to bottleneck this 3080" video a while back (around the launch of the 3080), and they had to go down to a dual-core 2nd-gen i3 before seeing any meaningful bottlenecking. Like, even FX-series chips that weren't overclocked were fine.

    Did they perform as well as modern CPUs?

    No, of course not. But does that matter if you're still getting framerates higher than the refresh rate of your monitor?

    Nope.

    Too many people harp on "bottlenecking" even though it almost never happens, or when it does, it limits your maximum potential performance without impacting your gameplay, because it's all still well above the refresh rate of your monitor. Like, yeah, a modern CPU might get 500 fps in a game at 1080p where that 2600K might only get 150.

    .... so what?

    And, as I pointed out, if he's a 1080p gamer, a 3060 is a great choice, and one he can carry forward into a new build, which, as you can see, I recommend doing later this year.
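    If it helps, here's the whole "bottleneck" argument as a toy model - a few lines of Python, with made-up fps numbers for illustration, not benchmarks:

        # Toy model: the frame rate you get is capped by whichever component is
        # slower, and the monitor's refresh rate caps what you actually see.
        def effective_fps(cpu_fps_cap, gpu_fps_cap, monitor_hz):
            delivered = min(cpu_fps_cap, gpu_fps_cap)  # slower component wins
            visible = min(delivered, monitor_hz)       # refresh caps what you perceive
            return delivered, visible

        # Illustrative caps: say a 2600K allows ~150 fps where a modern CPU
        # would allow ~500, and the GPU can push 120 at your settings.
        for cpu_cap in (150, 500):
            delivered, visible = effective_fps(cpu_cap, gpu_fps_cap=120, monitor_hz=60)
            print(f"CPU cap {cpu_cap}: game runs at {delivered} fps, you see {visible} fps")
        # Both CPUs deliver 120 fps and show 60 fps - the "bottleneck"
        # never shows up on screen.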

  8. #8
    Quote Originally Posted by Ghier View Post
    I have an ancient PC (built in 2011 and somehow still kicking). I am considering upgrading the GPU to a 3060 or 3060 Ti before I get a new PC. Here are the specs:

    Motherboard: Gigabyte Z68X-UD7-B3
    CPU: i7-2600K @ 3.4 GHz
    RAM: 32 GB
    GPU: Radeon RX 580 (originally had a GeForce GTX 580, but it died a few years ago)
    PSU: 800W

    If I get it, I figure I'll just put it in the new PC eventually. From what I've seen it looks like it would be compatible, but I'm not sure how much of its power I'll get out of it since the motherboard is so old.
    I just upgraded from a 4770K/RX 580 myself. Getting a 3060 now would help with newer games and higher resolutions, and like others said, you can always put it in a future new rig.

    If you want to pull the trigger on a new PC now, and you can without being financially unsafe, then it's not a bad time to. There will always be new releases right around the corner; wanting to upgrade now is a matter of how urgent you think it is.

  9. #9
    Quote Originally Posted by Kagthul View Post
    Insert Dr. Cox "wrong wrong wrong" meme here.

    The 2600K is actually still a decent gaming CPU - it's about equivalent to a modern i3. I mean, it's not top end, but it's not going to meaningfully choke a GPU at 1080p, and if he scales up to 1440p or higher, the CPU is almost never an issue.

    Most modern games, outside of secure client-server games like MMOs or battle-royale shooters, are not remotely CPU-bound.

    JayzTwoCents did a "what will it take to bottleneck this 3080" video a while back (around the launch of the 3080), and they had to go down to a dual-core 2nd-gen i3 before seeing any meaningful bottlenecking. Like, even FX-series chips that weren't overclocked were fine.

    Did they perform as well as modern CPUs?

    No, of course not. But does that matter if you're still getting framerates higher than the refresh rate of your monitor?

    Nope.

    Too many people harp on "bottlenecking" even though it almost never happens, or when it does, it limits your maximum potential performance without impacting your gameplay, because it's all still well above the refresh rate of your monitor. Like, yeah, a modern CPU might get 500 fps in a game at 1080p where that 2600K might only get 150.

    .... so what?

    And, as I pointed out, if he's a 1080p gamer, a 3060 is a great choice, and one he can carry forward into a new build, which, as you can see, I recommend doing later this year.
    Look up the system requirements of games like Guardians of the Galaxy, RE Village, Halo, God of War, Elden Ring, and AC Valhalla.
    The bottleneck will be his CPU, which is a fact, and some of these games (mostly cross-generation titles) require a 4th-generation CPU; Elden Ring asks for an i5-8400, and RE Village asks for an i5-7500.

  10. #10
    Temp name
    Quote Originally Posted by ati87 View Post
    Look up the system requirements of games like Guardians of the Galaxy, RE Village, Halo, God of War, Elden Ring, and AC Valhalla.
    The bottleneck will be his CPU, which is a fact, and some of these games (mostly cross-generation titles) require a 4th-generation CPU; Elden Ring asks for an i5-8400, and RE Village asks for an i5-7500.
    Just because they're asking for it doesn't mean you actually need it.
    The requirements are purposefully overkill so they can be 100% sure you don't have a bad experience.

  11. #11
    Quote Originally Posted by Temp name View Post
    Just because they're asking for it doesn't mean you actually need it.
    The requirements are purposefully overkill so they can be 100% sure you don't have a bad experience.
    Yerp. Halo Infinite runs fine on an i3-10100 in my HTPC. It's paired with a 1060, so it's not running max settings or anything, but it's not the CPU holding it back, I assure you. RE ran fine on my wife's old i5-6600K before we updated her rig to Ryzen. Paired with a 1080, it ran everything at 1080p at a very stable 60 fps (which IIRC is locked in the engine). Jay did Elden Ring on a 12400 and a 3050, and it was fine at 1080p medium/high (the issues and slowdowns were engine-based, since the game is optimized like garbage; they persisted on overclocked 12900s). It wasn't even using all the cores.

    Requirements and suggestions are always overblown, for the reason Temp pointed out. Hell, Ubisoft used to say Far Cry 5 wouldn't even run on dual-core CPUs… and when it was hacked to remove the CPU check, it ran just fine on lowly dual-core i3s.

    You're also still ignoring that literally no one is suggesting he stick with the 2600K long term.

  12. #12
    Quote Originally Posted by ati87 View Post
    He could get a 3090 and his games still wouldn't be able to run at a stable 60 FPS because of his CPU.
    He will be fine. Should he update his CPU? Of course; he'd get better frame timing and lower latency along with better general performance. But for all intents and purposes, it's just fine for modern gaming.

    Quote Originally Posted by Kagthul View Post
    It really depends on what your detail and resolution requirements are. If you're still a 1080p/60 gamer, then a 3060 is a good fit, and you can carry it forward into a new build when you decide to pull that trigger (I'd wait till later this year on that part). If you want to do 1440p or better, you're going to need more grunt than a 3060. A 3060 Ti can probably hack 1440p/60 consistently, but you'd be better off with a 3070. If you want 4K... your only options are a 3080 or better (or a 6800 XT or 6900 XT from AMD, or their new 6750 XT/6950 XT refreshes).

    Or you can wait for the next series to launch, but it's looking like the mid-range cards, from NVIDIA at least, won't show up until early 2023, just like with the 3000-series launch. They seem to be doing the same staggered launch they did with the 3000 series - so it'll be the 4090 and 4080 first, probably a few weeks or a month apart, and the 4070 at the end of the year. An actual 4060 won't show up until spring.

    If you're OK waiting that long, then sure, I guess.

    lolno.

    There were no driver issues of any kind for NVIDIA's 3000-series cards. They worked just fine day one, out of the box.

    AMD... maybe? They've always had driver issues, new card or not. But it's generally not enough to make the cards unusable.

    And... I'm not going to quote Strawberry...

    But his claim of 4K max settings on a 3060 is utter, 1000% bullshit. I've got a 3080 FE, and 4K/60 is BARELY possible at a mix of high/ultra settings. And in some really demanding AAA games, it won't maintain 60 fps at all times. No way a 3060 is going to do 4K at high settings and not be a slideshow.
    The 3060 (especially the Ti) is capable of 1440p 16:9 gaming at 60 fps or higher if you play anything that isn't super modern. And even in modern games, just push the settings down - with DLSS, FSR, or raw resolution scaling if that's all the game has. Hell, either vendor's built-in scaling options work wonders, too. I used a 5700 XT (with a 2600X CPU) on a 21:9 1440p monitor and got 60+ fps in a lot of games without scaling, and minor scaling helped tremendously (upwards of 90-100 fps in a lot of more recent games). The 5700 XT is similar in power to the 3060, so I see no reason it couldn't do the same. The Ti, obviously, would without any doubt. Even with his CPU. Don't lie to the poor person, lol.
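    (For the unfamiliar: resolution scaling just renders internally at a fraction of the output resolution, and GPU cost tracks pixel count pretty closely. A rough Python sketch - the per-axis scale ratios are approximate, and DLSS/FSR modes vary:)

        # Rough sketch: internal render resolution at a given per-axis scale,
        # and what fraction of the native pixels the GPU actually has to shade.
        def render_resolution(out_w, out_h, scale):
            return int(out_w * scale), int(out_h * scale)

        out_w, out_h = 2560, 1440        # 16:9 1440p output
        for scale in (1.0, 0.67, 0.50):  # native, ~"Quality", ~"Performance"
            w, h = render_resolution(out_w, out_h, scale)
            ratio = (w * h) / (out_w * out_h)
            print(f"scale {scale:.2f}: renders {w}x{h} (~{ratio:.0%} of native pixels)")
        # 0.67 scale -> roughly 45% of the pixels, 0.50 -> 25%: that's where
        # the big fps gains come from.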

    The 3070 can do 4K with scaling, or natively if it's an older game. The 3070 Ti / 3080 and up can do 4K natively more often, or with a much higher render scale. The 3080 Ti and up are best for native 4K. Of course, this is all moot when you talk about older games, as even the 3060 can do 4K on titles from 4-5 years ago. As for the 3060 and 4K - he's not actually wrong. A lot of modern games can run at 4K / 24-30 fps on that kind of card. They said acceptable, not recommended.

    And as for your 3080: that sounds really weird. I run 32:9 1440p with a 3080, and that tends to run at 60 FPS just fine.

  13. #13
    Quote Originally Posted by PenguinChan View Post
    He will be fine. Should he update his CPU? Of course; he'd get better frame timing and lower latency along with better general performance. But for all intents and purposes, it's just fine for modern gaming.

    The 3060 (especially the Ti) is capable of 1440p 16:9 gaming at 60 fps or higher if you play anything that isn't super modern. And even in modern games, just push the settings down - with DLSS, FSR, or raw resolution scaling if that's all the game has. Hell, either vendor's built-in scaling options work wonders, too. I used a 5700 XT (with a 2600X CPU) on a 21:9 1440p monitor and got 60+ fps in a lot of games without scaling, and minor scaling helped tremendously (upwards of 90-100 fps in a lot of more recent games). The 5700 XT is similar in power to the 3060, so I see no reason it couldn't do the same. The Ti, obviously, would without any doubt. Even with his CPU. Don't lie to the poor person, lol.

    The 3070 can do 4K with scaling, or natively if it's an older game. The 3070 Ti / 3080 and up can do 4K natively more often, or with a much higher render scale. The 3080 Ti and up are best for native 4K. Of course, this is all moot when you talk about older games, as even the 3060 can do 4K on titles from 4-5 years ago. As for the 3060 and 4K - he's not actually wrong. A lot of modern games can run at 4K / 24-30 fps on that kind of card. They said acceptable, not recommended.

    And as for your 3080: that sounds really weird. I run 32:9 1440p with a 3080, and that tends to run at 60 FPS just fine.
    32:9 1440p is still about 11% fewer pixels than 4K, FWIW.
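    The quick pixel math, in case anyone wants to check it (plain Python; resolutions assumed to be 5120x1440 and 3840x2160):

        # Pixel counts: 32:9 1440p (5120x1440) vs 4K UHD (3840x2160)
        ultrawide = 5120 * 1440  # 7,372,800 pixels
        uhd_4k = 3840 * 2160     # 8,294,400 pixels
        print(f"32:9 1440p has {1 - ultrawide / uhd_4k:.1%} fewer pixels than 4K")
        # -> about 11.1% fewer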

    And I don't bring up scaling or turning settings down, because assuming you do neither is the only way to have an equal comparison and make sure we're all on the same page.

    Otherwise, I'll say "you won't get over 60 fps 100% of the time at 4K with a 3080" because I'm referring to "I just set the game at max and play," which is the default behaviour of most gamers...

    And you'll say "you can EASILY get over 60 fps with a 3080!" because you're turning on DLSS and turning down settings.

    We're both right, but we're talking about two entirely different things. The general assumption is an apples-to-apples comparison, and the default mode when talking about FPS expectations is "max settings" at any given resolution.

    FWIW, yeah, I can do games at 4K with my 3080 by turning down the settings (and usually see no difference in visual fidelity, because oftentimes the difference between a "high" setting and an "ultra" setting isn't even noticeable) and turning on DLSS. I don't actually game at 4K (I did the testing on my TV), so for me, at 1440p high refresh, it's a great card and does fantastic, and I don't notice turning on DLSS to keep things above 100 fps. But I wouldn't want to say "yeah, you'll have no problems running 4K with a 3080," because if someone goes and buys the card, throws it in, fires up a game, and sets it all to max... he's gonna have serious problems. His 1% lows are not going to stay above 60 fps except in a few well-optimized games (like Doom Eternal).

    I had a 3060 in my wife's rig for about six weeks (bought it for her right after it launched) before we upgraded her to a 3070 (caught a guy selling one at MSRP on FB Marketplace, and had a friend who would buy the 3060 from us for what we paid, which was also an FB Marketplace deal and only $40 over MSRP, which I just chalked up to an "I didn't have to wait in line or pay shipping" fee).

    It did NOT do 1440p/60 fps at all times. The 1% lows would constantly hit the 40s. The 3060 Ti can probably swing it, but I wouldn't want to bet the farm on that. And juddery 1% lows will kill any experience, even if the "average" FPS is over 100 (see the sketch at the end of this post for how 1% lows are computed).

    You might also want to check your assumption about games from "4-5 years ago" running at 4K even on a 3070; remember that we lost two years to COVID, so a lot of games you think of as new are probably actually 3-4 years old.

    But if you're just doing 1080p/60, the 3060 is a dynamite card. It'll even do middling high refresh (80s-90s), and paired with an adaptive-sync monitor it'd be buttery smooth.
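    (Since "1% lows" keep coming up: they're typically computed by taking the slowest 1% of frames in a run and averaging them. A rough Python sketch, with made-up frame times that mirror the judder described above:)

        # Rough sketch of how "1% lows" are usually derived from frame times.
        def one_percent_low(frame_times_ms):
            worst = sorted(frame_times_ms, reverse=True)  # slowest frames first
            n = max(1, len(worst) // 100)                 # worst 1% of samples
            avg_worst_ms = sum(worst[:n]) / n
            return 1000.0 / avg_worst_ms                  # ms per frame -> fps

        # Made-up run: mostly ~10 ms frames (100 fps) with occasional 25 ms stutters.
        frames = [10.0] * 990 + [25.0] * 10
        avg_fps = 1000.0 / (sum(frames) / len(frames))
        print(f"average: {avg_fps:.0f} fps, 1% low: {one_percent_low(frames):.0f} fps")
        # The average stays ~99 fps while the 1% low sits at 40 fps - that's
        # the judder you feel even when the average looks great.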

  14. #14
    Sounds like the 3060 would be a decent upgrade for now. I only play at 1080p, and the most demanding game I play is Fortnite on low settings. My fps sometimes drops to around 40, which is unbearable. I would like to get a consistent 140 fps.

  15. #15
    Quote Originally Posted by Ghier View Post
    Sounds like the 3060 would be a decent upgrade for now. I only play at 1080p, and the most demanding game I play is Fortnite on low settings. My fps sometimes drops to around 40, which is unbearable. I would like to get a consistent 140 fps.
    Your CPU may prevent that, as there are some settings - like draw distance - that depend on the CPU rather than the GPU. But I'd think that even with the 2600K you should be able to keep it averaging over 100 fps, with your 1% lows staying above 60.
