
Originally Posted by Kagthul
This is one of those things where there's a distinction without a difference.
Yes, Intel sockets generally only last two chip generations. AM4 (very technically) lasted four (Ryzen 1000, 2000, 3000, 5000 - there are some APUs that were labelled as 4000-series chips, but those were essentially 3000-series (Zen 2) silicon with GPU cores bolted on)...
But only sorta. A lot of motherboards, particularly on the lower end (the entire original 300-series lineup, plus many low-end to low-mid-range 400-series boards), couldn't be updated to support later chips. So while they used the same socket, they weren't necessarily compatible, and it could be very confusing because some boards from the same manufacturer that were almost identical had different upgrade possibilities.
However, in both cases, the usefulness of in-place upgrades is near zero. 99%+ of users never do an in-socket upgrade, regardless of how long the socket is "viable" for. If you had an 8600K, like I do (released in 2017), you were never going to be dropping in a 9700K or 9900K (the only real in-socket upgrades for that board - a 10600K is LGA1200 and wouldn't even fit). The performance uplift was never going to be worth the $300+. Even with 12th gen available (going on 5 years later), I have no intention of upgrading. The 8600K still does everything I need it to do exceptionally well, with a single mid-life GPU upgrade (started with a 1080 Ti, went to a 3080).
So if I'd been on an AM4 platform... I still wouldn't have upgraded, so the same socket being used for 4-5 years is materially irrelevant. About the only realistic benefit is that if the CPU dies for some reason (I've never had this happen in 500+ deployed machines; it's always been the board that packed it in), you could drop in a new one without having to pay potentially inflated "no longer in production" prices.
But that's so ridiculously niche that it's nearly irrelevant.
So, it's a nice presentation talking point that has literally zero actual application for 99% of users. AM4 lasted 5 whole years! So what? You weren't going to be upgrading your CPU every year anyway, so... how does it even matter? (This may not apply to some uses - prosumer/pro creators definitely can upgrade every year, but generally, if they had to, they could easily absorb the cost of a new MoBo as well, since a 10% increase in performance means 10% more work completed at $100+ an hour, and it pays for itself rapidly - see the quick back-of-envelope math below.)
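A minimal sketch of that payback math, assuming a hypothetical $200 motherboard and 40 billable hours a week (the $100/hr rate and 10% uplift come from the paragraph above; the rest are illustrative numbers, not anything from the thread):

```python
# Back-of-envelope payback for buying a new motherboard alongside a CPU upgrade.
board_cost = 200.0       # USD - assumed motherboard price (hypothetical)
hourly_rate = 100.0      # USD/hr - billing rate from the post
hours_per_week = 40.0    # billable hours/week - assumed (hypothetical)
uplift = 0.10            # 10% more work completed - from the post

# Extra billable output enabled by the faster machine each week.
extra_value_per_week = hourly_rate * hours_per_week * uplift   # = $400/week

# Weeks until the extra output covers the board's cost.
weeks_to_payback = board_cost / extra_value_per_week           # = 0.5 weeks

print(f"Extra output: ${extra_value_per_week:.0f}/week")
print(f"Board pays for itself in {weeks_to_payback:.1f} weeks")
```

On those numbers, the board is paid off in half a working week, which is the "pays for itself rapidly" point.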
It's basically a marketing line that AMD uses to make themselves appear more consumer-friendly, when in reality there's no effective difference, because users are simply not doing drop-in upgrades in high enough numbers to ever be relevant.
If you built an Alder Lake system now, the fact that you wouldn't be able to drop in an upgrade (maybe?) after Raptor Lake means... what? That Alder Lake system is going to last you 4-6 years anyway, no problem. So who cares.
- - - Updated - - -
I'm not 100% on these (and a few of the people who replied, I have blocked), but AFAIK Zen 4 requires DDR5.