Eh... no.
Not at all. A GPU generation is like 2-3 years, and optimistic leaps like from the 900 to 1000 series, or 2000 to 3000, are ~30% uplift. Current GPUs aren't within 30% of running all modern games at 4k120.
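To put that compounding in perspective, here's a quick back-of-envelope sketch. The 45 fps starting point and the locked-120 target are my own illustrative numbers, not from the thread; only the ~30% per-generation figure comes from the comment above:

```python
import math

# Hypothetical numbers for illustration: suppose a GPU manages ~45 fps at 4K
# in a demanding title, and the target is a locked 120 fps.
current_fps = 45
target_fps = 120
uplift_per_gen = 1.30  # ~30% generational gain, per the estimate above

gap = target_fps / current_fps  # ~2.67x more performance needed
gens = math.ceil(math.log(gap) / math.log(uplift_per_gen))
print(gens)  # 4
```

Four generations at 2-3 years each puts that roughly 8-12 years out, even under those optimistic assumptions.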
> Not long at all. We got there with 1080, 1440, etc. We will get there with 4k at some point.

At some point, yeah, but not in 2-3 years. *Maybe* 2-3 generations, but likely more.
> I'd rather just run lower settings or a lower res monitor instead until shit gets there, as opposed to technology that tries to simulate those actual graphic settings by limiting it to "only places it's noticed."

But what you want isn't the norm. Most people want the goodies now.
> *Yet*... it will. I'd rather they just focus on upping their core count and lowering their nm size instead of finding fake ways to hold people over.

They're working on it, but transistor size is not up to Nvidia/AMD; that's up to the fabs. And we're nearing the limits of what silicon can even do: it's unlikely we'll get much below 1nm with current tech due to quantum tunneling.
And "upping core count" doesn't mean much if transistor sizes don't shrink, since a physically bigger die means significant signal travel delays, not to mention heat output and power requirements. Current GPUs are already nearing the practical limit on power draw, as shown by how big their coolers have gotten. The 3090 is a fucking ginormous beast; I wouldn't want a bigger one.
AND even if all of that were in Nvidia's/AMD's wheelhouse... their hardware devs aren't the people working on DLSS/FidelityFX. The two fields are completely separate and have almost zero overlap. Just because they're doing one doesn't mean they're not doing the other.