  1. #21
    Temp name
    Quote Originally Posted by BeepBoo View Post
    "A long time" is more like 2-3 years.
    Eh... no.
    Not at all. A GPU generation is like 2-3 years, and even optimistic leaps like the 900 to 1000 series, or 2000 to 3000, are ~30% uplift. Current GPUs aren't within 30% of running all modern games at 4k120.


    Not long at all. We got there with 1080, 1440, etc. We will get there with 4k at some point.
    At some point, yeah, but not in 2-3 years. *maybe* 2-3 generations, but likely more (rough compounding math in the sketch at the end of this post).

    I'd rather just run lower settings or a lower res monitor instead until shit gets there as opposed to technology that tries to simulate those actual graphic settings by limiting it to "only places it's noticed."
    but what you want isn't the norm. Most people want the goodies now

    *Yet... it will. I'd rather they just focus on upping their core count and lowering their nm size instead of finding fake ways to hold-over.
    They're working on it, but transistor size is not up to Nvidia/AMD, that's up to the fabs. And we're nearing the limits of what silicon can even do, it's unlikely we'll get much below 1nm with current tech due to quantum tunneling.
    And "upping core count" doesn't mean much if transistor sizes don't shrink, since then you'd have to deal with significant delays for travel time, not to mention heat output and power requirements. Current GPUs are nearing how much power they can draw, simply due to how big coolers have gotten on them. The 3090 is a fucking ginormous beast, I wouldn't want a bigger one.

    AND even if all of that was in Nvidia's/AMD's wheelhouse... their hardware devs aren't the guys working on DLSS/FidelityFX. The two fields are completely separate and have almost 0 overlap. Just because they're doing one doesn't mean they're not doing the other.
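    A rough sketch of the compounding math behind that "2-3 generations, but likely more" estimate. This is only a back-of-the-envelope calculation assuming a flat ~30% uplift per generation (the optimistic figure from the post) and the ballpark frame rates mentioned later in the thread; none of these numbers are benchmark data.

    Code:
    # Rough sketch: how many ~30% GPU generations it takes to close a given
    # performance gap. The 30% per-generation uplift and the example frame
    # rates are assumptions taken from the discussion, not benchmarks.

    def generations_needed(current_fps: float, target_fps: float, uplift: float = 0.30) -> int:
        """Count whole generations until current_fps * (1 + uplift)^n >= target_fps."""
        gens = 0
        fps = current_fps
        while fps < target_fps:
            fps *= 1 + uplift
            gens += 1
        return gens

    print(generations_needed(90, 120))  # ~90 fps avg today at 4K -> 2 generations
    print(generations_needed(60, 120))  # ~60 fps 1% lows in heavy titles -> 3 generations

    On those assumptions, roughly two generations cover the average-fps case and three cover the heavier 1% lows, which lines up with the "2-3 generations, but likely more" estimate.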

  2. #22
    Quote Originally Posted by Temp name View Post
    Eh... no.
    Not at all. A GPU generation is like 2-3 years, and even optimistic leaps like the 900 to 1000 series, or 2000 to 3000, are ~30% uplift. Current GPUs aren't within 30% of running all modern games at 4k120.
    They're pretty damn close if you turn off AA (which... the entire point of 4k is to no longer need AA due to pixel density). Most games I've seen Gamers Nexus test have 1% lows of over 60fps and avg fps of like 90-100. My assumption is that, yes, next gen will actually be 4k 120 avg fps capable for all games on ultra (minus AA, which... as I said... isn't needed any more).


    At some point, yeah, but not in 2-3 years. *maybe* 2-3 generations, but likely more

    but what you want isn't the norm. Most people want the goodies now
    fair, but apparently most people are also willing to get scalped, so... get fucked I guess? I'll happily wait.


    I wouldn't want a bigger one.
    We'll probably hit a stage where we either drastically shake up the system (quantum computing) or we go back to the drawing board, all current standards get chucked, and we up the "typical" size and power draw of a desktop. Signal travel times won't be that noticeable, I'd imagine, compared to the increased processing power of the cards.

    AND even if all of that was in Nvidia's/AMD's wheelhouse... their hardware devs aren't the guys working on DLSS/FidelityFX. The two fields are completely separate and have almost 0 overlap. Just because they're doing one doesn't mean they're not doing the other.
    Understandable, but I also feel like they should just perpetually push newer graphical fidelity technologies (like raytracing) as opposed to things designed to optimize those technologies.

    Ray tracing is fantastic. Super sampling is fantastic, high quality shadows were fantastic. I suppose shadows are probably how I envision all new "fancier" graphics fidelity stuff going:

    Basically no one could run high-res dynamic shadows when they first came out. They didn't develop hold-over technologies to fake it. They just said, "tough luck, wait till pleb tier cards get powerful enough to properly utilize the tech"

    I'd rather they just move on to developing whatever the next "big" thing that gets us closer to photorealism is. DLSS isn't a technology like that. DLSS is a way to simulate the actual tech of super sampling. Super sampling and anti-aliasing are on their death bed as it is, thanks to pixel density. I'd rather they give us something newer that gets us closer to photo-realism (like ray-tracing and shadows did).

  3. #23
    Quote Originally Posted by BeepBoo View Post
    I wish they would stop focusing on graphics gimmicks, most of which are designed to improve low-performance GPU FPS, and focus on just giving us better rasterization/$ prices.

    I wouldn't NEED dumb shit like DLSS
    You're incredibly wrong. DLSS is amazing and is basically free graphics/FPS.

    Even on a 1080p monitor you can upscale and then use DLSS.

    Upscaling to 4K and then using DLSS will always look better than your native 1080p (see the pixel-count sketch at the end of this post).

    If they do it properly, expect a surge of RDR2 screenshots as people are able to play on 4k while also achieving decent FPS. I could already pull 35 ish FPS on High 4K with my RTX 2060 Super, with DLSS I expect 50+.
    Last edited by starstationprofm; 2021-06-09 at 05:53 PM.
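    A minimal sketch of the pixel-count argument behind that claim, assuming the commonly cited per-axis render scales for DLSS 2.0 modes (roughly 2/3 for Quality and 1/2 for Performance); treat the exact factors as assumptions rather than vendor specs.

    Code:
    # Pixels actually shaded per frame for a 1080p display under different
    # approaches. The DLSS scale factors are assumptions (commonly cited
    # values for DLSS 2.0 Quality and Performance modes).

    def pixels(width: int, height: int, scale: float = 1.0) -> int:
        return int(width * scale) * int(height * scale)

    native_1080p = pixels(1920, 1080)           # ~2.07 M pixels
    dlss_quality = pixels(3840, 2160, 2 / 3)    # ~3.69 M internal pixels for 4K output
    dlss_perf    = pixels(3840, 2160, 1 / 2)    # ~2.07 M internal pixels for 4K output
    true_4k_ssaa = pixels(3840, 2160)           # ~8.29 M pixels, downscaled to 1080p

    for name, px in [("native 1080p", native_1080p),
                     ("DLSS Quality @ 4K", dlss_quality),
                     ("DLSS Performance @ 4K", dlss_perf),
                     ("true 4K supersampling", true_4k_ssaa)]:
        print(f"{name:22s} {px / 1e6:.2f} M pixels")

    On those numbers, DLSS Quality targeting 4K shades more pixels than native 1080p before the reconstruction pass even runs, which is the basis for the "better than native 1080p" claim, while still costing far less than the 4x pixel load of the true 4K supersampling that comes up a couple of posts below.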

  4. #24
    Quote Originally Posted by starstationprofm View Post
    You're incredibly wrong. DLSS is amazing and is basically free graphics/FPS.
    Even when he is wrong about DLSS, he ain't wrong about AMD and Nvidia seemingly not offering us true budget cards anymore. It's either cheap and underpowered or moderately expensive and capable. There is no real middle ground like the RX570, for example, offered us. Granted, silicon crisis and all, but still no mention of any plans for a sub-$250 or $200 1080p card to get currently, unless you get something used or old stock.

    Maybe the RX570 treated us too well with its amazing price/performance, but we've stagnated in that department since. Granted, we haven't seen a Radeon 6600 yet, so it might go low, but I have my doubts, seeing as Nvidia already priced their 3060 high and a cut-down 6700 (or Navi 23, if that's a thing) will smash it.

  5. #25
    Quote Originally Posted by starstationprofm View Post
    Upscaling to 4K and then using DLSS will always look better than your native 1080p.
    You could just use actual super sampling without DLSS and it would look better. If your machine is powerful enough, FPS will be the same. That's what I mean. I just want cards powerful enough to actually handle the technology behind it.

    If they do it properly, expect a surge of RDR2 screenshots as people are able to play on 4k while also achieving decent FPS. I could already pull 35 ish FPS on High 4K with my RTX 2060 Super, with DLSS I expect 50+.
    And with a 3080 you could get a solid 60fps without DLSS that would look superior.

  6. #26
    Quote Originally Posted by BeepBoo View Post
    You could just use actual super sampling without DLSS and it would look better. If your machine is powerful enough, FPS will be the same. That's what I mean. I just want cards powerful enough to actually handle the technology behind it.



    And with a 3080 you could get a solid 60fps without DLSS that would look superior.
    The GPU will work harder for that.

    And with a 3080 you could get a solid 60fps without DLSS that would look superior.
    Sure, but 3080s are a myth.

    - - - Updated - - -

    Quote Originally Posted by mrgreenthump View Post
    Even when he is wrong about DLSS, he ain't wrong about AMD and Nvidia seemingly not offering us true budget cards anymore. It's either cheap and underpowered or moderately expensive and capable. There is no real middle ground like the RX570, for example, offered us. Granted, silicon crisis and all, but still no mention of any plans for a sub-$250 or $200 1080p card to get currently, unless you get something used or old stock.

    Maybe the RX570 treated us too well with its amazing price/performance, but we've stagnated in that department since. Granted, we haven't seen a Radeon 6600 yet, so it might go low, but I have my doubts, seeing as Nvidia already priced their 3060 high and a cut-down 6700 (or Navi 23, if that's a thing) will smash it.
    The MSRP prices for the 3000 generation are more than fair. COVID caused a shortage of them, but that's not Nvidia's fault.

  7. #27
    Temp name
    Quote Originally Posted by BeepBoo View Post
    They're pretty damn close if you turn off AA (which... the entire point of 4k is to no longer need AA due to pixel density). Most games I've seen Gamers Nexus test have 1% lows of over 60fps and avg fps of like 90-100. My assumption is that, yes, next gen will actually be 4k 120 avg fps capable for all games on ultra (minus AA, which... as I said... isn't needed any more).
    Except GN don't really run that many modern games, and especially don't run the graphically intensive ones in their tests. That much should be obvious from some of the games they test running in the 300-400fps range.
    fair, but apparently most people are also willing to get scalped
    In what world do people want to get scalped? The choice is either to pay scalper prices, or don't get a GPU. If you need one, you're going to pay scalper prices.
    Understandable, but I also feel like they should just perpetually push newer graphical fidelity technologies (like raytracing) as opposed to things designed to optimize those technologies.
    But ray tracing isn't new. It's like, really far from new. You know Toy Story? That was rendered with ray tracing. You know, back in 96.

    What Nvidia have done is make GPU cores that are specifically designed to be able to do the calculations for ray tracing quickly, and bring it up to (almost) playable framerates.
    And again, the work required to make RTX work isn't the same as what you need to make DLSS work. They're both software, but they're different branches, with one requiring a lot of knowledge in AI

    - - - Updated - - -

    Quote Originally Posted by starstationprofm View Post
    The MSRP prices for the 3000 generation are more than fair.
    Except the 3080ti and 3090. Those are priced way too high for what they offer. Knock 3-400 dollars off of the MSRP and they'd be fine.
    Then again I don't know how expensive GDDR6X is, so that might be driving up cost a lot more than I think

  8. #28
    Quote Originally Posted by Temp name View Post
    But ray tracing isn't new. It's like, really far from new. You know Toy Story? That was rendered with ray tracing. You know, back in 96.
    This is something of a technically correct, finicky answer. Toy Story was very much not rendered in real time. Real-time ray tracing is what people are referring to.

    Quote Originally Posted by Wikipedia
    Depending on its complexity, each frame took from 45 minutes up to 30 hours to render.
    I dunno about you, but I'd like more than a frame a day in complex scenes in my games (quick arithmetic in the sketch at the end of this post).

    This is also not just a case of throwing more power at it, the technology is very different to the ray tracing done in animated films.
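    A quick worked comparison of those offline render times against a real-time budget, using only the 45-minute and 30-hour per-frame figures quoted above.

    Code:
    # Offline film rendering vs. a 60 fps real-time budget, using the
    # per-frame times quoted from Wikipedia above.

    realtime_frame_s = 1 / 60          # ~16.7 ms per frame at 60 fps
    best_case_s = 45 * 60              # 45 minutes per frame
    worst_case_s = 30 * 60 * 60        # 30 hours per frame

    print(f"best case : {best_case_s / realtime_frame_s:,.0f}x over budget")   # ~162,000x
    print(f"worst case: {worst_case_s / realtime_frame_s:,.0f}x over budget")  # ~6,480,000x

    Even the best case is about five orders of magnitude away from a 60 fps frame budget, which is why real-time ray tracing relies on dedicated hardware, far fewer rays per pixel, and heavy denoising rather than just faster versions of the film renderers.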

  9. #29
    Quote Originally Posted by mrgreenthump View Post
    Even when he is wrong about DLSS, he ain't wrong about AMD and Nvidia seemingly not offering us true budget cards anymore. It's either cheap and underpowered or moderately expensive and capable. There is no real middle ground like the RX570, for example, offered us. Granted, silicon crisis and all, but still no mention of any plans for a sub-$250 or $200 1080p card to get currently, unless you get something used or old stock.

    Maybe the RX570 treated us too well with its amazing price/performance, but we've stagnated in that department since. Granted, we haven't seen a Radeon 6600 yet, so it might go low, but I have my doubts, seeing as Nvidia already priced their 3060 high and a cut-down 6700 (or Navi 23, if that's a thing) will smash it.
    TBH the MSRPs of the 3060 and 3070 aren't so bad. Not the killer price/performance of the RX570, of course, but a 3060 is pretty decent bang for your buck. Of course, finding one at MSRP is a Herculean task and will be for a couple of years, but that's not NVIDIA's fault.

    The MSRPs for the 3080ti and 3090 are clearly bonkers insane, but NVIDIA knows that enthusiasts will pay damn near any price to get the best stuff anyway, even before scalping was a thing, what with the 3080ti having the same MSRP as the 2080ti.
    It is all that is left unsaid upon which tragedies are built -Kreia

    The internet: where to every action is opposed an unequal overreaction.

  10. #30
    Temp name
    Quote Originally Posted by klogaroth View Post
    This is something of a technically correct, finicky answer. Toy Story was very much not rendered in real time. Real-time ray tracing is what people are referring to.



    I dunno about you, but I'd like more than a frame a day in complex scenes in my games.

    This is also not just a case of throwing more power at it, the technology is very different to the ray tracing done in animated films.
    Did... Did you read literally the next sentence?

  11. #31
    Quote Originally Posted by Temp name View Post
    Did... Did you read literally the next sentence?
    Your next sentence still doesn't make bringing up the offline, animated-film version of ray tracing relevant. It's still fundamentally different enough that calling real-time ray tracing "not a new technology" doesn't really apply.

  12. #32
    Quote Originally Posted by Temp name View Post
    Then again I don't know how expensive GDDR6X is, so that might be driving up cost a lot more than I think
    If the memory were so expensive, it wouldn't have made it into the midrange (3070-Ti), would it?!

    But it is rumored that the 3090 will get discontinued.
    Reject 3090 dies go into the 3080-Ti, so it might also be possible that NVIDIA is sitting on huge amounts of GDDR6X that didn't make the higher frequency needed for the 3090 and is just using it for the 3080/3080-Ti, and now the 3070-Ti, to get rid of it. We're 6+ months in, and that's usually the end of high-end GPU demand, so it would make sense to get rid of unused high-end parts around this time.

  13. #33
    Quote Originally Posted by Temp name View Post
    Except GN don't really run that many modern games, and especially don't run the graphically intensive ones in their tests. That much should be obvious from some of the games they test running in the 300-400fps range.
    Just to clarify: GN uses RDR2, Total War: Three Kingdoms, Cyberpunk 2077, Tomb Raider, etc. Not sure what other games exist out there in your world, but they definitely use a wide range of titles, from normal to super intensive as far as gfx are concerned.

  14. #34
    Temp name
    Quote Originally Posted by BeepBoo View Post
    Just to clarify: GN uses RDR2, Total War: Three Kingdoms, Cyberpunk 2077, Tomb Raider, etc. Not sure what other games exist out there in your world, but they definitely use a wide range of titles, from normal to super intensive as far as gfx are concerned.
    They don't use Cyberpunk reliably, and out of the rest of them, only RDR2 is really that graphically demanding.

    And even then, in their tests they don't reach anywhere near 4k 120fps. A 3080ti is listed as 94fps at 4k in RDR2. To reach 120 you'd need a ~28% improvement in the next generation, which seems unlikely to me.
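    The required uplift follows directly from the figures quoted in this post (94 fps average for a 3080ti at 4K in RDR2, per the GN number cited above); a quick check:

    Code:
    # Single-generation uplift needed to go from the quoted 94 fps average
    # (3080 Ti at 4K in RDR2, per the GN figure above) to a 120 fps average.

    current_fps = 94
    target_fps = 120
    uplift = target_fps / current_fps - 1
    print(f"required uplift: {uplift:.0%}")   # -> 28%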

  15. #35
    Sut
    It's nice that FSR will support older GPUs, but we really have no idea how good it will actually look. DLSS 2.0, on the other hand, has been out in the wild for some time and it looks great.

  16. #36
    Quote Originally Posted by Temp name View Post
    a ~28% improvement in the next generation, which seems unlikely to me
    It isn't though. It's only the past 2 generations that Nvidia has gotten away with minimal upgrades. And we should be starting to see the transition to MCMs next gen; the highest-end SKUs might just be two or four smaller chips, which lets you get an immense boost, as we're reaching the end of what a monolithic die can do and how big it can be. Hell, add 3D stacking into that as well, as it seems to be working really well.

    And you might say all that sounds expensive. Yes, it does, and it probably is, but x80ti prices have been really high and they still sell; hell, people bought the 3090 for gaming as well.

    Anyways, this thread is starting to go too far off topic now.

  17. #37
    Gaidax
    Quote Originally Posted by mrgreenthump View Post
    And you might say all that sounds expensive. Yes, it does, and it probably is, but x80ti prices have been really high and they still sell; hell, people bought the 3090 for gaming as well.
    I've been religiously buying x80Ti-level GPUs every 3-4 years; they were always pricey, especially here where we pay a good 50% markup as is. But what's happening now is completely out of this world.

    The usual for me here was paying around $1500 USD for 80Ti GPUs, but now they cost more than $3k here. A 3070Ti costs $1500 here now. I'm not buying a 3070Ti for 1500 bucks, and I'm not buying a 3080Ti for $3k either.

    So I guess I'll stick with the 1080Ti I have for another 2 years or so. I don't mind; the old horse still has some juice left, and I'll live without RTX/DLSS.

  18. #38
    Been noticing that many retailers in the UK have had the 6700XT in decent supply (albeit still at a price hike), and the markup is coming down bit by bit.

    I think on AMD's side, at least, they seem to be getting stock on the shelves. Demand is really high for all GPUs, but it seems like Samsung can't produce enough of Nvidia's GPUs, while TSMC seems to be on track and still shifting stuff out.

  19. #39
    Quote Originally Posted by Gaidax View Post
    I've been religiously buying x80Ti-level GPUs every 3-4 years; they were always pricey, especially here where we pay a good 50% markup as is. But what's happening now is completely out of this world.

    The usual for me here was paying around $1500 USD for 80Ti GPUs, but now they cost more than $3k here. A 3070Ti costs $1500 here now. I'm not buying a 3070Ti for 1500 bucks, and I'm not buying a 3080Ti for $3k either.

    So I guess I'll stick with the 1080Ti I have for another 2 years or so. I don't mind; the old horse still has some juice left, and I'll live without RTX/DLSS.
    It would help if people remembered that names are irrelevant and it is the place in the product stack that matters. The 3080Ti’s place in the stack is the Halo product (as the 2080Ti was last gen). The 1080Ti was NOT the Halo product - the Titan Xp was. The 1080Ti was the penultimate/enthusiast card, which is the spot filled by the 2080 and 3080 in those stacks. If you look at the pricing (MSRP, ofc, current pricing is just chaos), the 1080Ti, 2080, and 3080 are all about the same price ($699 for all three).

    So it's really more about the place in the stack, not the name. The names change, especially if we're talking about AMD. Stick to like-for-like comparisons within the product stack.

  20. #40
    Shakadam
    Quote Originally Posted by Kagthul View Post
    It would help if people remembered that names are irrelevant and it is the place in the product stack that matters. The 3080Ti’s place in the stack is the Halo product (as the 2080Ti was last gen). The 1080Ti was NOT the Halo product - the Titan Xp was. The 1080Ti was the penultimate/enthusiast card, which is the spot filled by the 2080 and 3080 in those stacks. If you look at the pricing (MSRP, ofc, current pricing is just chaos), the 1080Ti, 2080, and 3080 are all about the same price ($699 for all three).

    So it's really more about the place in the stack, not the name. The names change, especially if we're talking about AMD. Stick to like-for-like comparisons within the product stack.

    Well sort of yes and sort of no.

    RTX 3000 series:
    - Top chip is the GA102, used in 3090, 3080TI, 3080

    RTX 2000 series:
    - Top chip is the TU102, used in Titan RTX, 2080TI
    - 2080 uses smaller TU104 chip.

    GTX 1000 series:
    - Top chip is the GP102, used in Titan Xp, 1080TI
    - 1080 uses smaller GP104 chip.


    The 3080 is a bit of an anomaly in that it uses the top die whereas the previous 2 generations used a smaller die for the x80 card. As a result, the 3080 is also much closer in specs and performance to the 3080TI/3090 than either the 1080 was to the 1080TI or the 2080 was to the 2080TI.

    In terms of place in the stack it's basically:

    Halo products: 3090/3080TI (very little difference), Titan RTX, Titan Xp
    Enthusiast products: 3080, 2080TI, 1080TI.

    It's really the RTX 2000 series which was absolutely awful in value, because Nvidia tried to capitalize on hyping up ray tracing and being "first" to market a product that could somewhat do it in real time.

    EDIT: Interestingly that makes the RTX 3070 (which uses the smaller GA104 chip) the replacement for the RTX 2080, which was the replacement for the GTX 1080.
    It does fit, though, as the performance increase from 1080 -> 2080 is ~35% and from 2080 -> 3070 it's ~30% (quick compounding check in the sketch at the end of this post).
    Last edited by Shakadam; 2021-06-14 at 12:24 AM.
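    A quick check of how those two generational jumps compound, taking the ~35% and ~30% figures from the post as given:

    Code:
    # Compound the two quoted generational uplifts (GTX 1080 -> RTX 2080 -> RTX 3070).
    # The ~35% and ~30% figures are taken from the post above, not measured here.

    uplift_1080_to_2080 = 0.35
    uplift_2080_to_3070 = 0.30
    total = (1 + uplift_1080_to_2080) * (1 + uplift_2080_to_3070) - 1
    print(f"GTX 1080 -> RTX 3070: ~{total:.0%} faster overall")   # -> ~76%

    So on the post's own numbers, a 3070 ends up roughly 76% faster than the GTX 1080 it indirectly replaces in the stack.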
