  1. #201
    The Lightbringer Shakadam's Avatar
    10+ Year Old Account
    Join Date
    Oct 2009
    Location
    Finland
    Posts
    3,300
    Quote Originally Posted by Kagthul View Post
    Also... why the fuck is everyone comparing FE prices (which are always 100+$ higher) to MSRP priced parts? Are you mentally challenged?
    Because there aren't any RTX cards available at the lower MSRP, and there won't be in the coming months either. Just like how AMD's Vega was actually decent at its MSRP, but it was impossible to buy at MSRP prices until around now, a year after release. MSRP is just a number; we care about reality. Maybe some models will start hitting MSRP pricing some time from now, but that doesn't mean anything right now.

    Also, Nvidia has made 2 device IDs per card for this generation. One of them is for the FE cards and other factory OC'ed AIB cards; the other doesn't allow factory OC'ed cards and is most likely the one that's going to be used for cards aimed at the lower MSRP. That also obviously means worse-binned chips and cheaper coolers, so those cards won't be as fast as the higher-priced ones.

    Quote Originally Posted by Kagthul View Post
    You can get a card that will outperform the 1080 for the same price a 1080 currently is (2070 MSRP is 500... same as 1080)
    We don't know the real performance of the 2070 yet. The 2070 will be ~600$ at launch, whereas a 1080 is currently ~450$.

    Quote Originally Posted by Kagthul View Post
    you can get a card that will perform the same as a 1080Ti for the same price that a 1080Ti costs (700$ MSRP)
    Again, a 2080 is currently ~800$ and a 1080TI is ~650$


    Quote Originally Posted by Kagthul View Post
    you can get a card that BURIES the product it is replacing (2080Ti vs Titan Xp) for the same price as that product (1000 MSRP).
    And again, a 2080TI is currently ~1200$

    Quote Originally Posted by Kagthul View Post
    Oh, and all the new parts support newer features.
    Which don't actually work in anything yet and we don't know what the performance will be like.



    Look, I get your point, but you're doing yourself a disservice by comparing MSRP to MSRP when neither number reflects reality. Maybe some months from now the situation will change and then we can re-evaluate things, but that's then and now is now.
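To put the MSRP-vs-street gap in numbers, here's a quick sketch using the street prices quoted in this post. These are the poster's estimates, not official figures:

```python
# Street-price premium over MSRP for the launch prices quoted in
# this thread (the poster's estimates, not official figures).
cards = {
    # name: (MSRP in USD, quoted street price in USD)
    "RTX 2070": (500, 600),
    "RTX 2080": (700, 800),
    "RTX 2080 Ti": (1000, 1200),
}

premiums = {
    name: (street - msrp) / msrp * 100
    for name, (msrp, street) in cards.items()
}

for name, premium in premiums.items():
    print(f"{name}: {premium:.0f}% over MSRP")
```

So by these numbers the street premium over MSRP runs from roughly 14% to 20% across the stack.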

  2. #202
    Quote Originally Posted by dadev View Post
    Well, software support is going to be soon, however right now it’s not used in central features. And this also ties in to what I wrote before: this is not a purchase advice or anything of the sort, just a clarification if you think this is ignored by all developers.

    EDIT: If you want to extrapolate when having gpu accelerated ray tracing is going to be crucial, you can take something similar. For instance geforce3 introduced standardized programmable shaders (which was a change of similar magnitude). That was released in 2001, so just look when games started using dx9 (it still had fixed function, so exact measurement is going to be hard). You can also check when games started requiring it, then you'll know how much time it took from introduction until full adoption.
    And guess what? It wasn't until the Geforce 4 that followed that it was worth a damn. How long did it take before we got any decent DX12 engines?

    Quote Originally Posted by Thunderball View Post
    I have no idea why people consider 7nm to be some kind of saviour. Current 7nm implementations are not much different from 14nm when it comes to density, and won't offer much performance uplift. It might offer up to 30% power efficiency, but an architecture redesign is required to translate that to performance, and no one is doing that currently.
    AMD is going to be using TSMC's node next year. Nvidia will have to follow. Intel is a wildcard here.

    With the 1000 series dragging on for longer than normal, the top-end 2000 card being available at launch, developers apparently not getting their hands on it until literally a week or two before Gamescom, and drivers that seem a bit undercooked, the GTX/RTX 2000 series has the feeling of something Nvidia rushed to market to try and recoup some additional costs of its HPC chip from consumers before the rush to the next node.

    Also, with a 30% power reduction they'd be able to throw more die space at getting something better than 1080p60 with the top-end chip. Something that might appear on a chip that normal people could buy.

    Upon further investigation, I'm wondering where you're getting your information. TSMC claims

    When compared to the CLN16FF+ technology (TSMC’s most widely used FinFET process technology) the CLN7FF will enable chip designers to shrink their die sizes by 70% (at the same transistor count), drop power consumption by 60%, or increase frequency by 30% (at the same complexity).
    Last edited by Cows For Life; 2018-09-20 at 05:57 PM.
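Reading TSMC's claim literally, the three numbers are alternatives, not a combined gain: a design can spend the node on a die shrink, *or* lower power, *or* higher clocks. A quick sketch of what "shrink die sizes by 70% at the same transistor count" works out to (the 471 mm² starting die is a hypothetical example, roughly GP102-sized):

```python
# TSMC CLN7FF vs CLN16FF+, as quoted above: ~70% smaller die,
# OR ~60% lower power, OR ~30% higher frequency -- pick one.
die_16ff_mm2 = 471.0   # hypothetical 16FF+ die, roughly GP102-sized
shrink = 0.70          # "shrink die sizes by 70%" at same transistor count

die_7ff_mm2 = die_16ff_mm2 * (1 - shrink)
print(f"{die_16ff_mm2:.0f} mm^2 -> {die_7ff_mm2:.1f} mm^2 on 7nm")

# Equivalently, the same area budget could hold ~3.3x the transistors
# instead of shrinking:
extra_transistors = 1 / (1 - shrink)
print(f"or ~{extra_transistors:.1f}x transistors in the same area")
```

That transistor headroom, rather than the raw 30% frequency option, is where a bigger-than-incremental uplift would have to come from.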

  3. #203

  4. #204
    Quote Originally Posted by Cows For Life View Post
    And guess what? It wasn't until the Geforce 4 that followed that it was worth a damn. How long did it take before we got any decent DX12 engines?
    For end users? Mostly yes. And with that extrapolation game, this is the direction I was pointing to. But still, geforce3 was a major development, and if you were at all involved in making game engines you had to have one, because programmable shaders were all the rage.

    Anyway, as I wrote before, I'm not telling you to buy (because cool tech) or not to buy (because expensive). Do whatever you think is right. What I'm saying is that just like in the case of geforce3, this generation isn't going unnoticed by developers.
    In the long term, just like there are no engines today relying on fixed function, rasterization will ultimately go away in a similar fashion. This is not going to be tomorrow, next year, or probably even in five. But somewhere 10-15 years from now I think ray tracing will begin to overtake rasterization entirely for realtime applications. I know that end users don't care about something so far off, but for developers the road starts now, because it's not that we wake up in 10 years and "woah! everything is ray tracing"; it's going to be a long road with slow progress.

    As for DX12, its adoption is directly tied to Win10 adoption. While in the western part of the world it's pretty high, when you consider the eastern part you realize that dx11 is going to stay for a long time. Targeting both is painful, and performance is going to suffer on one of them. But if you notice, game engines that targeted only dx12 (or vulkan) fared way better. There were only a few, but still.

  5. #205
    Quote Originally Posted by Cows For Life View Post
    Upon further investigation, I'm wondering where you're getting your information. TSMC claims
    The devil is in the details I guess.

    At the same complexity: so they can die-shrink their current cards for a straight-up 30% increase, which is not quite as straightforward as it sounds, or they can increase transistor counts and probably wind up with an incremental increase, as has been the norm for maybe 15 years. I would not expect miracles.
    The whole problem with the world is that fools and fanatics are always so certain of themselves, but wiser people so full of doubts.

  6. #206
    Old God Vash The Stampede's Avatar
    10+ Year Old Account
    Join Date
    Sep 2010
    Location
    Better part of NJ
    Posts
    10,939
    Quote Originally Posted by dadev View Post
    EDIT: If you want to extrapolate when having gpu accelerated ray tracing is going to be crucial, you can take something similar. For instance geforce3 introduced standardized programmable shaders (which was a change of similar magnitude). That was released in 2001, so just look when games started using dx9 (it still had fixed function, so exact measurement is going to be hard). You can also check when games started requiring it, then you'll know how much time it took from introduction until full adoption.
    The Geforce 3 and 4 are kind of a sad story, cause after the Geforce FX cards the 3's and 4's were essentially worthless: Nvidia pushed developers to use DX8.1 cause FX cards couldn't properly do DX9 at full speed. And it took until Half Life 2 properly using DX9 to show the real problem with the FX cards. But you know who could play those DX8.1 games? The slower Radeon 8500, cause it supported DX8.1. To make matters worse, both Nvidia and ATI introduced their own version of DX9: DX9 was now DX9.0a, while ATI's version was DX9.0b and Nvidia had DX9.0c. So the FX cards and all the DX9 ATI cards couldn't do DX9.0c, which became the industry standard from that point forward. Bioshock was the most infamous case, where people tried to patch it to work on Radeon x700 and x800 cards.

    Interestingly enough the Geforce 3 carried with it the T&L core for DX7 while I believe the Radeon 9xxx series had that T&L built into the pixel and vertex cores. Not sure if the Geforce 4 or the FX cards had this merged as well. But it might be something to expect in future RTX cards where Nvidia might do the same with their AI and RT cores.


  7. #207
    Warchief skannerz22's Avatar
    10+ Year Old Account
    Join Date
    Mar 2011
    Location
    Bunnings Warehouse
    Posts
    2,050
    Quote Originally Posted by pansertjald View Post
    Looks like the new GFX card line from Nvidia is gonna be called 20xx and the 2070/2080 will be RTX instead of GTX, because of Raytracing. Only the RTX 2070/2080 will have Raytracing. The 2060/2050 will be called GTX because of no Raytracing.

    Looking at the specs, we're looking at a 50% increase from GTX 1080 to RTX 2080, and a 40% increase from GTX 1070 to RTX 2070

    The prices look to be okay

    nvidia registers turing geforce RTX and quadro RTX trademarks
    what about 1080ti vs 2080

    we need heaven benchmark proof before buying it

    770 gtx 4gig still does better than 1060 6gig in heaven
    -Professional Necromancer-

  8. #208
    Quote Originally Posted by skannerz22 View Post
    what about 1080ti vs 2080

    we need heaven benchmark proof before buying it

    770 gtx 4gig still does better than 1060 6gig in heaven
    It might do better in a benchmark, but if it does worse in games then that's an irrelevant benchmark, isn't it? I swear to christ.
    Last edited by mauserr; 2018-09-23 at 04:10 PM.

  9. #209
    Quote Originally Posted by Cows For Life View Post
    AMD is going to be using TSMC's node next year. Nvidia will have to follow. Intel is a wildcard here.

    With the 1000 series dragging on for longer than normal, the top-end 2000 card being available at launch, developers apparently not getting their hands on it until literally a week or two before Gamescom, and drivers that seem a bit undercooked, the GTX/RTX 2000 series has the feeling of something Nvidia rushed to market to try and recoup some additional costs of its HPC chip from consumers before the rush to the next node.

    Also, with a 30% power reduction they'd be able to throw more die space at getting something better than 1080p60 with the top-end chip. Something that might appear on a chip that normal people could buy.

    Upon further investigation, I'm wondering where you're getting your information. TSMC claims
    Yeah, that's TSMC 7nm, which is pretty close to the same deal as Intel 10nm: chips produced with that tech were first made in 2014, but yields are still very low. AMD banked on GF 7nm, which was shaping up to be nowhere near the same deal as TSMC 7nm, basically just a huge 14nm shrink. Now that GF 7nm is pretty much cancelled (they are likely gonna redesign it from scratch) I don't see 7nm mass production rolling out this year or next. Yeah, TSMC makes the new A12 for Apple, and is gonna make some professional GPUs for AMD most likely this year (or early next year), but it doesn't change the whole picture: we're still pretty far from mainstream-targeted chips being made on that node. Intel produces 10nm-based products right now as well, but we all know that consumer chips are at least 1.5 years away.

    Yeah, AMD can release 7nm products faster, but they weren't targeting TSMC 7nm when the design work was done (that's assuming there was any design work; as far as I'm concerned AMD doesn't have anyone to design GPUs right now), so they have to redesign the chips to take advantage of TSMC's technology.

    Also, I don't get why people think that RTX cards are rushed. Nvidia had zero reason to rush them; they also had zero reason to release better cards, as they are still gonna be ahead with the 1080Ti probably even by the end of next year. RTX cards have an extremely small target audience. I don't get why people even care about them; most people do not have 4K monitors.
    Last edited by Thunderball; 2018-09-21 at 03:42 PM.
    R5 5600X | Thermalright Silver Arrow IB-E Extreme | MSI MAG B550 Tomahawk | 16GB Crucial Ballistix DDR4-3600/CL16 | MSI GTX 1070 Gaming X | Corsair RM650x | Cooler Master HAF X | Logitech G400s | DREVO Excalibur 84 | Kingston HyperX Cloud II | BenQ XL2411T + LG 24MK430H-B

  10. #210
    I'll grab a second 1080 Ti for my main rig once prices fall a little more before buying one of these things. I will get a MUCH bigger performance gain from that than if I replaced my Ti with the 2080 Ti

  11. #211
    Please wait Temp name's Avatar
    10+ Year Old Account
    Join Date
    Mar 2012
    Location
    Under construction
    Posts
    14,631
    Quote Originally Posted by Moozart View Post
    I'll grab a second 1080 Ti for my main rig once prices fall a little more before buying one of these things. I will get a MUCH bigger performance gain from that than if I replaced my Ti with the 2080 Ti
    Honestly, I'm not sure. SLI doesn't work that well, and the 2080ti is a decent step up.

    That said you'll probably get better value from it

  12. #212
    Old God Vash The Stampede's Avatar
    10+ Year Old Account
    Join Date
    Sep 2010
    Location
    Better part of NJ
    Posts
    10,939
    Quote Originally Posted by Temp name View Post
    Honestly I'm not sure.. SLI doesn't work that well, and the 2080ti is a decent step up..

    That said you'll probably get better value from it
    The 2080Ti is an upgrade, but its price is approaching the point where you'd need a loan to buy one: Titan-like prices with the name of a mainstream product. The 1080 Ti to me is still extremely expensive, but you can get one for less than $600; the 2080 Ti is well over $1k. I wouldn't buy two, but it does make more sense than buying one 2080 Ti. Even though SLI is kind of like playing the lottery, you will probably get more out of two 1080 Ti's than one 2080 Ti. The RTX cards are built for ray tracing, which is something you won't see in many games until a mainstream product can do it.

  13. #213
    Herald of the Titans pansertjald's Avatar
    15+ Year Old Account
    Join Date
    Jul 2008
    Location
    Denmark
    Posts
    2,500
    Quote Originally Posted by Kagthul View Post
    ...

    for fucks sake.

    how many times does it have to be re-iterated that the 2080Ti

    IS. NOT. FUCKING. REPLACING. THE. 1080Ti.

    It is replacing the Titan. That is why it is 1000$.

    Its like nVidia even released a press release when Titan V came out that Titan was no longer part of the consumer stack and was now a separate product line.

    They also said (in the spring) that they wanted to separate the halo product (in this case, the 2080Ti) from the penultimate product (in this case, the 2080), so that your extra money actually bought you more performance... because previously the halo product (Titan) was within the margin of error faster than the penultimate product (x80Ti) and cost hundreds more for no particularly good reason.

    So it was inevitable that the 2080 wasn't going to be some kind of miracle.

    What everyone ALSO seems to have gotten fucking amnesia about is that giant jumps (like from 700 > 900, and 900 > 1000) were NEVER the norm. Previously, you were lucky to get a 30-35% performance uplift per generation. You never upgraded every product cycle unless you were an enthusiast with money to burn. Why people expected giant jumps to continue being the norm, or upgrading every product cycle instead of every other or even every third, I have no idea. People deluding themselves, I guess.

    Also... why the fuck is everyone comparing FE prices (which are always 100+$ higher) to MSRP priced parts? Are you mentally challenged?

    End of the day:

    You can get a card that will outperform the 1080 for the same price a 1080 currently is (2070 MSRP is 500... same as 1080)
    you can get a card that will perform the same as a 1080Ti for the same price that a 1080Ti costs (700$ MSRP)
    you can get a card that BURIES the product it is replacing (2080Ti vs Titan Xp) for the same price as that product (1000 MSRP).


    Oh, and all the new parts support newer features.

    You're acting like it's some kind of egregious sin because they didn't come out with giant performance jumps. All they did was do owners of older hardware a favor... you don't have to upgrade this generation if you have a 1080 or 1080Ti (or Titan Xp). Seems fine to me.
    You really need to stop smoking herbs.
    AMD Ryzen 7 7800X3D: Gigabyte X670 Aorus Elite AX: G.Skill Trident Z5 Neo RGB DDR5-6000 C30 : PowerColor Radeon RX 7900 GRE Hellhound OC: CORSAIR HX850i: Samsung 960 EVO 250GB NVMe: fiio e10k: lian-li pc-o11 dynamic XL:

  14. #214
    These prices are the death of PC gaming... you are better off buying a PlayStation 5 for $499 (my guess on the initial price) than these graphics cards.
    Last edited by Amalaric; 2018-10-04 at 10:48 PM.
    "Every country has the government it deserves."
    Joseph de Maistre (1753 – 1821)


  15. #215
    Quote Originally Posted by Amalaric View Post
    These prices are the death of PC gaming...
    This again?
    The whole problem with the world is that fools and fanatics are always so certain of themselves, but wiser people so full of doubts.

  16. #216
    Deleted
    The new cards don't change anything for those with any form of budget constraint. I suppose if anything, if you like what ray tracing will bring in 4 or 5 years, then get a 1440p monitor rather than 4k, as even then nothing will drive 4k RT. DLSS doesn't seem to get you much that selecting a slightly lower resolution doesn't.

    If I were to make a prediction, it's that prices will drop in due course but that multi-GPU setups will become more common again. The lower-level APIs make AFR/SFR with combined memory capacity/bandwidth a thing, and easier to accomplish with good scaling. Nvidia and its driver army may end up doing a fair chunk of the work in the early days, but hey, now it's worth buying two GPUs

  17. #217
    Quote Originally Posted by Amalaric View Post
    These prices are the death of PC gaming... you are better off buying a PlayStation 5 for $499 (my guess on the initial price) than these graphics cards.
    ??? Prices go down. Expensive enthusiast cards were never for the common consumer nor have they "killed" PC gaming.
    The wise wolf who's pride is her wisdom isn't so sharp as drunk.

  18. #218
    Banned Strawberry's Avatar
    15+ Year Old Account
    Join Date
    Jul 2007
    Location
    Sweden/Yugoslavia
    Posts
    3,752
    Well, shit, this will be the first Nvidia high end card I will skip since 2004 or something when I bought 6800 Ultra.
    The problem with 2080Ti is the existence of 1080Ti, which is a 4k/60hz card and 4k/120hz monitors are quite hard to get a hold of (and I don't game on anything less than 40"). There's basically no point in getting 2080Ti if you own a 1080Ti.
    I'd rather choose 4k RTX off than 1080p RTX on.
    The price is not the issue if we're getting top notch, but I feel like 2080Ti is just a renamed Titan but with lackluster performance. There, I said it.

  19. #219
    Please wait Temp name's Avatar
    10+ Year Old Account
    Join Date
    Mar 2012
    Location
    Under construction
    Posts
    14,631
    Quote Originally Posted by Strawberry View Post
    Well, shit, this will be the first Nvidia high end card I will skip since 2004 or something when I bought 6800 Ultra.
    The problem with 2080Ti is the existence of 1080Ti, which is a 4k/60hz card and 4k/120hz monitors are quite hard to get a hold of (and I don't game on anything less than 40"). There's basically no point in getting 2080Ti if you own a 1080Ti.
    I'd rather choose 4k RTX off than 1080p RTX on.
    The price is not the issue if we're getting top notch, but I feel like 2080Ti is just a renamed Titan but with lackluster performance. There, I said it.
    I mean, the 2080ti is still a 20-30% upgrade in normal games over the 1080ti.
    Definitely not worth the 1000USD price tag though

  20. #220
    Quote Originally Posted by Temp name View Post
    I mean, the 2080ti is still a 20-30% upgrade in normal games over the 1080ti.
    Definitely not worth the 1000USD price tag though
    Aye, at best 30% atm, for 70% more money; that's not good performance per dollar, and it would never be worth it for ray tracing alone, since you gimp your fps for a bit of eye candy that rasterization tricks have already faked convincingly for years.
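The value argument in this exchange can be made concrete with a quick sketch. The ~30% uplift and ~70% price premium are the rough figures from the posts above, not measured data:

```python
# Perf-per-dollar of a 2080 Ti vs a 1080 Ti, using the rough
# figures from this exchange: ~30% faster for ~70% more money.
perf_uplift = 1.30   # 2080 Ti performance relative to 1080 Ti
price_ratio = 1.70   # 2080 Ti price relative to 1080 Ti

value_ratio = perf_uplift / price_ratio
print(f"perf per dollar: {value_ratio:.2f}x the 1080 Ti")
print(f"i.e. roughly {(1 - value_ratio) * 100:.0f}% worse value")
```

In other words, by these figures the 2080 Ti delivers about three quarters of the 1080 Ti's performance per dollar.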
