  1. #321
    Quote Originally Posted by CostinR View Post
    Both RTX and DLSS are pretty important, not for right now but give it a few years and they'll vastly improve in terms of what they offer.

    Ray tracing is going to be the go-to rendering method in a decade or so and it will save game developers a lot of time.
    Then maybe we'll worry about it in a decade. GPUs are already ludicrously expensive, and I don't see that changing. Ever.

  2. #322
    Quote Originally Posted by Hoofey View Post

    Can I ask why you want more VRAM?
    Photo/video editing. Maybe ML. Why would anyone want more VRAM otherwise? :P
    My nickname is "LDEV", not "idev". (both font clarification and ez bait)

    yall im smh @ ur simplified english
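    To put rough numbers on the photo/video/ML point above, here's a quick sketch (assuming fp32 training with Adam-style optimizer states; the function and figures are illustrative estimates, not measurements):

    ```python
    def training_vram_gb(params_millions, bytes_per_param=4, optimizer_states=2):
        """Rough VRAM estimate for training: weights + gradients + optimizer states.

        Ignores activation memory, which often dominates in practice;
        purely a back-of-the-envelope illustration.
        """
        copies = 1 + 1 + optimizer_states  # weights, gradients, Adam's m and v
        return params_millions * 1e6 * bytes_per_param * copies / 1024**3

    # A hypothetical 500M-parameter model in fp32 with Adam needs roughly:
    print(f"{training_vram_gb(500):.1f} GB")  # ~7.5 GB before activations
    ```

    That's before activations and framework overhead, which is why ML workloads fill 8GB cards fast.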

  3. #323
    The geek in me wants new hardware, and working through lockdown and doing lots of overtime means I can afford an RTX 3090... but I can't justify upgrading from my current 1070 considering the games I play... but shiny things... must resist
    Quote Originally Posted by Venant View Post
    I think many people will agree that genocide can be justified.

  4. #324
    Quote Originally Posted by kaelleria View Post
    So... AMD has completely fired the marketing team that pumped up Vega. The 5700 xt is a great value card.

    Intel hired that marketing team and we've seen what's happened. Don't believe anything intel is saying.

    All we have to do is wait a month or so to see if my information is accurate. I'm getting it from the same source that had literally everything correct about the Nvidia event. You too can do some research and see where I'm getting it.

    The fact of the matter is all this will do is make the GPU market more competitive. AMD still needs to improve their software offerings for them to be worth it for me... I'm just excited the market is going competitive for the next few years.

    Oh and we'll be getting a 4000 series from Nvidia and a 7000 series from AMD within the next year and a half or two.
    AMD fanboy detected.

    - - - Updated - - -

    Quote Originally Posted by Thoriangun View Post
    This happens all of the damn time in motorsports, take a look at F1 recently.
    So you're saying that Lisa Su came to Jensen and sold him that info?

    - - - Updated - - -

    Quote Originally Posted by msdos View Post
    The RTX and DLSS gains are meaningless to me and DLSS looks like poop, so much noise and artifacting.
    Couldn't be more wrong.
    i7-6700K @ 4.6GHz cooled by Thermalright Silver Arrow IB-E Extreme | ASRock Fatal1ty Z170 Gaming K6+ | 16GB Corsair Vengeance LPX DDR4-3000/CL15 @ 3200/CL14 | MSI GTX 1070 Gaming X | Corsair RM650x | Cooler Master HAF X | Logitech G400s | DREVO Excalibur 84 | Kingston HyperX Cloud II / Samson SR850 | BenQ XL2411T + LG 24MK430H-B

  5. #325
    Quote Originally Posted by Vegas82 View Post
    Yeah, at $700 I’m honestly thinking about getting one with a third party cooler asap.
    Is fitting a third-party cooler on a GPU easy?

  6. #326
    Quote Originally Posted by Manu9 View Post
    Is fitting a third-party cooler on a GPU easy?
    The GPU dies are much less protected; they need much more care with thermal paste, and heatsink pressure is even more important.

    I highly doubt that replacing the FE cooler, or current FE cooling replacements, will be easy or even possible without damaging the GPU.

    AIB GPUs that are designed for easier replacement are another thing, but with the late start of GPU production (coolers have only been manufactured since August), I doubt there are enough high-binned chips for AIBs, so you either get an FE with fixed cooling or a lower-graded bin with an AIB card.

    Third-party coolers might just offset the bin quality this early in production, because NVIDIA will keep the highest-grade bins for FE cards, just as in the last two generations.

    The bad cooling design (gamer style) was the FE cards' biggest issue: you got good bins, but the cooling was the worst you could get. This might change with the 3000 series, and AIB cards will have a much harder time justifying their higher prices.
    Last edited by Ange; 2020-09-07 at 12:03 PM.

  7. #327
    Quote Originally Posted by Coldkil View Post
    Ok, a bit of speculation/reasoning here that I derived from the leaks and confirmations.

    Imho the 3090 is the Titan. Like, a straight-up name replacement for a product that was "too much" for gaming and not really aimed at workstations. With the name change, it gets somewhat more "identity" as the superduper powerful "want all, have all" gaming GPU. Like the Titan, it's something very few people will actually get, and we all will drool at the incredible performance. Just a marketing move.

    As for the memory: the standard PCB layout has the GPU in the middle and 4 memory modules on each of three sides. Lower-end cards have empty slots around the GPU, and even cards like the 1080 Ti with 11GB have an "empty spot" for an eventual additional memory chip.

    So, the 3090 has 24GB because it uses "double-side" memory. It also makes sense with how the cooler is designed: the back fan is attached to a second vapor chamber/heatpipe assembly so it can cool the memory on the back of the PCB, and iirc it's been shown in some leaked images. 24GB is simply 12GB x 2 (all slots filled by 1GB modules on each side). You can see in this leaked image (so all precautions taken) what the PCB looks like. EDIT: if the image is real, it's 100% not an FE card. It still has the 3x8-pin configuration, and you can see the cooler in the upper left is a standard one.
    The big price jump from the 3080 to the 3090 is "justified" since the cooler is way more costly to make if it needs to be done this way.

    As for the 3080 having only 10GB, again I think it's a marketing move to leave room for a Ti/Super model, so they can easily add 2GB more and make it more powerful. I assume they're going to "move down the ladder one step" like with the 2000 series, as the 2070 Super actually has the GPU of a 2080; so, to make a 3080 Super different enough, they're gonna add more memory.

    At least this makes sense in my head
    I am ready. I'll be moving my 1080 Ti to my backup system and getting this 3080. I can't wait.
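    The memory math in the quoted speculation is easy to check. A quick sketch (the 1GB-module and 12-slot figures are the post's own assumptions from leaked PCB shots, not confirmed specs):

    ```python
    MODULE_GB = 1       # per-module capacity assumed in the post
    SLOTS_PER_SIDE = 12 # memory pads around the GPU on the leaked PCB
    SIDES = 2           # "double-side" memory: modules on front and back

    total_gb = MODULE_GB * SLOTS_PER_SIDE * SIDES
    print(total_gb)  # 24 -> matches the 3090's 24GB
    ```

    The same arithmetic with one side populated gives the 12GB that a single-sided layout would max out at, which is why the double-sided reading is plausible.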

  8. #328
    Quote Originally Posted by Moozart View Post
    I am ready. I'll be moving my 1080 Ti to my backup system and getting this 3080. I can't wait.
    Also worth noting, the latest leaks showed a 16GB "3070 Ti" and a 20GB "3080 Ti", which again makes sense both to cover the middle price ranges (especially between the 3080 and 3090) and corroborates the "dual-sided memory chips" the 3090 is using.

    I don't think 20GB will be necessary for gaming, especially since I'm also on a 1080 Ti right now and I'll stay on 1440p with no interest in 4K. The 3080 will be more than fine and a good buy for my system. I'll wait to see how stocks are and which card actually performs better. I like the Strix design.
    No one wants to choose. Everyone wants everything.

  9. #329
    My system is https://www.asus.com/us/Motherboards/Z170-A/ and https://ark.intel.com/content/www/us...-4-20-ghz.html (no OC), and I just hope I don't end up bottlenecking the new GPUs; they scare me (right now I have https://www.asus.com/Graphics-Cards/TURBO-GTX1060-6G/ and I wouldn't mind a nice upgrade)

  10. #330
    No?
    I'm nowhere near needing 4k or 8k.

  11. #331
    Quote Originally Posted by Zigrifid View Post
    My system is https://www.asus.com/us/Motherboards/Z170-A/ and https://ark.intel.com/content/www/us...-4-20-ghz.html (no OC), and I just hope I don't end up bottlenecking the new GPUs; they scare me (right now I have https://www.asus.com/Graphics-Cards/TURBO-GTX1060-6G/ and I wouldn't mind a nice upgrade)
    You will bottleneck, 3070, 3080 or 3090 regardless. I just don't get the logic of upgrading to something that's far faster than a 2080 Ti if you have a 1060.

  12. #332
    Quote Originally Posted by Thunderball View Post
    You will bottleneck, 3070, 3080 or 3090 regardless. I just don't get the logic of upgrading to something that's far faster than a 2080 Ti if you have a 1060.
    Upgrading makes sense, though as you said, it's simply not worth going for even a 3070; maybe a 3060 is the best bet if you want more performance, and I doubt that will bottleneck that system. It's always the same reasoning: do not mismatch components. It's not that they won't work, but it's not efficient/smart to couple parts from different tiers.

    This time, if benchmarks show what Nvidia claimed, it'll be a "generational" change. That happens from time to time; sometimes you build a machine that lasts 8+ years with small upgrades because tech doesn't jump that far, then suddenly something changes the standards.

  13. #333
    Quote Originally Posted by Thunderball View Post
    AMD fanboy detected.

    - - - Updated - - -



    So you're saying that Lisa Su came to Jensen and sold him that info?

    - - - Updated - - -



    Couldn't be more wrong.
    If you can answer me this: why has your very specific scenario gone right to the CEOs of these companies?

    Why not to someone within their companies, among their workforce? Why does your statement use a silly scenario to dismiss something that actually happens in the industry, in both sectors?

    Are you actually convinced information is not leaked between companies? Are you actually convinced of this?

  14. #334
    Quote Originally Posted by Ange View Post
    The bad cooling design (gamer style) was the FE cards' biggest issue: you got good bins, but the cooling was the worst you could get. This might change with the 3000 series, and AIB cards will have a much harder time justifying their higher prices.
    This is one of the things I'm most interested in when it comes to the reviews. There's a lot of metal on those FE coolers, so it will be interesting to see how they hold up against the AIB models.

  15. #335
    The last Radeon GPU I bought was 16 years ago (before AMD owned Radeon)... I guess that means I'm an AMD fangirl.

  16. #336
    Quote Originally Posted by Thunderball View Post
    AMD fanboy detected.
    Is that your only takeaway from that post?

    Quote Originally Posted by Thunderball View Post
    So you're saying that Lisa Su came to Jensen and sold him that info?
    Starting to get a bit edgy in here.

    Pretty sure the minds at Nvidia are able to extrapolate from the info we have from both the consoles and RDNA 2, mix in a few sources and rumors, and draw up some rough estimates of where AMD will land in terms of performance on their cards. Stuff like this happens all over the place, especially in a competitive setting.

    Nvidia has both Sony and Microsoft as potential sources as well, so it shouldn't be too hard for them to get some numbers.

  17. #337
    Quote Originally Posted by Thoriangun View Post
    Are you actually convinced information is not leaked between companies? are you actually convinced by this?
    I'm 100% convinced that it's happening, but it's not worth it for a company with a huge lead in the space to pay someone to leak information about the competition just to set launch prices before the competition has even launched and had an opportunity to adjust their own pricing. This has nothing in common with F1, as well. This would be similar to Mercedes paying Williams to reveal their performance figures.

    - - - Updated - - -

    Quote Originally Posted by Hoofey View Post
    Starting to get a bit edgy in here.

    Pretty sure the minds at Nvidia are able to extrapolate from the info we have from both the consoles and RDNA 2, mix in a few sources and rumors, and draw up some rough estimates of where AMD will land in terms of performance on their cards. Stuff like this happens all over the place, especially in a competitive setting.

    Nvidia has both Sony and Microsoft as potential sources as well, so it shouldn't be too hard for them to get some numbers.
    That's based on the example the guy brought up: F1, where one team recently bought the design of their competitor's previous-year car. I'm also pretty sure that Nvidia has no reason to pay someone to get information like this, at least not someone at AMD.

  18. #338
    Quote Originally Posted by Thunderball View Post
    I'm 100% convinced that it's happening, but it's not worth it for a company with a huge lead in the space to pay someone to leak information about the competition just to set launch prices before the competition has even launched and had an opportunity to adjust their own pricing. This has nothing in common with F1, as well. This would be similar to Mercedes paying Williams to reveal their performance figures.

    - - - Updated - - -



    That's based on the example the guy brought up: F1, where one team recently bought the design of their competitor's previous-year car. I'm also pretty sure that Nvidia has no reason to pay someone to get information like this, at least not someone at AMD.
    Yeah, exactly. Nvidia has no reason to pay people at AMD for information, because the information is already out there in the form of leaks. We've had Big Navi rumors since before the console info hit, so there's no need for Nvidia to use some shady tactic to get the info they need.

    They probably had a good idea of where RDNA 2 would land performance-wise, so they went aggressive on pricing to beat AMD to the punch and steal the spotlight from the consoles at the same time.
    And based on Nvidia's price/performance numbers and die selections for the cards, they seem to expect competition in the 3070-3080 market, but not for the 3090.

    Latest RDNA 2 rumors have a $600 card placed somewhere between the 3070 and 3080, but nothing for the 3090. There's also some talk about AMD having the capability to match a 3090, but we might not see something like that just yet. It will be interesting to see if all these rumors have some truth to them.

  19. #339
    Quote Originally Posted by Hoofey View Post
    Latest RDNA 2 rumors have a $600 card placed somewhere between the 3070 and 3080, but nothing for the 3090. There's also some talk about AMD having the capability to match a 3090, but we might not see something like that just yet. It will be interesting to see if all these rumors have some truth to them.
    When was the last time AMD had something to compete with Nvidia at the top end? The 290X?

    Halo products are good for grabbing eyeballs, but awful for making money.

  20. #340
    I am going to wait until AMD unveils their next-gen video cards. I'm one of those people who wants the best bang for their buck, and right now Nvidia looks pretty good, with that RTX 3070 being a huge upgrade over my current ROG GTX 1070. But even as is, the GTX 1070 performs really well in newer games, and I'll be interested to see what games like Cyberpunk 2077 do to my performance.
