  1. #761
    Quote Originally Posted by mrgreenthump View Post
    Yeah, no it isn't. It uses GeForce drivers and is a GeForce card. Titans use professional drivers specific to them.
    No, they don't. They use the same kind of driver every other GeForce card uses.

    Quote Originally Posted by mrgreenthump View Post
    Well yeah, that's the rumor: that they're preparing to release a 9984 CUDA core GA102 with 12 GB of memory, which is a barely cut-down 3090, and really they should release it. The 6900 XT is so close and is $500 cheaper, so matching AMD at a slightly lower price would kill the 6900 XT, unless AIBs find a way to clock the 6900 XT higher with a lot more power. After all, there's a rumor of an Asus 6800 XT hitting above 2.5 GHz boost clocks.

    The 3090 will still remain for those who truly need the extra VRAM.
    Yeah, rumors are stupid sometimes.

    - - - Updated - - -

    Quote Originally Posted by Chult View Post
    AMD looks to have a strong lineup this time, which is awesome. Competition is great, and the end result is better options for everyone and often more reasonable prices. I'm reserving judgement until benchmarks though, as just because something looks great on paper doesn't mean it necessarily translates directly to real world results.
    I don't think prices are that unreasonable this time, bar the 3090, which is supposed to be prohibitively expensive. AMD is not an option for me and a lot of people because of drivers and the lack of CUDA (and of any real alternative to it).
    i7-6700K @ 4.6GHz cooled by Thermalright Silver Arrow IB-E Extreme | ASRock Fatal1ty Z170 Gaming K6+ | 16GB Corsair Vengeance LPX DDR4-3000/CL15 @ 3200/CL14 | MSI GTX 1070 Gaming X | Corsair RM650x | Cooler Master HAF X | Logitech G400s | DREVO Excalibur 84 | Kingston HyperX Cloud II / Samson SR850 | BenQ XL2411T + LG 24MK430H-B

  2. #762
    Rennadrel
    Quote Originally Posted by Uurdz View Post
    I bought a G-Sync monitor a while back, and it's good. Love it. But it cost basically double what other monitors were going for, and now if I switch from NVIDIA to AMD it's a waste.
    Uhhh, both work interchangeably...

  3. #763
    Temp name
    Quote Originally Posted by Rennadrel View Post
    Uhhh, both work interchangeably...
    AMD GPUs don't work with G-Sync.
    G-Sync is a specialized circuit board made by Nvidia that's integrated into the monitor.

    There are G-Sync Compatible displays that don't have that module, but those aren't G-Sync monitors.
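    For what it's worth, on the application side a game generally doesn't need a separate G-Sync or FreeSync code path on Windows: it checks one DXGI capability and presents with the tearing flag, and the driver engages whichever variable-refresh tech the monitor has. A minimal sketch of that capability check (illustrative only, not code from anyone in this thread; assumes Windows 10+ with dxgi.lib linked):

    ```cpp
    // Minimal sketch (illustrative only): check whether the OS/driver allows
    // tearing-on-present, the prerequisite for variable refresh rate
    // (G-Sync, G-Sync Compatible, FreeSync) in windowed presentation on Windows.
    // The application never names a vendor; the driver picks the VRR tech the
    // attached monitor supports. Assumes Windows 10+ and linking against dxgi.lib.
    #include <windows.h>
    #include <dxgi1_5.h>
    #include <wrl/client.h>
    #include <cstdio>

    using Microsoft::WRL::ComPtr;

    int main()
    {
        ComPtr<IDXGIFactory5> factory;
        if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        {
            std::printf("DXGI 1.5 factory not available\n");
            return 1;
        }

        BOOL allowTearing = FALSE;
        if (FAILED(factory->CheckFeatureSupport(DXGI_FEATURE_PRESENT_ALLOW_TEARING,
                                                &allowTearing, sizeof(allowTearing))))
        {
            allowTearing = FALSE; // older OS/driver: no VRR-capable windowed presentation
        }

        // A swap chain created with DXGI_SWAP_CHAIN_FLAG_ALLOW_TEARING and presented
        // with DXGI_PRESENT_ALLOW_TEARING at sync interval 0 can then engage whatever
        // adaptive sync the display offers, regardless of GPU vendor.
        std::printf("Tearing/VRR-capable presentation: %s\n", allowTearing ? "yes" : "no");
        return 0;
    }
    ```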

  4. #764
    Apparently running Godfall at 4K with ultra textures requires 12 GB of VRAM, so you need to turn the textures down on many GPUs. Then again, this is an AMD-supported title, so hopefully it's the odd one out and not a trend for next-gen games.

  5. #765
    You can bet that next-gen games will use boatloads more VRAM, simply because the new consoles allow it.

    That's my no. 1 reason why I didn't even spit on the 3080 with its laughable 10 GB of VRAM - it simply isn't going to last my intended 4-year GPU cycle. I hope Nvidia ships the inevitable 3080 Ti with 12 GB of VRAM (it makes sense configuration-wise), because I fully expect select games down the road to be able to fill that buffer.

    And even that is frikkin' stingy for what will be a smidge shy of a 1-grand GPU.
    Last edited by Gaidax; 2020-11-03 at 06:20 PM.

  6. #766
    "use" VRAM (fill up) =/= actually requires that much physical VRAM to run well

    10-12 will be plenty for 4K, and even less will be needed with RTX I/O
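    For what it's worth, that allocated-versus-needed distinction is something engines manage at runtime: DXGI tells a process how much VRAM the OS will currently let it use (the budget) and how much it has actually committed (the usage), and streaming systems typically fill toward the budget with data they could evict. A rough sketch of that query (illustrative only, not anyone's actual tooling; assumes Windows 10+ with dxgi.lib linked - run standalone it only reports the calling process, so usage will be near zero while the budget shows the headroom):

    ```cpp
    // Rough sketch (illustrative only): DXGI reports both the VRAM "budget" the OS
    // grants the calling process and that process's current usage. Engines tend to
    // fill toward the budget with streaming/cache data they could evict, so a high
    // "usage" readout doesn't mean the game strictly needs that much VRAM.
    // Assumes Windows 10+ and linking against dxgi.lib.
    #include <windows.h>
    #include <dxgi1_4.h>
    #include <wrl/client.h>
    #include <cstdio>

    using Microsoft::WRL::ComPtr;

    int main()
    {
        ComPtr<IDXGIFactory4> factory;
        if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
            return 1;

        ComPtr<IDXGIAdapter1> adapter;
        for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
        {
            ComPtr<IDXGIAdapter3> adapter3;
            if (FAILED(adapter.As(&adapter3)))
                continue;

            DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
            if (SUCCEEDED(adapter3->QueryVideoMemoryInfo(
                    0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info)))
            {
                DXGI_ADAPTER_DESC1 desc = {};
                adapter->GetDesc1(&desc);
                // Budget: how much dedicated VRAM this process may use right now.
                // CurrentUsage: how much it has actually committed.
                std::wprintf(L"%s: budget %llu MiB, usage %llu MiB\n",
                             desc.Description,
                             static_cast<unsigned long long>(info.Budget) / (1024 * 1024),
                             static_cast<unsigned long long>(info.CurrentUsage) / (1024 * 1024));
            }
        }
        return 0;
    }
    ```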

  7. #767
    The rumors are plenty. https://twitter.com/kopite7kimi/stat...85556417863680 seems to be the new candidate for a 3080 Ti spec, which would make it pretty much a 3090 in gaming.

  8. #768
    Temp name
    Quote Originally Posted by mrgreenthump View Post
    The rumors are plenty. https://twitter.com/kopite7kimi/stat...85556417863680 seems to be the new candidate for a 3080 Ti spec, which would make it pretty much a 3090 in gaming.
    Haven't Nvidia already confirmed they've scrapped their plans for a 3080 Ti, though?

  9. #769
    Shakadam
    Quote Originally Posted by Life-Binder View Post
    "use" VRAM (fill up) =/= actually requires that much physical VRAM to run well

    10-12 will be plenty for 4K, and even less will be needed with RTX I/O
    12Gb probably, I have my doubts about 10GB.

    RTX I/O is basically vaporware before it's even launched. MS DirectStorage will fill the same function and have much broader availability. I don't imagine any developer will have any reason to use RTX I/O.

  10. #770
    Quote Originally Posted by Temp name View Post
    Haven't Nvidia already confirmed they've scrapped their plans for a 3080 Ti, though?
    No, just the plans for a 3080 20GB (not the same thing, as it would have the same performance, just double the VRAM).

    - - - Updated - - -

    Quote Originally Posted by Shakadam View Post
    12 GB probably; I have my doubts about 10 GB.

    RTX I/O is basically vaporware before it's even launched. MS DirectStorage will fill the same function and have much broader availability. I don't imagine any developer will have any reason to use RTX I/O.
    You mean aside from a deal with Nvidia or the like, right?

  11. #771
    Temp name
    Quote Originally Posted by Vegas82 View Post
    No, just the plans for a 3080 20GB (not the same thing, as it would have the same performance, just double the VRAM).
    Ah, I just kinda assumed they'd be the same thing.

  12. #772
    Quote Originally Posted by Temp name View Post
    Ah, I just kinda assumed they'd be the same thing.
    People are thinking the Ti will be a mildly stripped-down 3090. I’ll be happy with a 3080 if I can get one. Got everything else I need for a big system refresh... just need the damn GPU.

  13. #773
    Shakadam
    Quote Originally Posted by Vegas82 View Post
    No, just the plans for a 3080 20GB (not the same thing, as it would have the same performance, just double the VRAM).

    - - - Updated - - -

    You mean aside from a deal with Nvidia or the like, right?
    Yeah, I guess given enough money someone will make something as a showpiece for Nvidia, but beyond that I don't see any future in RTX I/O.

  14. #774
    Quote Originally Posted by Shakadam View Post
    Yeah, I guess given enough money someone will make something as a showpiece for Nvidia, but beyond that I don't see any future in RTX I/O.
    I’m just remembering HairWorks and The Witcher 3. Hehe

  15. #775
    Quote Originally Posted by Vegas82 View Post
    I’m just remembering HairWorks and The Witcher 3. Hehe
    Or the excessive amounts of tessellation in their sponsored titles.

  16. #776
    Mister K
    I do wonder how some of these AMD cards will turn out in terms of performance for video editing (Resolve user here). Hopefully Puget will get some benchmarks out soon. My hopes are low though, since NVIDIA dominates thanks to NVENC.
    Last edited by Mister K; 2020-11-04 at 07:29 PM.
    -K

  17. #777
    Quote Originally Posted by Vegas82 View Post
    I’m just remembering HairWorks and The Witcher 3. Hehe
    Looked great. Horrible performance with it enabled though.

  18. #778
    Quote Originally Posted by Thunderball View Post
    Looked great. Horrible performance with it enabled though.
    Truly the RTX of its day... at least they’re working on improving RTX.

  19. #779
    Quote Originally Posted by Mister K View Post
    I do wonder how some of these AMD cards will turn out in terms of performance for video editing (Resolve user here). Hopefully Puget will get some benchmarks out soon. My hopes are low though, since NVIDIA dominates thanks to NVENC.
    Don't get your hopes up, it's going to be shit. It would be amazing if it even does anything at launch.

    - - - Updated - - -

    Quote Originally Posted by Vegas82 View Post
    Truly the RTX of its day... at least they’re working on improving RTX.
    Well, I never really had any complaints about how it all looked (apart from DLSS 1.0, of course). I can remember as far back as early WoW looking amazing (compared to other MMOs at the time) thanks to some Nvidia tech it used. The problem was always how widespread it was (and that's not going to change - proprietary APIs are going to stay niche, although a lot of GameWorks effects are pretty much industry standard at this point) and the performance hits, which they need to work on.

  20. #780
    Quote Originally Posted by Shakadam View Post
    Yeah, I guess given enough money someone will make something as a showpiece for Nvidia, but beyond that I don't see any future in RTX I/O.
    RTX I/O is just Nvidia's marketing name for its support of Windows DirectStorage, just like RTX is for Windows DXR (and Vulkan's implementation). It's not an Nvidia-only thing.
