  1. #701
    Quote Originally Posted by Reinaerd View Post
    Going from a G-Sync Predator Z35 to a Samsung LC49HG90 with FreeSync, I don't see any difference outside of screen size: no tearing, no ghosting, it works fine.
    I think FreeSync supports both Nvidia as well as AMD, while G-Sync is an Nvidia-only thing.
    G-Sync Compatible, which is just FreeSync that Nvidia slapped a sticker on for money, works almost as well as the version with the G-Sync module itself.
    FreeSync has moved on to FreeSync 2 and been rebranded as FreeSync Premium because of this.

  2. #702
    Quote Originally Posted by Jin View Post
    Mate, that driver issue was at the start; things have changed. If you retest everything now, things will be a lot closer, and AMD will be even better on price/performance, as most likely will the 6000 series.
    1) The start is the most important part
    2) AMD has issues with almost every single driver release
    3) A lot of bugs aren't fixed for ages
    4) GPU acceleration and support for AMD-specific graphics features have always been shit

    - - - Updated - - -

    Quote Originally Posted by mrgreenthump View Post
    Hmm. Do you mean crushed by selling more? Then yeah, Nvidia is gonna crush the 6000 series too. Nvidia has always sold more than AMD, or even ATI, no matter how fast or slow their cards were compared to the competition. People just won't entertain the fact that they have competition, or that the competition is in any way viable.

    In a perfect world we'd have no fanboyism, but that's not gonna happen, so even if AMD is somehow sandbagging their performance atm, Nvidia will still likely sell more.
    Yea, AMD has had amazing hardware for years, but software has always held them back. And that has nothing to do with fanboyism; a lot of people are simply not willing to deal with AMD GPUs, even though the price/performance is usually better.

    - - - Updated - - -

    Quote Originally Posted by Jin View Post
    G-Sync Compatible, which is just FreeSync that Nvidia slapped a sticker on for money, works almost as well as the version with the G-Sync module itself.
    FreeSync has moved on to FreeSync 2 and been rebranded as FreeSync Premium because of this.
    Let's not get too fanboyish here. FreeSync is literally VESA Adaptive-Sync that AMD slapped their branding on, with zero difference. Nvidia has had just as much right to slap their G-Sync Compatible sticker on it.
    R5 5600X | Thermalright Silver Arrow IB-E Extreme | MSI MAG B550 Tomahawk | 16GB Crucial Ballistix DDR4-3600/CL16 | MSI GTX 1070 Gaming X | Corsair RM650x | Cooler Master HAF X | Logitech G400s | DREVO Excalibur 84 | Kingston HyperX Cloud II | BenQ XL2411T + LG 24MK430H-B

  3. #703
    Quote Originally Posted by Reinaerd View Post
    Going from a G-Sync Predator Z35 to a Samsung LC49HG90 with FreeSync, I don't see any difference outside of screen size: no tearing, no ghosting, it works fine.
    I think FreeSync supports both Nvidia as well as AMD, while G-Sync is an Nvidia-only thing.

    On to the GPU topic: the paper launch of the RTX 30** series left a bad taste. Combine this with their Super debacle the previous generation, and
    the fact that they kept their AIB partners under NDA so they could double their own exposure on their FE cards over the backs of their AIBs is dirty. Also their disgusting price hike the previous generation, due to crypto bollocks, over the backs of the gamers they claim to hold in such high regard.
    First time in a decade I am seriously considering going for an AMD card; price vs performance is just THAT much better.
    I bought a G-Sync monitor a while back; it's good, love it. But it cost basically double what other monitors were going for, and now if I switch from Nvidia to AMD it's a waste.

  4. #704
    Quote Originally Posted by Uurdz View Post
    I bought a G-Sync monitor a while back; it's good, love it. But it cost basically double what other monitors were going for, and now if I switch from Nvidia to AMD it's a waste.
    Dunno how old it is, but you can always sell it, and since G-Sync Compatible or FreeSync (2) monitors are cheaper, maybe it isn't as bad?

  5. #705
    Quote Originally Posted by Thunderball View Post
    Yea, AMD has had amazing hardware for years, but software has always held them back. And that has nothing to do with fanboyism; a lot of people are simply not willing to deal with AMD GPUs, even though the price/performance is usually better.
    Got to admit it's been a really long time since ATI was a thing, but from what I remember, back then they were at times miles ahead of Nvidia, with better drivers, and they still got outsold. AMD has had some bad drivers, but not really that frequently. I'd say about as frequently as Nvidia, but Nvidia has the money to fix things instantly, while from Hawaii onward AMD just had no money to put into anything.

    The Radeon 5000 launch was a bit iffy, but it wasn't really the drivers; a hardware bug caused almost everything. Granted, that is a bigger oof by far.

    All things considered, I'm quite hopeful they've gotten their things sorted out. Maybe we have to wait a while (after all, they were still hiring for important roles in the driver department not too long ago) to have everything running in good order. Guess we just have to wait and see how it goes. It's never a good idea to buy a GPU (or any tech) at launch anyway.

  6. #706
    I really want them to come out with a 3080 Ti/Super with 12GB of GDDR6X,
    maybe even on TSMC 7nm+, if possible.

    That would most likely be my pick.

  7. #707
    Quote Originally Posted by Life-Binder View Post
    I take TPU over anyone else

    We also don't know that this is the highest-end "big Navi". They just said it's an "RX 6000 series". They could have shown a glimpse of their 3080 competitor and have an even higher-end model in store for the 3090 competition.
    About a 0.1% chance of that happening.
    Good educated guess you've made right there; shows you know your stuff!

    Quote Originally Posted by Life-Binder View Post
    I really want them to come out with a 3080 Ti/Super with 12GB of GDDR6X,
    maybe even on TSMC 7nm+, if possible.

    That would most likely be my pick.
    And I would like a 6900 XTX with GDDR6X instead of plain GDDR6, as well as a wider data bus, and also on 5nm TSMC /s


    This is not how things work, dude. Chip design takes a lot of time and effort, and is node-specific. It is not as simple as printing; you can't just go to a different fab and ask them to print you new chips. It would take some work AND a lot of money; masks are not free or cheap. So yeah, they COULD do that, but it would not be even remotely feasible from a market perspective. They've got to stick with what they have, maybe cut the chips differently, maybe adjust pricing, but I don't see them porting it to a different fab. Not unless they find themselves in deep, deep trouble. They might do a "refresh" down the line, though.

  8. #708
    Quote Originally Posted by Life-Binder View Post
    I really want them to come out with a 3080 Ti/Super with 12GB of GDDR6X,
    maybe even on TSMC 7nm+, if possible.

    That would most likely be my pick.
    From the rumors, they are doing a slightly cut-down 3090 with 12 gigs of memory and calling it a 3080 Ti, and a slightly cut-down 3080 for a 3070 Ti. Hopefully they push the pricing down in line with the rasterization performance instead of trying to get a price premium for better DXR. And of course AMD counters with price drops, and everyone rejoices at the competition... except the ones who got a 3090, 3080 or 3070.

  9. #709
    Gaidax
    I'm just hopeful AMD really beats the 3080 in third-party benches, because it will make my upgrade (a 3080 Ti) come faster.

  10. #710
    Quote Originally Posted by mrgreenthump View Post
    From the rumors, they are doing a slightly cut-down 3090 with 12 gigs of memory and calling it a 3080 Ti, and a slightly cut-down 3080 for a 3070 Ti. Hopefully they push the pricing down in line with the rasterization performance instead of trying to get a price premium for better DXR. And of course AMD counters with price drops, and everyone rejoices at the competition... except the ones who got a 3090, 3080 or 3070.
    A cut-down 3090 chip (GA102-300) is the 3080 chip (GA102-200). The difference between them is so small that I don't think there's space to make something in between. Making a GPU on the 3090 chip but with 12GB of VRAM is going to kill off the 3090, which Nvidia won't let happen.
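    For the napkin math, here's a quick sketch of the known GA102 cuts (SM counts and bus widths are the public spec-sheet numbers; the 78-SM row is a hypothetical in-between cut, matching the 9984-core rumor that comes up later in the thread):

    Code:
    # Rough comparison of GA102 configurations. Spec-sheet numbers, except
    # the "rumored 3080 Ti" row, which is forum speculation only.
    CUDA_PER_SM = 128  # Ampere: 128 FP32 CUDA cores per SM

    configs = {
        "GA102 full die":   (84, 384),  # (SMs, memory bus width in bits)
        "3090 (GA102-300)": (82, 384),
        "rumored 3080 Ti":  (78, 384),  # speculation only
        "3080 (GA102-200)": (68, 320),
    }

    for name, (sms, bus) in configs.items():
        print(f"{name:18s} {sms} SMs = {sms * CUDA_PER_SM:5d} CUDA cores, {bus}-bit bus")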

  11. #711
    Gaidax
    Quote Originally Posted by Thunderball View Post
    A cut-down 3090 chip (GA102-300) is the 3080 chip (GA102-200). The difference between them is so small that I don't think there's space to make something in between. Making a GPU on the 3090 chip but with 12GB of VRAM is going to kill off the 3090, which Nvidia won't let happen.
    They definitely have headroom to cut down the full GA102 in such a way that it's better than what the 3080 has now.

    It will be the usual Titan vs Ti thing, where the Ti is practically the Titan with just one cluster off, and in this case half the memory too, or maybe even a bit less.

  12. #712
    Quote Originally Posted by Gaidax View Post
    They definitely have headroom to cut down the full GA102 in such a way that it's better than what the 3080 has now.

    It will be the usual Titan vs Ti thing, where the Ti is practically the Titan with just one cluster off, and in this case half the memory too, or maybe even a bit less.
    Do you have evidence the chip has the "headroom", as you call it? It's more likely they will just release Super versions that bump up the voltage applied to the cards than anything else.

  13. #713
    The difference between the 3090 and the 3080 is ~15%;
    enough room for a Ti with 12GB and a 384-bit bus.
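    For reference, the bus math behind that (a rough sketch; the 19 and 19.5 Gbps GDDR6X data rates are the public spec-sheet numbers, and the 384-bit Ti is just the speculation above):

    Code:
    # Memory bandwidth = (bus width in bits / 8 bits per byte) * per-pin data rate.
    def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
        return bus_width_bits / 8 * data_rate_gbps

    print(bandwidth_gb_s(384, 19.5))  # 3090: 936.0 GB/s
    print(bandwidth_gb_s(320, 19.0))  # 3080: 760.0 GB/s
    print(bandwidth_gb_s(384, 19.0))  # hypothetical 384-bit Ti: 912.0 GB/s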

  14. #714
    Quote Originally Posted by Gaidax View Post
    They definitely have headroom to cut down the full GA102 in such a way that it's better than what the 3080 has now.

    It will be the usual Titan vs Ti thing, where the Ti is practically the Titan with just one cluster off, and in this case half the memory too, or maybe even a bit less.
    The usual Titan vs Ti thing already exists in the 3090 vs the 3080. The 3090 is the Titan, just rebranded.

    - - - Updated - - -

    Quote Originally Posted by Life-Binder View Post
    The difference between the 3090 and the 3080 is ~15%;
    enough room for a Ti with 12GB and a 384-bit bus.
    A 3080 Ti with a 384-bit bus and 12GB of VRAM is literally a 3090 with 12GB of VRAM, so it's going to perform the same in gaming.

  15. #715
    Quote Originally Posted by Thunderball View Post
    The 3090 is the Titan, just rebranded.
    Yeah, no it isn't. It uses GeForce drivers and is a GeForce card. Titans use professional drivers specific to them.

    Quote Originally Posted by Thunderball View Post
    A 3080 Ti with a 384-bit bus and 12GB of VRAM is literally a 3090 with 12GB of VRAM, so it's going to perform the same in gaming.
    Well yeah, that's the rumor: that they are preparing to release a 9984 CUDA core GA102 with 12 gigs of memory, which is a barely cut-down 3090, and really they should release it. The 6900 XT is so close and is $500 cheaper, so matching AMD with a slightly lower price would kill the 6900 XT. Unless AIBs find a way to clock the 6900 XT higher with a lot more power; after all, there is a rumor of an Asus 6800 XT hitting above 2.5GHz boost clocks.

    The 3090 will still remain for those who truly need the extra VRAM.

  16. #716
    Quote Originally Posted by Thunderball View Post
    A cut-down 3090 chip (GA102-300) is the 3080 chip (GA102-200). The difference between them is so small that I don't think there's space to make something in between. Making a GPU on the 3090 chip but with 12GB of VRAM is going to kill off the 3090, which Nvidia won't let happen.
    Exactly right. The performance difference between the two is so slight, there's really no room for a 3080 Ti. I have the EVGA 3080 FTW3 Ultra, and overclocked it reaches almost what a stock 3090 can do. VRAM hasn't been a limiting factor yet, obviously; otherwise a 3090 with 24GB of GDDR6X would benchmark significantly higher than a 3080. I never understood why people thought Nvidia would bring out a 3080 Ti with 20GB of VRAM. It would produce literally the same results as the current 3080, yet cost a lot more, as well as bog down Nvidia's already limited production. It would make no sense.

    AMD looks to have a strong lineup this time, which is awesome. Competition is great, and the end result is better options for everyone and often more reasonable prices. I'm reserving judgment until benchmarks, though, as just because something looks great on paper doesn't mean it necessarily translates directly to real-world results.

    In the end, I don't care which is better. It's just cool that we're at a point where there might actually be options.

  17. #717
    Quote Originally Posted by Uurdz View Post
    I bought a G-Sync monitor a while back; it's good, love it. But it cost basically double what other monitors were going for, and now if I switch from Nvidia to AMD it's a waste.
    Yeah, I had the same thought about my second screen, but I've decided to sell it to a friend for a small price. That way he gets a decent screen for his 3070, and I get some cash toward the new GPU.

    - - - Updated - - -

    Quote Originally Posted by mrgreenthump View Post
    From the rumors, they are doing a slightly cut-down 3090 with 12 gigs of memory and calling it a 3080 Ti, and a slightly cut-down 3080 for a 3070 Ti. Hopefully they push the pricing down in line with the rasterization performance instead of trying to get a price premium for better DXR. And of course AMD counters with price drops, and everyone rejoices at the competition... except the ones who got a 3090, 3080 or 3070.
    Yes, they did cancel their double-memory versions of the 3080/3070 due to supply-chain issues:
    https://www.techpowerup.com/273637/n...070-16-gb?cp=2

    So that's probably the route they will take. I would much rather have had them drop prices a bit; they owe us for punishing gamers during the false crypto boom.

  18. #718
    Quote Originally Posted by mrgreenthump View Post
    Yeah, no it isn't. It uses GeForce drivers and is a GeForce card. Titans use professional drivers specific to them.
    No, they don't. They use the same kind of driver other GeForce cards use.

    Quote Originally Posted by mrgreenthump View Post
    Well yeah, that's the rumor: that they are preparing to release a 9984 CUDA core GA102 with 12 gigs of memory, which is a barely cut-down 3090, and really they should release it. The 6900 XT is so close and is $500 cheaper, so matching AMD with a slightly lower price would kill the 6900 XT. Unless AIBs find a way to clock the 6900 XT higher with a lot more power; after all, there is a rumor of an Asus 6800 XT hitting above 2.5GHz boost clocks.

    The 3090 will still remain for those who truly need the extra VRAM.
    Yea, rumors are stupid sometimes.

    - - - Updated - - -

    Quote Originally Posted by Chult View Post
    AMD looks to have a strong lineup this time, which is awesome. Competition is great, and the end result is better options for everyone and often more reasonable prices. I'm reserving judgment until benchmarks, though, as just because something looks great on paper doesn't mean it necessarily translates directly to real-world results.
    I don't think prices are that unreasonable this time, bar the 3090, which is supposed to be prohibitively expensive. AMD is not an option for me and a lot of other people because of the drivers and the lack of CUDA (with no real alternative to it).

  19. #719
    Titan
    10+ Year Old Account
    Join Date
    Oct 2010
    Location
    America's Hat
    Posts
    14,143
    Quote Originally Posted by Uurdz View Post
    I bought a Gsync monitor a while back, its good. Love it. But it cost basically double what other monitors were going for and now if I switch to AMD from NVIDIA its a waste.
    Uhhh, both work interchangeably...

  20. #720
    Please wait Temp name's Avatar
    10+ Year Old Account
    Join Date
    Mar 2012
    Location
    Under construction
    Posts
    14,631
    Quote Originally Posted by Rennadrel View Post
    Uhhh, both work interchangably...
    AMD GPUs don't work with G-Sync.
    G-Sync is a specialized circuit board made by Nvidia that's integrated into the monitor.

    There are G-Sync Compatible displays that don't have that module, but those aren't G-Sync monitors.
