  1. #141
    Quote Originally Posted by msdos View Post
    What does this even mean and why are you responding so seriously?!?!?! Holy crap, a robot is addressing me. (About to go on ignore because he doesn't make sense!!)
    It makes perfect sense if you know a thing or two about GPUs and their marketing.

  2. #142
    I'm happy with my current 1070, but honestly I might just upgrade for Cyberpunk anyway.

  3. #143
    pansertjald
    Quote Originally Posted by msdos View Post
    What does this even mean and why are you responding so seriously?!?!?! Holy crap, a robot is addressing me. (About to go on ignore because he doesn't make sense!!)
    It's you who doesn't make sense. Temp is right.
    AMD Ryzen 7 7800X3D: Gigabyte X670 Aorus Elite AX: G.Skill Trident Z5 Neo RGB DDR5-6000 C30: PowerColor Radeon RX 7900 GRE Hellhound OC: CORSAIR HX850i: Samsung 960 EVO 250GB NVMe: FiiO E10K: Lian Li PC-O11 Dynamic XL

  4. #144
    Temp name
    Quote Originally Posted by Hoofey View Post
    It makes perfect sense if you know a thing or two about GPUs and their marketing.
    It doesn't need to be GPU marketing, just marketing in general.
    You probably don't see a lot of ads for Ferraris or Bugattis, or any other similar product, because of the mindshare they already have. Why would Ferrari pay someone for an ad when they have millions of people around the world talking about their cars for free already? But you'll see a lot of Suzuki ads because... well, it's Suzuki; no one's going to go on a 10-minute rant about how great their Suzuki is.

    The same thing works with GPUs. Having the best of something means you're more likely to have people discuss your product for free, so you need less money to get the same result from a marketing campaign.

    Which is also part of why Apple does a bunch of stupid shit like a $1,000 monitor stand. It gets them into your head and gets the media to cover their stuff for free. It's fantastic mindshare.

  5. #145
    Awelon
    As long as the price of the 3080 Ti isn't over €1500, I'm alright with it. It feels kinda cheap to buy the 3060 or 3070 version.

  6. #146
    I usually stick one series behind; I'm still on a 1080 Ti. I might upgrade to a 2080 Ti when the price drops, but who knows, I might go for a 3080.

  7. #147
    Quote Originally Posted by Awelon View Post
    As long as the price of the 3080 Ti isn't over €1500, I'm alright with it. It feels kinda cheap to buy the 3060 or 3070 version.
    AFAIK there is no 3080 Ti, at least not yet. Because of the price hike, it's called the 3090 now. They may push a 3080 Ti later if AMD manages to compete.

  8. #148
    Awelon
    Quote Originally Posted by mrgreenthump View Post
    AFAIK there is no 3080 Ti, at least not yet. Because of the price hike, it's called the 3090 now. They may push a 3080 Ti later if AMD manages to compete.
    That's what I'm afraid of. It's kind of a stretch to spend nearly €2000 on a GPU (that's of course only a guess, but I'm quite sure it will be closer to 2k), and I can't justify spending that much on it.

  9. #149
    Quote Originally Posted by Awelon View Post
    That's what I'm afraid of. It's kind of a stretch to spend nearly €2000 on a GPU (that's of course only a guess, but I'm quite sure it will be closer to 2k), and I can't justify spending that much on it.
    My limit is really €600 with taxes. I don't want to support pricing higher than that.

  10. #150
    I'm waiting for them incredibly eagerly; I want to upgrade my hardware as soon as possible.

  12. #152
    Quote Originally Posted by mrgreenthump View Post
    AFAIK there is no 3080 Ti, at least not yet. Because of the price hike, it's called the 3090 now. They may push a 3080 Ti later if AMD manages to compete.
    There was no 3080 Ti planned. The card the leaks were calling the 3080 Ti was always the 3090.
    R5 5600X | Thermalright Silver Arrow IB-E Extreme | MSI MAG B550 Tomahawk | 16GB Crucial Ballistix DDR4-3600/CL16 | MSI GTX 1070 Gaming X | Corsair RM650x | Cooler Master HAF X | Logitech G400s | DREVO Excalibur 84 | Kingston HyperX Cloud II | BenQ XL2411T + LG 24MK430H-B

  13. #153
    Quote Originally Posted by Temp name View Post
    That's... a huge jump between the two. I figured we'd have the 3090 with 12-16GB. 24GB is Titan/Quadro territory.
    OK, a bit of speculation/reasoning here that I derived from the leaks and confirmations.

    IMHO the 3090 is the Titan: a straight-up name replacement for a product that was "too much" for gaming and not really aimed at workstations. With the name change, it gets somewhat more "identity" as the super-powerful "want it all, have it all" gaming GPU. Like the Titan, it's something very few people will actually get, and we will all drool at the incredible performance. Just a marketing move.

    As for the memory: the standard PCB layout is the GPU in the middle with four memory modules on each of three sides. Lower-end cards have empty slots around the GPU, and even cards like the 1080 Ti with 11GB have an "empty spot" for a possible additional memory chip.

    So the 3090 has 24GB because it uses "double-sided" memory. It also fits with how the cooler is designed: the back fan is attached to a second vapor chamber/heatpipe assembly so it can cool the memory on the back of the PCB, and IIRC that's been shown in some leaked images. 24GB is simply 12GB x 2 (all slots filled by 1GB modules on each side). You can see what the PCB looks like in this leaked image (so, all precautions taken). EDIT: if the image is real, it's 100% not an FE card. It still has the 3x8-pin configuration, and you can see the cooler on the upper left is a standard one.
    The big price jump from 3080 to 3090 is "justified" since the cooler is way more costly to make if it needs to be built this way.
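
    To put rough numbers on that, here's a quick sketch of the capacity math in Python. The 12-slots-per-face and 1GB-module figures are the guesses from this post, not confirmed specs:
    Code:
    # Speculative VRAM math from the leak discussion above.
    # Assumes 12 populated memory slots per PCB face and 1GB modules --
    # the numbers guessed at in this post, not confirmed specs.
    slots_per_face = 12
    faces = 2               # "double-sided" memory: front and back of the PCB
    module_size_gb = 1      # 1GB per memory module

    total_vram_gb = slots_per_face * faces * module_size_gb
    print(total_vram_gb)    # 24 -> matches the rumored 3090 capacity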

    As for the 3080 having only 10GB, again I think it's a marketing move to leave room for a Ti/Super model, so they can easily add 2GB more and make it more powerful. I assume they're going to "move down the ladder one step" like with the 2000 series, where the 2070 Super actually has the GPU of a 2080; so, to make a 3080 Super different enough, they're going to add more memory.

    At least this makes sense in my head
    Last edited by Coldkil; 2020-08-28 at 12:56 PM.
    Don't trust me if your heart fails you.

  14. #154
    Shakadam
    Quote Originally Posted by Coldkil View Post
    As for the 3080 having only 10GB, again I think it's a marketing move to leave room for a Ti/Super model, so they can easily add 2GB more and make it more powerful. I assume they're going to "move down the ladder one step" like with the 2000 series, where the 2070 Super actually has the GPU of a 2080; so, to make a 3080 Super different enough, they're going to add more memory.

    At least this makes sense in my head
    That wouldn't work, because it uses a 320-bit memory bus, so the only options are 10GB or 20GB of VRAM. Presumably the AIB partners will make some 20GB models as well.
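
    For anyone wondering where those two options come from, here's a minimal sketch of the bus math. It assumes standard GDDR6 modules with a 32-bit interface and 1GB or 2GB densities (general memory-spec knowledge, not anything confirmed about this card):
    Code:
    # Why a 320-bit bus only allows 10GB or 20GB:
    # each GDDR6 module has a 32-bit interface, so the bus width fixes
    # the module count, and modules come in 1GB or 2GB densities.
    bus_width_bits = 320
    bits_per_module = 32
    modules = bus_width_bits // bits_per_module   # 10 modules

    for density_gb in (1, 2):                     # available densities
        print(f"{modules * density_gb}GB")        # 10GB, then 20GB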

  15. #155
    I have a 1080 Ti, and the only reason I'd upgrade is for ray tracing in WoW, since I don't think the 1080 Ti supports ray tracing. I'm not impressed by 4K gaming when my screen is only 27 inches or so, thus no need for it. I use 2K/120Hz just fine.
    Member: Dragon Flight Alpha Club, Member since 7/20/22

  16. #156
    Temp name
    Quote Originally Posted by Coldkil View Post
    OK, a bit of speculation/reasoning here that I derived from the leaks and confirmations.

    IMHO the 3090 is the Titan: a straight-up name replacement for a product that was "too much" for gaming and not really aimed at workstations. [...]

    As for the 3080 having only 10GB, again I think it's a marketing move to leave room for a Ti/Super model, so they can easily add 2GB more and make it more powerful. [...]

    At least this makes sense in my head
    So what would be the 2080 Ti replacement? An unannounced/unleaked 3080 Ti? That seems weird at this point, considering how the 3090, 3080, and 3070 have all been leaked to fuck and back.

    Given the price of the 3090 it makes sense, but they might also just pull a 2080 Ti and bump the price up another $300, because fuck the consumer.

  17. #157
    Quote Originally Posted by Strawberry View Post
    I sold my 1080 Ti back in April; I've been using the integrated GPU since then, lol.

    I'm thinking about getting the 3080 Ti, but there are a ton of rumors floating around.
    The 3090 with 12GB or 24GB (prices vary from $1399 to $2000).
    No 3080 Ti; a straight jump from the 3080 to the 3090, with the 3080 costing $999.
    Performance: the 3080 being 30% faster than the 2080 Ti, the 3090 being 50% faster than the 2080 Ti.
    Nvidia using Samsung's 8nm for their next-gen GPUs, with probable 7nm TSMC GPUs coming next year branded as Super. This lines up with the 3080/3090 and no 3080 Ti; now they can milk more money from the top-tier card (3090 Super).
    I believe the 3080 Ti/3090 will be a bigger jump from the 2080 Ti than the 2080 Ti was from the 1080 Ti. The 2xxx series introduced DLSS and RTX, and I think that's why Nvidia raised the price. I don't expect 3xxx GPUs to be cheaper, but since they will just introduce improvements and there will be a die shrink, I expect more performance for the same cost, or $100-200 more for the top-tier GPUs.

    AMD has been extremely quiet; there are basically no leaks or rumors.
    We only know that the specs of the next-gen consoles could mean AMD's top card should/could be some 30% faster than the 2080 Ti. That makes it competitive with the 3080, but not the 3090.

    Intel is talking about releasing their high-performance GPUs next year. I don't buy this at all, at least not them being competitive with top AMD/Nvidia cards, but it will be interesting to see. I welcome a third player.

    -----

    But with all this information floating around, I'm thinking of other options.
    I can afford a €1500 3080 Ti or 3090, and I want a top-tier card since I game at 4K/120Hz.

    However...

    I won't be paying €2000 for the top-tier NON-TITAN GPU.
    The next-gen consoles are quite powerful, and they will cost around €500-600. I have already put aside €1500 for the 3080 Ti/3090.
    But if I sell my GPU-less computer (a 9900K clocked to 5GHz on all cores, 32GB of 3600MHz RAM, water cooling, 2TB of M.2 NVMe SSDs, a Creative sound card, and a pretty expensive case/mobo; it's a computer built not just to be fast but to look good as well), I could get some €1000-1200. And if I sell my 4K/120Hz screen, I'd gather another €700-800 (it's Asus's 43" monitor, which costs about €1200 new).
    If I sell everything, I'd have around €3400. €2500 for a laptop with a 2070 Max-P/2080 Max-Q so I can play some games like WoW, €500 for a good 1440p monitor, and €500 for a console.
    They say vote with your wallet, so I just might do that if Nvidia pulls some crazy shit this gen.
    11 days until Nvidia's GeForce event. I assume it won't take long after that until we hear some leaks from AMD.

    What's your plan?
    I can't justify spending 4x the price of my entire PC on just a GPU.

  18. #158
    https://www.youtube.com/watch?v=onZqz0JYtMQ

    Now this is an interesting video. It even claims that the top Ampere cards will see a performance loss on PCIe 3.0 (no numbers on it, though).
    Last edited by Hoofey; 2020-08-30 at 01:55 AM.

  19. #159
    Temp name
    Quote Originally Posted by Hoofey View Post
    https://www.youtube.com/watch?v=onZqz0JYtMQ

    Now this is an interesting video. It even claims that the top Ampere cards will see a performance loss on PCIe 3.0 (no numbers on it, though).
    I'm not buying that it'll perform better on PCIe 4.0 than PCIe 3.0. It would need to be literally twice as powerful as a 2080 Ti, or in some other way crunch through the data it gets twice as fast. And even if it is twice as fast, judging by the difference in 2080 Ti performance between an x8 and an x16 slot, we'd be looking at 1-5% better/worse, so for a noticeable difference it would need to be more than twice as fast.
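
    To sanity-check that with raw numbers, here's a small sketch using the standard PCIe bandwidth figures (the 1-5% scaling estimate above is the post's own, from 2080 Ti x8 vs. x16 tests):
    Code:
    # Effective PCIe bandwidth per lane in GB/s (8 GT/s and 16 GT/s
    # transfer rates with 128b/130b encoding overhead).
    per_lane_gb_s = {"3.0": 0.985, "4.0": 1.969}

    for gen, bw in per_lane_gb_s.items():
        print(f"PCIe {gen} x16 ~= {bw * 16:.1f} GB/s")
    # PCIe 3.0 x16 ~= 15.8 GB/s, PCIe 4.0 x16 ~= 31.5 GB/s.
    # A 3.0 x8 slot (~7.9 GB/s) is the halved-bandwidth case the post
    # uses to estimate how much a Gen4-capable card would lose on Gen3.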

    Also, I doubt Nvidia would spread their launch out that much; they've never done it before.

  20. #160
    Quote Originally Posted by Temp name View Post
    I'm not buying that it'll perform better on PCIe 4.0 than PCIe 3.0. It would need to be literally twice as powerful as a 2080 Ti, or in some other way crunch through the data it gets twice as fast. And even if it is twice as fast, judging by the difference in 2080 Ti performance between an x8 and an x16 slot, we'd be looking at 1-5% better/worse, so for a noticeable difference it would need to be more than twice as fast.

    Also, I doubt Nvidia would spread their launch out that much; they've never done it before.
    He didn't even say anything to that effect. He said "Nvidia is requiring reviewers of the Quadro series to use a 3960X, because of NVLink and SLI". Nowhere did he say a 3090 or 3080 would saturate a PCIe 3.0 x16 slot.

    ---
    Edit: Now that I listen to it again, yeah... he is making it sound like you shouldn't buy Intel because of PCIe 3.0, when he also says that current boards won't support next-gen CPUs because of power requirements. So yeah, it is sort of silly to think 3.0 somehow magically gets saturated in the foreseeable future.
    Last edited by mrgreenthump; 2020-08-30 at 08:09 AM.
