Page 2 of 2
  1. #21
    Shakadam
    Quote Originally Posted by Thunderball View Post


    Which is only usable for SATA M.2 SSDs, which pretty much don't exist.



    Joke Unboxed? Lovely. Another one who has no clue how Precision Boost works. Temperature indeed doesn't matter if you have an OC'd CPU. It does matter when the CPU is boosting itself, though.

    I'm all for choosing a B550 board if you have the money and want to upgrade in the future. But upgrading with that VRM? I don't think so.
    It supports two NVMe M.2 SSDs: one at PCIe 4.0 x4 and one at PCIe 3.0 x2.


    The Tomahawk uses a Vcore VRM with 4 phases and doubled high- and low-side MOSFETs; the Pro4 uses a phase-doubled 6-phase Vcore VRM with single high- and low-side MOSFETs of roughly similar quality. There's not exactly a mile-wide quality difference between those VRMs.

    You've been listening too much to Buildzoid if you judge a motherboard on nothing but its VRM quality. The truth is that unless the VRM is trash tier and throttles a stock CPU, it doesn't matter for 99.9% of people who buy a PC.
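
    (For a rough sense of the scale involved, a minimal back-of-the-envelope sketch in Python. Every number here is an assumption for illustration: a ~142 W package power limit, ~1.25 V Vcore, 90% conversion efficiency. Nothing is measured from these boards.)

        def amps_per_phase(package_w, vcore_v, efficiency, phases):
            """Total Vcore current, split evenly across phases (idealized)."""
            total_amps = package_w / (vcore_v * efficiency)
            return total_amps / phases

        # Assumed load: ~142 W package power at ~1.25 V Vcore, 90% VRM efficiency.
        PPT_W, VCORE_V, EFF = 142, 1.25, 0.90

        # Tomahawk-style: 4 phases, doubled high- and low-side MOSFETs per phase.
        # Pro4-style: 6 doubled phases, single MOSFETs per side.
        print(f"4 phases: {amps_per_phase(PPT_W, VCORE_V, EFF, 4):.0f} A per phase")
        print(f"6 phases: {amps_per_phase(PPT_W, VCORE_V, EFF, 6):.0f} A per phase")

    Roughly 32 A versus 21 A per phase, with the 4-phase design splitting its share across two MOSFET pairs: close enough that MOSFET quality and cooling matter more than the raw phase count.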

  2. #22
    Temp name
    Quote Originally Posted by Shakadam View Post
    We really don't know that yet. I mean, yes, of course it will work fine with PCIe 3.0, but we don't yet know if there might be some performance benefits with PCIe 4.0.
    Quote Originally Posted by Thunderball View Post
    It would perform way better with Gen 4 instead of Gen 3, especially with all of its hardware being properly used.
    The only reason it will perform better on PCIe 4 than on PCIe 3 is if it cannot get enough bandwidth on PCIe 3 to properly keep the cores working.
    For that to be the case, you either need to be running a PCIe benchmark or have a GPU literally twice as powerful as a 2080 Ti.

    Unless PCIe 4 offers latency improvements over PCIe 3, of course, but I don't think it does. At least I've never heard that it would.
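
    (To put numbers on that, a quick sketch of the theoretical one-direction bandwidth per generation. These are the standard per-lane figures after encoding overhead, not measurements.)

        # Theoretical one-direction bandwidth per PCIe generation.
        GBPS_PER_LANE = {
            "PCIe 2.0": 0.500,  # 5 GT/s, 8b/10b encoding
            "PCIe 3.0": 0.985,  # 8 GT/s, 128b/130b encoding
            "PCIe 4.0": 1.969,  # 16 GT/s, 128b/130b encoding
        }

        for gen, per_lane in GBPS_PER_LANE.items():
            print(f"{gen} x16: {per_lane * 16:.1f} GB/s")
        # PCIe 2.0 x16:  8.0 GB/s
        # PCIe 3.0 x16: 15.8 GB/s
        # PCIe 4.0 x16: 31.5 GB/s

    So a card that only just saturates 2.0 x16 would need to move twice the data to stress 3.0 x16, and four times as much before 4.0 x16 buys anything.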

  3. #23
    Thunderball
    Quote Originally Posted by Shakadam View Post
    It supports two NVMe M.2 SSDs: one at PCIe 4.0 x4 and one at PCIe 3.0 x2.
    Yep, the second slot runs off the chipset, which is going to make it unusable.

    Quote Originally Posted by Shakadam View Post
    The Tomahawk uses a Vcore VRM with 4 phases and doubled high- and low-side MOSFETs; the Pro4 uses a phase-doubled 6-phase Vcore VRM with single high- and low-side MOSFETs of roughly similar quality. There's not exactly a mile-wide quality difference between those VRMs.
    There's a mile-wide difference in the quality of those MOSFETs. The reason the Tomahawk has the reputation it does is that back in the B450 days everyone was using 4C10s and 4C06s while MSI was using 4C29s and 4C24s. ASRock was using SM4337s and SM4336s, which are even worse, and the B550 Pro4's VRM is pretty much the same as it's been, so the MOSFETs are still the same.

    Quote Originally Posted by Shakadam View Post
    You've been listening too much to Buildzoid if you judge a motherboard on nothing but its VRM quality. The truth is that unless the VRM is trash tier and throttles a stock CPU, it doesn't matter for 99.9% of people who buy a PC.
    I'm not an overclocker like Buildzoid, i.e. I don't judge a motherboard just by its VRM capability, but I certainly prefer to suggest quality VRMs for a gaming PC, knowing how much people hate cleaning their PCs.

    - - - Updated - - -

    Quote Originally Posted by Temp name View Post
    The only reason it will perform better on PCIe 4 than on PCIe 3 is if it cannot get enough bandwidth on PCIe 3 to properly keep the cores working.
    For that to be the case, you either need to be running a PCIe benchmark or have a GPU literally twice as powerful as a 2080 Ti.

    Unless PCIe 4 offers latency improvements over PCIe 3, of course, but I don't think it does. At least I've never heard that it would.
    Cores are going to keep working regardless, but you're gonna lose A LOT of frametime consistency.
    R5 5600X | Thermalright Silver Arrow IB-E Extreme | MSI MAG B550 Tomahawk | 16GB Crucial Ballistix DDR4-3600/CL16 | MSI GTX 1070 Gaming X | Corsair RM650x | Cooler Master HAF X | Logitech G400s | DREVO Excalibur 84 | Kingston HyperX Cloud II | BenQ XL2411T + LG 24MK430H-B

  4. #24
    Temp name
    Quote Originally Posted by Thunderball View Post
    Cores are going to keep working regardless, but you're gonna lose A LOT of frametime consistency.
    Based on what data?

  5. #25
    Thunderball
    Quote Originally Posted by Temp name View Post
    Based on what data?
    CPUs and GPUs are designed for different kinds of workloads: CPUs are designed for sequential code execution, GPUs for parallel execution. The interaction (simply put) is: CPU cores > CPU cache > system bus (PCIe, or the memory controller if interacting with RAM) > GPU > display. That's the hardware; the software side is not that important here, so I'll leave the API out. When there's a bottleneck anywhere between the CPU and GPU that's not related to the driver/API, you start losing fps and frametime consistency, because the CPU doesn't feed the GPU enough work to keep all the "threads" going. The design of the GPU ultimately fails: the CPU and GPU are still doing cycles, but the GPU is doing less work because it cannot utilize all its cores. If the bottleneck is on the GPU side (the GPU itself or VRAM), which is intended, you're not going to lose frametime consistency.
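
    (To make "frametime consistency" concrete, a minimal sketch. The frame times are invented for illustration, not benchmark data; the usual metric is the "1% low", i.e. the average fps over the slowest 1% of frames.)

        def one_percent_low_fps(frametimes_ms, pct=1.0):
            """Average fps over the slowest pct% of frames."""
            worst = sorted(frametimes_ms, reverse=True)
            n = max(1, int(len(worst) * pct / 100))
            return 1000 / (sum(worst[:n]) / n)

        steady = [16.7] * 100               # even pacing, ~60 fps throughout
        spiky = [14.0] * 95 + [60.0] * 5    # similar average, occasional stalls

        for name, times in (("steady", steady), ("spiky", spiky)):
            avg = 1000 / (sum(times) / len(times))
            print(f"{name}: avg {avg:.0f} fps, 1% low {one_percent_low_fps(times):.0f} fps")

    Both runs average about 60 fps, but the spiky one drops to a 1% low of roughly 17 fps: same average framerate, much worse consistency.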

  6. #26
    Temp name
    Quote Originally Posted by Thunderball View Post
    CPUs and GPUs are designed for different kinds of workloads: CPUs are designed for sequential code execution, GPUs for parallel execution. The interaction (simply put) is: CPU cores > CPU cache > system bus (PCIe, or the memory controller if interacting with RAM) > GPU > display. That's the hardware; the software side is not that important here, so I'll leave the API out. When there's a bottleneck anywhere between the CPU and GPU that's not related to the driver/API, you start losing fps and frametime consistency, because the CPU doesn't feed the GPU enough work to keep all the "threads" going. The design of the GPU ultimately fails: the CPU and GPU are still doing cycles, but the GPU is doing less work because it cannot utilize all its cores. If the bottleneck is on the GPU side (the GPU itself or VRAM), which is intended, you're not going to lose frametime consistency.
    But there's not going to be a bottleneck between the CPU and GPU.
    A 2080 Ti is just barely saturating a PCIe 2.0 x16 slot; a 3080 Ti would have to chew through data literally twice as fast to saturate a 3.0 x16 slot.
    Even the most optimistic leaks I've seen don't put the 3000 series at twice as fast as the 2000 series.

  7. #27
    And keep in mind, that's not "the card produces twice the framerate" - rather, the card would have to require twice as much data bandwidth.

    Those are two different things. Given efficiency increases from a new process and new architecture, it's possible to get more performance out of a card and NOT see a steep jump in how much bandwidth it requires. They aren't in lock-step.

  8. #28
    Thunderball
    Quote Originally Posted by Temp name View Post
    But there's not going to be a bottleneck between the CPU and GPU.
    A 2080 Ti is just barely saturating a PCIe 2.0 x16 slot
    That's not true. All the tests that have explored that have been 1) extremely flawed - CPU bottlenecks in scenarios like that; 2) didn't use the RT hardware at all.

    EDIT: That's not my point though. My point is that a pre-top GPU of the next generation (i.e. a "4080" or whatever it's going to be called) will be able to leverage PCIe Gen 4.
    Last edited by Thunderball; 2020-08-21 at 01:58 PM.

  9. #29
    Temp name
    Quote Originally Posted by Thunderball View Post
    That's not true. All the tests that have explored that have been 1) extremely flawed - CPU bottlenecks in scenarios like that; 2) didn't use the RT hardware at all.

    EDIT: That's not my point though. My point is that a pre-top GPU of the next generation (i.e. a "4080" or whatever it's going to be called) will be able to leverage PCIe Gen 4.
    But considering the budget OP is working with, it seems unlikely that they'll upgrade to a new GPU when the next gen launches. Most people get new computers once every 5+ years and don't make incremental upgrades.

  10. #30
    Thunderball
    Quote Originally Posted by Temp name View Post
    But considering the budget OP is working with, it seems unlikely that they'll upgrade to a new GPU when the next gen launches. Most people get new computers once every 5+ years and don't make incremental upgrades.
    I'm still waiting for chiplet tech to finally make it into GPUs so we can get back to increasing the hardware on a single GPU. Nvidia has been working on it for a while, and it's not out of hand to expect it to arrive very soon. Maybe Intel will do it with their GPUs, since they're already doing it with CPUs. This is obviously going to drop GPU prices a lot.

  11. #31
    Quote Originally Posted by Thunderball View Post
    That's not true.
    Yeah, it is.

    Quote Originally Posted by Thunderball View Post
    All the tests that have explored that have been 1) extremely flawed - CPU bottlenecks in scenarios like that;
    1 - Source on that absurd claim? And if you're thinking about posting something from Failware Unboxed: they might as well be the dictionary definition of "extremely flawed".

    2 - There is no amount of CPU bottlenecking that would reduce the PCIe bandwidth used by a GPU by over 50%. A 2080 Ti JUST BARELY saturates a PCIe 2.0 x16 slot.

    Quote Originally Posted by Thunderball View Post
    2) didn't use the RT hardware at all.
    Uhh... cool? The RT hardware doesn't require exponentially more bandwidth, because it's done in hardware, on the card. The only additional information that has to get sent is "ray trace from these light sources", and the hardware does the rest.

    Quote Originally Posted by Thunderball View Post
    EDIT: That's not my point though. My point is that a pre-top GPU of the next generation (i.e. a "4080" or whatever it's going to be called) will be able to leverage PCIe Gen 4.
    For that to be a thing, the 3080/Ti would have to literally DOUBLE the amount of data sent through the PCIe lanes. And then so would the 4000 series (in order for PCIe 4.0 to be necessary).

    And you'd have to be putting a Halo Product or top-level Enthusiast product in a budget rig.

    Which almost no one does.

    I know it's hard for AMD grognard fanbois to parse, but PCIe 4.0 is YEARS away from having any meaningful impact for a consumer. For a prosumer, or a full-on professional who wants to run tons of GPUs for rendering or a specific app-related reason (a lot of CAD software will take advantage of every CUDA core you can throw at it), then yeah, PCIe 4.0 has immediate application.

    But for gamers and average users? None. Hell, they could simply add some additional PCIe 3.0 lanes to consumer CPUs and that could push off PCIe 4.0 being relevant to consumers for 4-6 years (because the primary use that a gamer/daily user is going to get out of PCIe lanes is installing additional NVMe drives without sucking bandwidth away from the GPU; add 8 more lanes to the consumer platforms and the average consumer could safely ignore PCIe 4.0 for years).
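
    (For context, a sketch of the lane budget this is talking about. The counts below are the usual consumer arrangement, e.g. AM4 Ryzen, and serve as assumptions here.)

        # Typical consumer CPU lane budget (e.g. AM4 Ryzen: 24 usable lanes).
        lanes = {
            "GPU slot": 16,
            "CPU-attached NVMe": 4,
            "chipset downlink": 4,
        }
        print(f"total: {sum(lanes.values())} usable lanes")  # total: 24 usable lanes
        # Add 8 more general-purpose lanes and two extra NVMe drives would fit
        # without touching the GPU's x16 link.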

    PCIe 4.0, like a lot of other things that get done in consumer electronics (Android phones with 12GB of RAM... what is even the point of that?), is just for bragging rights and to get people to buy things they don't really need.

    All the customer sees is "PCIe 4.0! Faster! Look at these read/write numbers!" and thinks "YEAAAAHHHHH I need that" without realizing that, because they aren't moving massive single files, they will literally never see a practical difference between a PCIe 4.0 SSD and a PCIe 3.0 SSD - and really wouldn't even notice the difference coming from a SATA III SSD. They pay more for something they will literally never see a real benefit from.
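
    (A toy model of why, with every number assumed rather than measured: a real game or app load is mostly CPU-bound work such as decompression and asset setup, with only a slice of sequential I/O.)

        # Assumed: ~12 s of CPU-bound work per load, ~2 GB streamed sequentially.
        CPU_SECONDS = 12.0
        SEQUENTIAL_GB = 2.0

        drives = {  # rough sequential read speeds in GB/s
            "SATA III SSD": 0.55,
            "PCIe 3.0 NVMe": 3.5,
            "PCIe 4.0 NVMe": 7.0,
        }

        for name, gbps in drives.items():
            total = CPU_SECONDS + SEQUENTIAL_GB / gbps
            print(f"{name}: ~{total:.1f} s load time")
        # SATA III SSD: ~15.6 s, PCIe 3.0 NVMe: ~12.6 s, PCIe 4.0 NVMe: ~12.3 s

    An interface roughly 13x faster than SATA buys about three seconds per load, and the PCIe 4.0 drive saves a fraction of a second over 3.0, which is exactly why nobody notices.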

    The only reason I even advocate for most people asking about builds here to get an M.2 drive (NVMe or SATA) is because they are only marginally more expensive and there are no wires. The speed difference is never really even part of the equation, unless they are doing something semi-pro or fully pro that requires fast storage. If M.2 drives were still that much more expensive than 2.5" SATA III drives, like they used to be, I'd never recommend one for 90% of the builds here.

    Remember that when you're answering someone's request for build help, you need to focus on what they actually need for the use cases they told you about, not what you think "might be better".

    Also remember that 99% of people don't constantly upgrade their computers. They put it together (or buy it) and rock it for 4+ years unchanged.

    That's why I always laugh a little when people make outlandish recommendations because "well, you'll upgrade it!" even though that's not likely to happen.

  12. #32
    gaaara
    Quote Originally Posted by Thunderball View Post
    The 2070S is the most-sold Turing GPU after the 1650S right now, and it's still being produced (which probably means no 3070 for another 2 months or so), so no, it's not going down as of yet.



    We kinda do. 3090 - between $1300 and $1500; 3080 - around $900; 3070 - around $650-700. Also, the 3090 is the 3080 Ti. AMD won't have an answer for the 3090 or the 3080.
    Well, turns out not only is the 3070 = 2080 Ti, but the 3080 is almost 2x, and they're cheaper! 100% worth waiting a bit longer.

    - - - Updated - - -

    Quote Originally Posted by Kagthul View Post
    Can we stop spreading this nonsense?

    They don't go down. They go out of stock, usually before the next gen launches.
    - - - Updated - - -

    They will have to go down now. No one in their right mind will buy anything at current prices.

  13. #33
    Kagthul
    Quote Originally Posted by gaaara View Post
    Well, turns out not only is the 3070 = 2080 Ti, but the 3080 is almost 2x, and they're cheaper! 100% worth waiting a bit longer.

    - - - Updated - - -
    They will have to go down now. No one in their right mind will buy anything at current prices.
    They're already out of stock in most places. My local Microcenter has 1 - uno - 2070S. No 2080s, SUPER or otherwise, and a single open-box 2080 Ti.

    Since 3060s likely won't hit until next year (this launch seems to be mirroring the Turing launch), the 2060 and below will remain priced as they are, because there's nothing in that segment for 5+ months. If you need a card below $500 right now, you're stuck with the 2060S and lower, or the 5700 XT and lower. Production has already stopped on the 2060S and 2060, AFAIK. By the time the 3060 launches, stock will have sold through on the 2060/S, and I'm sure production will cease on the 16XX cards soon.

    I'm really not making this stuff up. The "prices will drop when the new cards come" thing has never really been true. Not in the last 8-10 generations at least, with few exceptions (AMD ended up with a serious glut of 570s and 580s); generally the manufacturers cease production with enough time to make stock scarce or completely gone by the time the replacements come out. You can literally look the cards up at Passmark's GPU benchmark site - it has complete price histories.

    “Prices on the old ones go down” is not really a thing that happens in GPUs. Never has been. On the USED market, definitely. But new? Nope.

  14. #34
    Quote Originally Posted by Kagthul View Post
    They're already out of stock in most places. My local Microcenter has 1 - uno - 2070S. No 2080s, SUPER or otherwise, and a single open-box 2080 Ti.

    Since 3060s likely won't hit until next year (this launch seems to be mirroring the Turing launch), the 2060 and below will remain priced as they are, because there's nothing in that segment for 5+ months. If you need a card below $500 right now, you're stuck with the 2060S and lower, or the 5700 XT and lower. Production has already stopped on the 2060S and 2060, AFAIK. By the time the 3060 launches, stock will have sold through on the 2060/S, and I'm sure production will cease on the 16XX cards soon.

    I'm really not making this stuff up. The "prices will drop when the new cards come" thing has never really been true. Not in the last 8-10 generations at least, with few exceptions (AMD ended up with a serious glut of 570s and 580s); generally the manufacturers cease production with enough time to make stock scarce or completely gone by the time the replacements come out. You can literally look the cards up at Passmark's GPU benchmark site - it has complete price histories.

    “Prices on the old ones go down” is not really a thing that happens in GPUs. Never has been. On the USED market, definitely. But new? Nope.
    In AU, prices have gone down dramatically: 2080 Tis have dropped by 50%, and across the board the 2080/2070/etc. have all dropped in price by roughly 40-50%. This stock is basically dead, and they are clearing it out before the RTX 30 series hits the market. Anyone who bought an RTX 20 card in the last month or so really got screwed.
