  1. #361
    Quote Originally Posted by Thunderball View Post
    We know nothing about the laptop 30 series powerdraw so why speculate? Also, AMD doesnt even have high end mobile GPUs.
    Most of this thread is speculation and rumors so far.. But hey, at least the review embargo should lift today on the 3080, so that's something. Just a shame it's one day before launch; that doesn't give people much time to gather their thoughts.

    And it's not too far-fetched an idea, as Nvidia didn't really improve their perf per watt. Given we haven't seen a lower-clocked Ampere, for all we know they could've pulled an AMD and just clocked the reference cards too high. Rumors, on the other hand, are putting some variant of Big Navi at 3080 +/- 10% while consuming 250-275 watts. And the consoles have really low wattage numbers for their performance.

    The same speculation is also in AdoredTV's latest vid. He basically threw out the same idea as an afterthought of the vid.

  3. #363
    I actually have a GTX 680, so I don't feel a massive need, or really any need, to upgrade.

  4. #364
    Temp name
    Quote Originally Posted by Bennett View Post
    That is amazing value - I'd feel so bad if I'd bought a 2080ti
    I only really feel bad for people who bought one within the last couple of months..
    I got my 2080 Ti pretty much at launch, and it's served me well for almost two years.

  5. #365
    Quote Originally Posted by Bennett View Post
    That is amazing value - I'd feel so bad if I'd bought a 2080ti
    It's more that Turing wasn't amazing value. We've got to remember the 3080 is a GA102 die, the same designation as the 2080 Ti, so that's the fair comparison. It's more of a return to normal pricing (near 1080 Ti pricing).

  6. #366
    Looking like Nvidia's claims of nearly double the 2080 performance are very false.

  7. #367
    Quote Originally Posted by kaelleria View Post
    Looking like Nvidia's claims of nearly double the 2080 performance are very false.
    Was anyone thinking they wouldn't use the absolute best case possible?

  8. #368
    Quote Originally Posted by mrgreenthump View Post
    Was anyone thinking they wouldn't use the absolute best case possible?
    Very true.

    Some power numbers...
    3080 FE OC ~ 370 watts
    3080 Stock ~ 330 watts

    That's bananas.

  9. #369
    Quote Originally Posted by kaelleria View Post
    Very true.

    Some power numbers...
    3080 FE OC ~ 370 watts
    3080 Stock ~ 330 watts

    That's bananas.
    If you're used to running an OC'd 1080 Ti / 2080 Ti, the stock 3080 numbers won't really change much, as those cards run at 300-330W OC'd. And looking at the OC results, it probably won't be worth OCing the 3080 that much. Will be interesting to see what the AiB boards can do, though.

  10. #370
    I'm operating under the assumption that AIB cards like the 3080 Strix OC will use significantly more power than the FE cards as they have in the past. Are we going to see 400 watts?

  11. #371
    Quote Originally Posted by Vegas82 View Post
    The FE is power restricted when it comes to OC, the numbers aren’t very telling right now.
    Yeah, exactly. We'll probably see 400W+ AiB cards. Will be great for those cold winter days we've got incoming!

  12. #372
    Quote Originally Posted by kaelleria View Post
    That's bananas.
    Considering it's a node shrink.. Sort of. But it's still an efficiency gain over Turing. Albeit a very slight one.

  13. #373
    Quote Originally Posted by mrgreenthump View Post
    Considering it's a node shrink.. Sort of. But it's still an efficiency gain over Turing. Albeit a very slight one.
    Gotta love that Samsung 8nm

  14. #374
    Quote Originally Posted by Bennett View Post

    So is the 1000w psu thing still true?
    It was only ever a thing with the 3090, so we don't know.

  15. #375
    Temp name
    Quote Originally Posted by kaelleria View Post
    Looking like Nvidia's claims of nearly double the 2080 performance are very false.
    They said double performance in ray tracing, or 1.9x performance per watt. Both are true, but also not the whole picture.


    Quote Originally Posted by Bennett View Post
    So is the 1000w psu thing still true?
    Nah, 750w should be more than enough.
    Maybe 850 if you're planning on a 3090.

    I mean, the 3080 is 330-370W, and CPUs are generally <200W. So ~550W total; add 20W more for RAM, RGB stuff, and you're still below 600W for the whole system. That said, if you want to hit the efficiency sweet spot (~50% load), you'd be looking at an ~1100W unit.. but that's just a waste of money, really.
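    That back-of-envelope math can be sketched as a quick script. The wattages and the 25% headroom factor are rough assumptions from this thread, not official sizing guidance:

```python
# Back-of-envelope PSU sizing for a 3080 build. All wattages are rough
# estimates from the thread; the 25% headroom factor is an assumption.

def recommended_psu(gpu_w, cpu_w, misc_w=20, headroom=1.25):
    """Sum the component draws, add headroom, round up to a common PSU size."""
    total = gpu_w + cpu_w + misc_w
    for size in (550, 650, 750, 850, 1000, 1100, 1200):
        if size >= total * headroom:
            return total, size
    return total, 1200

total, psu = recommended_psu(gpu_w=370, cpu_w=200)  # worst-case OC'd FE 3080
print(total, psu)  # 590 750
```

    Which lands right on the 750W recommendation for a 3080.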

  16. #376
    Quote Originally Posted by mrgreenthump View Post
    Considering it's a node shrink.. Sort of. But it's still an efficiency gain over Turing. Albeit a very slight one.
    So where did the 1.9x efficiency come from?

  17. #377
    Temp name
    Quote Originally Posted by Yizu View Post
    So where did the 1.9x efficiency come from?
    At ~240W, a 3080 delivers 1.9x more performance per watt than a 2080.

    But it doesn't run at 240W, so.. <.<
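    The gap between the marketing figure and stock behavior comes down to where on the power curve you measure. A quick sketch with assumed round numbers (the 2080 is normalized to 1.0 performance at ~225W; the capped and stock 3080 figures are illustrative guesses, not benchmarks):

```python
# Illustrates how a perf/W claim depends on the operating point.
# All numbers are assumed: 2080 normalized to 1.0 performance at ~225W;
# the 3080 figures are rough illustrative guesses, not measurements.

def perf_per_watt(perf, watts):
    return perf / watts

p2080 = perf_per_watt(1.00, 225)         # 2080 baseline
p3080_capped = perf_per_watt(1.00, 120)  # assumed: matches a 2080 at ~120W
p3080_stock = perf_per_watt(1.65, 330)   # assumed: ~1.65x a 2080 at ~330W

print(p3080_capped / p2080)  # ~1.9x: the headline perf/W figure
print(p3080_stock / p2080)   # ~1.1x: only a slight gain at stock
```

    Same silicon, very different perf/W, purely depending on the operating point you quote.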

  18. #378
    Shakadam
    Quote Originally Posted by Temp name View Post
    They said double performance in ray tracing, or 1.9x performance per watt. Both are true, but also not the whole picture.



    Nah, 750w should be more than enough.
    Maybe 850 if you're planning on a 3090.

    I mean, the 3080 is 330-370w, and CPUs are generally <200. So ~550w total, add in 20w more for RAM, RGB stuffs, and you're still below 600w total system. That said, if you want to hit the efficiency sweetspot, you're probably looking at an 1100w.. But that's just a waste of money, really.
    You've got to consider transient power peaks as well. Pretty much all new GPUs cause these occasionally as a result of their boosting mechanisms; it's not uncommon to see ~100-200W peaks above the rated power draw. This can trip the overcurrent protection on some PSUs, or simply cause voltage dips and crashes on others. It was already an issue with last-gen GPUs and some PSUs (famously the Seasonic Focus GX lineup, before they made a new version with more relaxed OCP limits).

    If you have a very high-quality 650W PSU you're probably fine, but I'd still recommend a high-quality 750W.
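    The OCP argument can be sketched numerically. The 200W spike margin and the trip thresholds below are assumptions for illustration; real OCP limits vary by PSU model:

```python
# Why transient spikes can trip OCP on a "big enough" PSU. The 200W spike
# margin and the OCP trip thresholds are illustrative assumptions.

def trips_ocp(gpu_rated_w, spike_w, rest_of_system_w, ocp_trip_w):
    """True if a worst-case transient peak could exceed the PSU's trip point."""
    peak = gpu_rated_w + spike_w + rest_of_system_w
    return peak > ocp_trip_w

# 330W GPU spiking +200W, plus ~220W for the rest of the system -> 750W peak.
print(trips_ocp(330, 200, 220, 715))  # True: 650W unit with tight OCP (~110%)
print(trips_ocp(330, 200, 220, 825))  # False: 750W unit with the same margin
```

    So the average draw fits a 650W unit fine; it's the peak that gets you.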

  19. #379
    https://www.hardwareluxx.de/index.ph....html?start=26

    Some interesting numbers when lowering the power limit: roughly 10% lower performance when running at 250W.
    Guess it will be better to just undervolt the cards; that usually leads to less power usage while keeping the performance. In some cases you even get better performance as temps go down.
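    Using those rough numbers (stock ~330W, ~10% performance lost at a 250W limit), the efficiency gain from power-limiting works out like this:

```python
# Perf/W comparison of stock vs power-limited, using the thread's rough
# numbers: ~330W measured at stock, ~10% performance lost at a 250W limit.

stock_eff = 1.00 / 330    # stock performance normalized to 1.0
limited_eff = 0.90 / 250  # ~90% of stock performance at 250W

print(round(limited_eff / stock_eff, 2))  # 1.19 -> ~19% better perf/W
```

    A good undervolt should do even better, since it cuts power without giving up the clocks.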

  20. #380
    Quote Originally Posted by Bennett View Post
    I'm on 1440p, so my 2070S is more than fine for the next few years

    If I were to get a 3080 I'd need a new PSU and likely a new mobo; at that point I'm spending a lot more than I feel it's worth

    Just gonna wait for the 40xx series and go for a whole new rig

    - - - Updated - - -



    That is amazing value - I'd feel so bad if I'd bought a 2080ti
    Where's that guy that was saying you'd need a brand new CPU and mobo with PCIE4 to keep from bottlenecking this card? LOL, maybe at 480p or 1080p 240+Hz...
