Thread: GTX 1080

  1. #901
    Quote Originally Posted by Remilia View Post
    Just noticed: why are GP104 and P100 so different per SM? It's odd for Nvidia to make two different SM layouts. In GP104's case it might as well be Maxwell 3.0, but P100 is completely different, which also means that resource sharing for P100 and GP104 is going to be different. Whether we can even call them the same architecture is debatable.
    I'd suggest that GP104 and similar /are/ Maxwell's final form, so to speak. The P100 looks exactly like their older Volta designs/concepts: an incredibly powerful GPU with "stacked DRAM" (HBM/HBM2 now). Volta was initially planned to debut in 2017, but for reasons is now a 2018 piece. Pascal in the form of GP104 isn't really Pascal. The P100 is an entirely different beast, with a far different range of abilities from the GP104.

    What we have is:

    1050-1080 series = Maxwell 3.0
    1080Ti/Titan P(?) = Pascal (Originally Volta designation)
    Something else altogether in 2018 = New Volta

  2. #902
    DeltrusDisc
    So... 1080 Ti not until at least 2017 I gather?
    "A flower.
    Yes. Upon your return, I will gift you a beautiful flower."

    "Remember. Remember... that we once lived..."

    Quote Originally Posted by mmocd061d7bab8 View Post
    yeh but lava is just very hot water

  3. #903
    Quote Originally Posted by DeltrusDisc View Post
    So... 1080 Ti not until at least 2017 I gather?
    That's right, from all we've seen. They're probably waiting for a huge ramp in HBM2 production so they can get the price down to sane levels. I'd expect the Titan P (yeah, I'm calling it that now) in Q1, and the 1080 Ti might not even appear until Q2, which would match their standard launch procedure. If they have enough HBM2 available, they may push for a Q4 '16 launch for the Titan P, but the 1080 Ti will be on the back burner until Titan P sales have saturated.

  4. #904
    Remilia
    Quote Originally Posted by Shinzai View Post
    I'd suggest that GP104 and similar /are/ Maxwell's final form, so to speak. The P100 looks exactly like their older Volta designs/concepts: an incredibly powerful GPU with "stacked DRAM" (HBM/HBM2 now). Volta was initially planned to debut in 2017, but for reasons is now a 2018 piece. Pascal in the form of GP104 isn't really Pascal. The P100 is an entirely different beast, with a far different range of abilities from the GP104.

    What we have is:

    1050-1080 series = Maxwell 3.0
    1080Ti/Titan P(?) = Pascal (Originally Volta designation)
    Something else altogether in 2018 = New Volta
    Yeah, I was kind of expecting something more like P100 but, you know, smaller: fewer SMs, some FP64 cores axed, etc.

    Has nothing to do with it being good or bad, just you know, I kind of wanted to see something different.
    Last edited by Remilia; 2016-05-17 at 11:37 PM.

  5. #905
    Quote Originally Posted by Remilia View Post
    Yeah, I was kind of expecting something more like P100 but, you know, smaller: fewer SMs, some FP64 cores axed, etc.

    Has nothing to do with it being good or bad, just you know, I kind of wanted to see something different.
    No, I actually agree on that point. It may sound odd, but I expected more than what the 1080 offers. It's literally maxed-out Maxwell. The 60% performance increase over the previous gen (980) makes complete sense with the shrink, but it doesn't seem to offer much else. The problem gets more glaring the more I look at DX12 benchmarks: I'm not seeing anything particularly amazing or special. It's just faster.

    And yes, faster is a good thing, but that really is all it is. There are some neat additions to it, the VR stuff is cool and all, but I'm not seeing anything to do with the actual Pascal in these cards.

  6. #906
    Artorius
    Yeah, I found it pretty lame too.

    JHH said that they spent enough money to go to Mars developing Pascal, and what does the market receive? Maxwell 3.0...

    Well whatever. The new Titan and the 1080Ti will probably be based on the GP100.

  7. #907
    Vash The Stampede
    Quote Originally Posted by DeltrusDisc View Post
    So... 1080 Ti not until at least 2017 I gather?
    The 1080 Ti is not until AMD releases a faster card. Then Nvidia releases the 1080 Ti.

  8. #908
    Just dropping this here: an overclocked 1080 at a 2100 MHz core clock in some games. I'm curious to see the board partners' versions of this card.


  9. #909
    Evolixe
    Quote Originally Posted by DeltrusDisc View Post
    To verify the results? Are you serious? Why do you need more sources? This resolution isn't exactly used en masse; not many have these monitors.

    Here, I'll compromise.

    Look at 2560x1440 benches, then look at 4k benches, now lean towards the 2560x1440 scores... That's around where you'd be with this monitor.

    Let's be realistic though, Techgage aren't posting fake benchmarks. Jeez...
    Why do you care?

    The point is that I have seen some very different results elsewhere, so I'm still kinda on the edge of my seat.
    Like I said before, I'm seriously considering dropping nearly 2k in upgrades on a new monitor and a graphics card.

    That's not a small amount of money for anyone, and seeing before buying is pretty much not happening over here. You want something specific? You're gonna have to order online.

  10. #910
    DeltrusDisc
    Quote Originally Posted by Evolixe View Post
    Why do you care?

    The point is that I have seen some very different results elsewhere, so I'm still kinda on the edge of my seat.
    Like I said before, I'm seriously considering dropping nearly 2k in upgrades on a new monitor and a graphics card.

    That's not a small amount of money for anyone, and seeing before buying is pretty much not happening over here. You want something specific? You're gonna have to order online.
    Why not share those other results?

    - - - Updated - - -

    Still, 2560x1440, 3440x1440, and 3840x2160 are all just numbers. :P It's all math. There's no reason 3440x1440 should react differently. It's just a different aspect ratio: slightly more pixels than 2560x1440, a decent amount less than 3840x2160.

    Still, I'd take it over both of the other options. It looks wonderful.
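    To put rough numbers on that "it's all math" point (an illustrative sketch only; it assumes GPU load scales with raw pixel count, which ignores per-frame fixed costs):

```python
# Raw pixel counts for the three resolutions in question,
# compared against 2560x1440 as the baseline.
resolutions = {
    "2560x1440": 2560 * 1440,
    "3440x1440": 3440 * 1440,
    "3840x2160": 3840 * 2160,
}
base = resolutions["2560x1440"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:>9,} pixels ({pixels / base:.2f}x the 1440p load)")
```

    The ultrawide pushes only about 34% more pixels than 2560x1440, while 4K is a full 2.25x, which is why 3440x1440 results should land much closer to the 1440p benchmarks than the 4K ones.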

    - - - Updated - - -

    I guess I should reply to your initial question of why I care...

    Why do I care? Because it's an odd claim you're bringing to the computer forums, my resident kingdom. I care about what goes on here, possibly more than the admins themselves!

    I am the unofficial king and I have the peoples' betterment and wishes at heart. <3

    My ears and eyes are many, my tongues uncountable, my power unfathomable.

    I AM THE COMPUTER FORUMS.


  11. #911
    Remilia
    https://youtu.be/xtely2GDxhU?t=7739
    tl;dw:
    They consider the FE (Founders Edition) the middle of the stack in terms of quality and price point, which isn't that reassuring for decent cards at the 'MSRP' range.

  12. #912
    Quote Originally Posted by Shinzai View Post
    No, I actually agree on that point. It may sound odd, but I expected more than what the 1080 offers. It's literally maxed-out Maxwell. The 60% performance increase over the previous gen (980) makes complete sense with the shrink, but it doesn't seem to offer much else. The problem gets more glaring the more I look at DX12 benchmarks: I'm not seeing anything particularly amazing or special. It's just faster.

    And yes, faster is a good thing, but that really is all it is. There are some neat additions to it, the VR stuff is cool and all, but I'm not seeing anything to do with the actual Pascal in these cards.
    The next generation will probably do better and be more of what you expect.

    From what I read, the 1080 Nvidia is selling just misses the mark of running 4K at 60 fps. Not the end of the world, and they will probably get more performance out of it when Nvidia's partners start shipping their overclocked versions.

    The 9XX models look like the very top of the HD graphics era, and the 1XXX line looks like the entry to the 4K/VR era.

  13. #913
    Quote Originally Posted by Remilia View Post
    https://youtu.be/xtely2GDxhU?t=7739
    tl;dw:
    They consider the FE (Founders Edition) the middle of the stack in terms of quality and price point, which isn't that reassuring for decent cards at the 'MSRP' range.
    Seems like buying a used 980 Ti SLI setup from early adopters is actually a pretty decent alternative.

    Quote Originally Posted by DeltrusDisc View Post
    Die shrink or not, it's a nice improvement regardless, is it not?
    It's an above-average performance jump between generations (though not by much), but an unimpressive one for two process nodes and a supposedly new architecture (which it isn't).
    Let me oversimplify: if you set the 1080's core speed to 980 Ti levels, the two would probably behave almost identically (slight edge to the 980 Ti for its higher specs). The core speed is something I can overclock; I can't change the architecture of the chip. If they had changed the architecture while remaining at 980 Ti clocks and achieved the same performance as the 1080, then I would still be able to overclock to insane levels and get a real performance boost. That's a card I would gladly pay a premium for.

    But as it stands, there are almost no performance gains from the architecture; almost all the gains come straight from the higher clocks the node shrink allows. Seems like Nvidia took a page out of Intel's tick/tock model.
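    That napkin math can be made explicit. A sketch using the published shader counts and boost clocks, under the crude assumption that performance scales with cores x clock (it ignores memory bandwidth, ROPs, and any IPC changes):

```python
# Crude throughput proxy: shader cores x boost clock (MHz).
# Deliberately ignores memory bandwidth, ROPs, and IPC differences.
cards = {
    "GTX 980 Ti": {"cores": 2816, "boost_mhz": 1075},
    "GTX 1080":   {"cores": 2560, "boost_mhz": 1733},
}
for name, c in cards.items():
    proxy = c["cores"] * c["boost_mhz"] / 1e6
    print(f"{name}: {proxy:.2f} relative units")
```

    At equal clocks the 980 Ti's 2816 shaders would actually edge out GP104's 2560, so on this (over-)simplified model essentially all of the 1080's lead comes from clock speed.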

    Quote Originally Posted by ati87 View Post
    The next generation will probably do better and be more of what you expect.
    Strange, I remember people saying that about Maxwell 3.0... ahem, Pascal.
    Last edited by Faldric; 2016-05-18 at 07:20 AM.

  14. #914
    Deleted
    Quote Originally Posted by Remilia View Post
    Yeah, I was kind of expecting something more like P100 but you know, smaller, less SMs, some FP64 cores axed, etc etc.

    Has nothing to do with it being good or bad, just you know, I kind of wanted to see something different.
    Isn't it actually quite smart of Nvidia to just do a die shrink? I mean, Maxwell was intended for 16nm, but they wanted to bring something new out on 28nm, so they did. So they have experience with Maxwell, and now they only have to do a die shrink. Much like Intel's tick-tock scheme, which worked out well for Intel.

    Now they have a new design (GP100), which seems to have a low success rate (I have no idea what else to call it right now; I mean the share of functioning chips per wafer). It probably also only works with HBM(2), or they'd need to put a different memory controller on the different chips. So this would have been an insanely expensive chip to sell to consumers, or they would need to sell it at a loss. So they keep it for the deep-learning stuff at the moment to get some returns. When the process is improved they will probably release a 'normal' gaming card, which might either be just a 1080 Ti or a completely new series. I'm guessing a completely new series.

    In short, I guess they learned from their past mistakes. Nvidia usually had issues with the jump to a new production process, a lot more than AMD used to have anyway.

    AMD is doing something similar... They have plenty of experience with GCN, so I doubt AMD would have issues with Polaris as that is also more of an evolution of the design than something brand new.
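    For what it's worth, the usual back-of-the-envelope for why big dies yield badly is the Poisson model Y = exp(-A*D). The die areas below are the published ~314 mm^2 (GP104) and ~610 mm^2 (GP100); the defect density is a made-up illustrative number, since TSMC's real 16nm figures were never public:

```python
import math

def poisson_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
    """Poisson yield model: fraction of defect-free dies, Y = exp(-A * D)."""
    return math.exp(-(die_area_mm2 / 100.0) * defects_per_cm2)

# Assumed defect density of 0.2 defects/cm^2 -- illustrative only.
for name, area_mm2 in [("GP104, ~314 mm^2", 314.0), ("GP100, ~610 mm^2", 610.0)]:
    print(f"{name}: ~{poisson_yield(area_mm2, 0.2):.0%} defect-free dies")
```

    Under that assumption the big die loses nearly half its usable chips again relative to GP104, which fits the argument above: sell GP100 first in the market where the margins can absorb it.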

  15. #915
    Remilia
    Quote Originally Posted by Zeara View Post
    Isn't it actually quite smart of Nvidia to just do a die shrink? I mean, Maxwell was intended for 16nm, but they wanted to bring something new out on 28nm, so they did. So they have experience with Maxwell, and now they only have to do a die shrink. Much like Intel's tick-tock scheme, which worked out well for Intel.
    Well, the original roadmap was Volta = 16nm, Pascal = 20nm, and Kepler = 28nm. Maxwell appeared because 20nm planar's voltage leakage was so bad that the node wasn't suitable for GPUs; Maxwell was slapped in because of that issue.
    Now they have a new design (GP100), which seems to have a low success rate (I have no idea what else to call it right now; I mean the share of functioning chips per wafer).
    Yield.
    It probably also only works with HBM(2), or they'd need to put a different memory controller on the different chips. So this would have been an insanely expensive chip to sell to consumers, or they would need to sell it at a loss. So they keep it for the deep-learning stuff at the moment to get some returns. When the process is improved they will probably release a 'normal' gaming card, which might either be just a 1080 Ti or a completely new series. I'm guessing a completely new series.

    In short, I guess they learned from their past mistakes. Nvidia usually had issues with the jump to a new production process, a lot more than AMD used to have anyway.

    AMD is doing something similar... They have plenty of experience with GCN, so I doubt AMD would have issues with Polaris as that is also more of an evolution of the design than something brand new.
    Possibly; their node transitions have never been great, but it's still kind of boring. We have two distinctly different architectures under the same architecture name: one that's more like Maxwell and one that's, well, not. The only difference in the block diagram between Maxwell and GP104 is that Maxwell has 4 SMs per cluster/GPC and GP104 has 5. Granted, we don't know what's outside of this, but it's very... boring. Like I said before, I wanted to see what "Pascal" as in P100 can do, not Maxwell 3.0. This isn't about whether it's good or bad, because it's a proven architecture that works, but from the point of view of someone who likes tech, it's very boring.

    And yes, P100's memory controller is built for HBM, so it's only going to work with that. It also has NVLink, which will have to be axed for consumer use since it's of no use there and no x86 CPUs can use it anyway.
    Last edited by Remilia; 2016-05-18 at 07:49 AM.
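    The block-diagram point checks out against the published core counts (both chips use 128 cores per SM; this is just a sanity check, nothing deeper):

```python
# Published layout: total cores = GPCs x SMs-per-GPC x cores-per-SM.
def total_cores(gpcs: int, sms_per_gpc: int, cores_per_sm: int = 128) -> int:
    return gpcs * sms_per_gpc * cores_per_sm

print("GM204 (GTX 980): ", total_cores(4, 4))   # 4 SMs per GPC -> 2048 cores
print("GP104 (GTX 1080):", total_cores(4, 5))   # 5 SMs per GPC -> 2560 cores
```

    Same building blocks, one extra SM per cluster: 2048 vs 2560 cores, exactly the "Maxwell 3.0" picture.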

  16. #916
    So being the GPU noob that I am...

    Now that we have the specifications of the GTX 1070 and they are a little lackluster, is it still worth it?

    Or should I save some more money and go for the 1080? I think it would be overkill with my i7 2600 @ 3.4 GHz though.

    Truth be told, I'm only aiming for 1080p at 60 fps, but now that I'm upgrading (my GTX 550 Ti has served me "well" for 5 years) I want a card that can last a few years.

  17. #917
    Remilia
    Quote Originally Posted by ReVnX View Post
    So being the GPU noob that I am...

    Now that we have the specifications of the GTX 1070 and they are a little lackluster, is it still worth it?

    Or should I save some more money and go for the 1080? I think it would be overkill with my i7 2600 @ 3.4 GHz though.

    Truth be told, I'm only aiming for 1080p at 60 fps, but now that I'm upgrading (my GTX 550 Ti has served me "well" for 5 years) I want a card that can last a few years.
    We have no idea! No, seriously, we don't. 25% fewer cores/SMs doesn't necessarily translate to 25% less performance; the drop could be bigger or smaller. In the end, just wait for the reviews and see what it's like if you're really aiming for it. Also, the last time Nvidia posted specs for a cut-down chip (the 970), the specs were wrong. I hope people are more thorough this time instead of letting the wrong specs be advertised.
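    Paper specs only give you a ceiling anyway. A sketch using Nvidia's announced core counts and boost clocks, and the standard assumption of 2 FLOPs per core per clock (one fused multiply-add):

```python
# Theoretical FP32 throughput from announced specs: 2 FLOPs/core/clock (FMA).
def fp32_gflops(cores: int, boost_mhz: int) -> float:
    return 2 * cores * boost_mhz / 1000.0

gtx_1080 = fp32_gflops(2560, 1733)
gtx_1070 = fp32_gflops(1920, 1683)
print(f"1070 on paper: {gtx_1070:.0f} GFLOPS, ~{gtx_1070 / gtx_1080:.0%} of a 1080")
```

    About 73% of a 1080 on paper; real-game gaps are usually narrower than the raw FLOPS gap, which is exactly why the reviews matter more than the spec sheet.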

  18. #918
    Quote Originally Posted by ReVnX View Post
    So being the GPU noob that I am...

    Now that we have the specifications of the GTX 1070 and they are a little lackluster, is it still worth it?

    Or should I save some more money and go for the 1080? I think it would be overkill with my i7 2600 @ 3.4 GHz though.

    Truth be told, I'm only aiming for 1080p at 60 fps, but now that I'm upgrading (my GTX 550 Ti has served me "well" for 5 years) I want a card that can last a few years.
    Yes, the 1070 is very much worth it; specs don't matter when you have ~Titan X performance for $400-450.

    But it's frankly overkill for 1080p@60; for that a ~970 is enough. Less future-proof, but it also costs much less.

    And a 1080 is absolute overkill for 1080p@60, even with future-proofing in mind.

    Do NOT get a 1080 for a 1080p@60 monitor. It's wrong.

  19. #919
    Quote Originally Posted by Remilia View Post
    We have no idea! No, seriously, we don't. 25% fewer cores/SMs doesn't necessarily translate to 25% less performance; the drop could be bigger or smaller. In the end, just wait for the reviews and see what it's like if you're really aiming for it. Also, the last time Nvidia posted specs for a cut-down chip (the 970), the specs were wrong. I hope people are more thorough this time instead of letting the wrong specs be advertised.
    Let's see how the reviews go then.

    To be honest, the comments I've seen regarding the specs have been little short of apocalyptic, so I kinda freaked out.

    Quote Originally Posted by Life-Binder View Post
    Yes, the 1070 is very much worth it; specs don't matter when you have ~Titan X performance for $400-450.

    But it's frankly overkill for 1080p@60; for that a ~970 is enough. Less future-proof, but it also costs much less.

    And a 1080 is absolute overkill for 1080p@60, even with future-proofing in mind.

    Do NOT get a 1080 for a 1080p@60 monitor. It's wrong.
    The thing is I want something I can build upon, adding the rest little by little.

    I'm definitely not a big spender; I can't just throw $2,000 around out of nowhere.
    Last edited by ReVnX; 2016-05-18 at 08:05 AM.

  20. #920
    Deleted
    Quote Originally Posted by Remilia View Post
    Yield.
    Thanks, I also just remembered it...

    Possibly; their node transitions have never been great, but it's still kind of boring. We have two distinctly different architectures under the same architecture name: one that's more like Maxwell and one that's, well, not. The only difference in the block diagram between Maxwell and GP104 is that Maxwell has 4 SMs per cluster/GPC and GP104 has 5. Granted, we don't know what's outside of this, but it's very... boring. Like I said before, I wanted to see what "Pascal" as in P100 can do, not Maxwell 3.0. This isn't about whether it's good or bad, because it's a proven architecture that works, but from the point of view of someone who likes tech, it's very boring.

    And yes, P100's memory controller is built for HBM, so it's only going to work with that. It also has NVLink, which will have to be axed for consumer use since it's of no use there and no x86 CPUs can use it anyway.
    I agree, it is quite boring from a tech standpoint. I also expected something new.
