  1. #481
    Y'know, it's a lot harder to sympathize with HardOCP when the entire article seems to be nothing but slander. There's hardly a single sentence in that article that attempts to deliver any facts; it just wildly speculates that this and that means AMD is dead.

    I mean, c'mon, the location of the conference is now a reason to believe AMD is struggling? Really?

  2. #482
    I feel this guy's article is a "What If" story, but it's fun to speculate: "What if AMD failed?" Will the graphics side of the business go back to ATI? What's going to happen to Zen? Next time on "Flame Wars". lawls

  3. #483
    Old God Vash The Stampede
    Quote Originally Posted by Bigvizz View Post
    I feel this guy's article is a "What If" story, but it's fun to speculate: "What if AMD failed?" Will the graphics side of the business go back to ATI? What's going to happen to Zen? Next time on "Flame Wars". lawls
    HardOCP takes itself very seriously, but it's biased and corrupt. Years ago they wanted to stop people on their forums from talking about AdBlock, so people were banned. Lots of people believe Kyle Bennett has stock in Amazon and Microsoft because they post a lot of news that focuses on them. Plus, their version of PCPartPicker, called HoverHound, is limited to Amazon and TigerDirect. For a while they had issues with NewEgg, and any link to NewEgg on their forum was automatically redirected to Amazon.

    And who wrote the article? Kyle Bennett, of course. And yes, HardOCP has a hard-on for Nvidia, so it's no shock that AMD is neglecting them. So the "article" is just them crying. They did the same thing back when AMD was about to release the Fury.

    Quote Originally Posted by Drunkenvalley View Post
    Y'know, it's a lot harder to sympathize with HardOCP when the entire article seems to be nothing but slander. There's hardly a single sentence in that article that attempts to deliver any facts; it just wildly speculates that this and that means AMD is dead.

    I mean, c'mon, the location of the conference is now a reason to believe AMD is struggling? Really?
    The article is more like a forum post, with everything based on speculation. This is just HardOCP falling on the ground and crying like a child. They still think they're relevant, but they're not.


  4. #484
    The Unstoppable Force Gaidax
    Well, I know people love being in denial and so on, but the whole Radeon Group spinoff was done not to focus on GPUs, but to set up a potential sale in case things do turn out badly.

  5. #485
    The Lightbringer Artorius
    Quote Originally Posted by Gaidax View Post
    Well, I know people love being in denial and so on, but the whole Radeon Group spinoff was done not to focus on GPUs, but to set up a potential sale in case things do turn out badly.
    It would turn out nicer if they were bought outright by Samsung, a company that's in second place in the manufacturing-process race and can throw money at R&D.
    It's also the only company that could face Intel, but they'd probably just focus on ARM, where they're already leading.

  6. #486
    Fluffy Kitten Remilia
    From the giggles side, if Intel buys it they could just curb-stomp everyone. That said, realistically I wouldn't be surprised if Intel wants to buy RTG. Intel and AMD worked together (yeah I know, bizarre) to bring about external GPU docks through Thunderbolt 3, whereas Intel stopped renewing its contract with Nvidia and instead went to AMD for it. Intel needs a better GPU for their processors, and with RTG plus a better fab and more resources they can make it into a monster. This would also finally let them into the dGPU market, where their first attempt was a massive fail.
    Last edited by Remilia; 2016-05-27 at 11:04 PM.

  7. #487
    The Lightbringer Artorius
    Quote Originally Posted by Remilia View Post
    From the giggles side, if Intel buys it they could just curb-stomp everyone. That said, realistically I wouldn't be surprised if Intel wants to buy RTG. Intel and AMD worked together (yeah I know, bizarre) to bring about external GPU docks through Thunderbolt 3, whereas Intel stopped renewing its contract with Nvidia and instead went to AMD for it. Intel needs a better GPU for their processors, and with RTG plus a better fab and more resources they can make it into a monster.
    Apple is another candidate, judging by their exclusivity culture. What's the best way to bring your A-something chips to Macs without breaking x86 compatibility? Making them x86 yourself, lol.

    They have a nice relationship with Intel, though, but it isn't impossible.

  8. #488
    The Unstoppable Force Gaidax
    Without a doubt, the Radeon Group would do well outside AMD and in the hands of someone who gives a damn and has a buck or two in their account to throw at it.

    But I'm afraid Intel, Samsung, or Apple would all be bad choices; I don't think any of them is interested in full-blown desktop solutions, they all want mobile.

  9. #489
    Apple... perish the thought. Nobody wants an $800 x90-series or Fury X card. You think Nvidia is greedy? Oh boy, you've seen nothing yet.

  10. #490
    The Lightbringer Artorius
    Quote Originally Posted by Bigvizz View Post
    Apple... perish the thought. Nobody wants an $800 x90-series or Fury X card. You think Nvidia is greedy? Oh boy, you've seen nothing yet.
    What makes you believe that Apple would sell standalone video cards if they bought AMD?

  11. #491
    Quote Originally Posted by Artorius View Post
    What makes you believe that Apple would sell standalone video cards if they bought AMD?
    They might not, but if they did prices would be stupid.

  12. #492
    Old God Vash The Stampede
    Quote Originally Posted by Gaidax View Post
    Well, I know people love being in denial and so on, but the whole Radeon Group spinoff was done not to focus on GPUs, but to set up a potential sale in case things do turn out badly.
    I do believe that AMD is preparing for the worst, and rightfully so. If Polaris or Zen doesn't put a dent in the market, they'll be bought up in pieces. The key thing here is that both have to do well, not one or the other. But I'm not worried about that, because I'm certain AMD has done the right things so far to correct their mistakes. Zen doesn't have to beat or even match Skylake. So long as the CPU is roughly equal to Haswell, the chips will sell well. Considering all the great news I've heard about Zen, I wouldn't worry about it. If CERN is looking to use Zen for their servers, it must be good.

    Polaris is tricky. People will criticize AMD for not aiming for the high-end market, but it really makes sense not to. For years, AMD has been releasing high-end GPUs and shifting older GPUs down the stack. This hasn't worked, since the older high-end GPUs tend to run hot, consume more power, and generally cost more to make. Nearly every high-end GPU has been beaten by Nvidia, who already focuses on the server market with their GPUs. As much as we like to point out how badly AMD has failed by not having a competitor to the 1080, they might actually have dodged a bullet. The market for the 1080 is nowhere near as big as the mainstream market. So by focusing on the mainstream, AMD hopes to recreate the dominance that the 970 has. So what if Nvidia is dominant in the $600 card market, when the $200-$350 range is where all the action is?

    But unlike with Zen, where Intel is pretty firm on selling overpriced CPUs, Nvidia isn't going to sit still and offer nothing. They will likely have 1060s and 1050s, which would likely outperform AMD's Polaris. Timing is everything in the GPU market. For now, it seems AMD has caught Nvidia with their pants down. Pascal isn't impressive at all, and while it is the fastest GPU, it's also the most expensive. Even the 1070 is going to be $60 more than the 970 was at launch. Considering how Maxwell-like the 1070/1080 are, I really don't think Nvidia has anything ready for the 1060s and 1050s. My guess is Nvidia has put their engineers on Tegra products instead.

    I believe the only way Zen could fail is if Intel drops the prices of Haswell-E and Broadwell-E. This is Intel, so that's not likely. For Polaris to fail, it would have to be priced like a 1070 but perform like a 390. And that is a real possibility. On top of that, Nvidia could just release 980s and 980 Tis as replacements for the 960 and 950. All they would have to do is rebrand them. And it would certainly give AMD's Polaris trouble until Nvidia has real replacements like the 1060 and 1050. This is why Polaris is more likely to fail than Zen.

    Quote Originally Posted by Artorius View Post
    What makes you believe that Apple would sell standalone video cards if they bought AMD?
    Exactly. Apple is a technology hoarder. They bought PowerVR. Does anyone really think you'll see new non-Apple SoCs with PowerVR in them? Mali graphics is all you'll ever see, besides Qualcomm's Adreno.
    Last edited by Vash The Stampede; 2016-05-28 at 03:09 AM.

  13. #493
    Quote Originally Posted by Dukenukemx View Post
    Polaris is tricky. People will criticize AMD for not aiming for the high-end market, but it really makes sense not to. For years, AMD has been releasing high-end GPUs and shifting older GPUs down the stack. This hasn't worked, since the older high-end GPUs tend to run hot, consume more power, and generally cost more to make. Nearly every high-end GPU has been beaten by Nvidia, who already focuses on the server market with their GPUs. As much as we like to point out how badly AMD has failed by not having a competitor to the 1080, they might actually have dodged a bullet. The market for the 1080 is nowhere near as big as the mainstream market. So by focusing on the mainstream, AMD hopes to recreate the dominance that the 970 has. So what if Nvidia is dominant in the $600 card market, when the $200-$350 range is where all the action is?
    Enthusiast forums like the aforementioned site will distort your view of reality if you listen to them too much. These are the types that call the GTX 1080 "midrange" and talk as if a $700 video card is affordable and will sell in huge numbers because everyone games at 4K or in VR.

    All I want is something that will run 1920x1080 at high-to-max settings while using less power and running at lower temperatures than the 750 Ti I have now. Passive cooling would be a plus, but I'm not holding out hope for that. I suspect the number of people looking to buy something like that is much greater than the market for the GTX 1080.

  14. #494
    GP104 is a mid-range chip, though; Nvidia massively overpricing it does not change that fact.

    There are four tiers of GPUs generally recognized by the industry and by people who know hardware. The tiers are not based solely on performance, but on the die size of the chip as well (rough brackets below, with a quick sketch after the list).

    Since this is a completely new generation on a completely new process node, comparisons to the 980 Ti are honestly moot at this point. Just because the 1080 is faster doesn't make it the successor to the 980 Ti; it's the successor to the 980 / 770 / 680. After driver updates the 7870 outperformed the 580, and it outperformed the 6970 at launch; that didn't make it a high-end card, and by the same logic the 1080 isn't a high-end card either.

    Entry level / low-end: ~100mm^2 (GP107 and Polaris 11 should be R 460)
    Mainstream: ~200mm^2 (GP106 and Polaris 10 should be R 470)
    Mid-range / Performance: ~300-400mm^2 (GP104 and Vega 11 should be R 480)
    High-end / Enthusiast: >400mm^2 (GP102 or GP100 and Vega 10 should be R 490)
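    Purely as an illustration, here's a throwaway Python sketch of that bracketing, using the rough figures above as assumed cutoffs (my own guesses, not any official classification):

        # Rough GPU tier classifier based on approximate die area in mm^2.
        # The brackets mirror the list above; the exact cutoffs are guesses.
        def gpu_tier(die_area_mm2: float) -> str:
            if die_area_mm2 < 150:
                return "Entry level / low-end"    # ~100mm^2: GP107, Polaris 11
            if die_area_mm2 < 280:
                return "Mainstream"               # ~200mm^2: GP106, Polaris 10
            if die_area_mm2 <= 400:
                return "Mid-range / Performance"  # ~300-400mm^2: GP104, Vega 11
            return "High-end / Enthusiast"        # >400mm^2: GP102/GP100, Vega 10

        print(gpu_tier(314))  # GP104 (GTX 1080) -> Mid-range / Performance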

    But you are right. Only ~5% of the PC gaming market is ever willing to buy a mid-range or high-end card, which generally costs $350+. 95% of the gaming market only ever spends about $150 to $300 tops on GPUs.

    On 4K: tech sites always tend to believe that 4K will become the gaming industry standard tomorrow, but the fact of the matter is that the vast majority of gamers still use 1080p60 monitors. And VR... I honestly believe it's a useless gimmick that will go the way of 3D monitors. It will be hyped up like no tomorrow as the best thing since sliced bread, and then it will die and only a tiny minority will ever use or talk about it again, just like 3D.
    Last edited by Phaere; 2016-05-28 at 09:37 AM.

  15. #495
    Deleted
    Quote Originally Posted by Phaere View Post
    snip
    Just because product X is in the middle of a company's pack performance-wise, and they want it to be mainstream, doesn't make it so if it's not priced accordingly. Mainstream is something that reaches, or is affordable by, the masses. If most people in the market for a GPU are not willing to spend as much on a product as the producer wants for it, it just isn't a mainstream product. Simple logic.

    You can call/organize performance tiers however you like; the name doesn't change what they are.
    Last edited by mmoc9ef35a8a9e; 2016-05-28 at 10:25 AM.

  16. #496
    The Lightbringer Artorius
    Give up, @Phaere. A lot of knowledgeable computer forum posters have already tried to explain this countless times, but the concept of a GPU is apparently too hard for people to understand.

    You're 100% correct. Nvidia is selling a mid-range GPU at an incredibly ridiculous price, but people think it's fine because it's better than the 980 Ti, which was their previous flagship.

    As if the better performance wasn't obviously going to come from the 28nm planar > 16nm FinFET die shrink, which brings higher achievable clocks at okay-ish power draw with no thermal throttling (well, the 1080 does throttle, but let's ignore that for this post; it has more to do with Nvidia's crap cooler).

    I mean, GP104 is basically Maxwell. If you could take a 980 Ti and clock it at the same clocks as the 1080, they'd perform the same. Actually, it would even perform a little better, because the only difference between the two decreases the IPC of the 1080...

    It's not the first time they've done this, though. They invented a new price bracket with the original Titan and started selling their normal flagships with mid-range GPUs, while the Titan or xx80 Ti gets the same big chip, cut down for the Ti.

    Which apparently might not even be the case this time. If GP102 is real and there's no Pascal consumer card, this launch will be the most pathetic ever from Nvidia. Extremely overpriced products that perform badly in current APIs and are already technologically outdated by default. A 1080 is supposed to be 30~40% better than a Fury X, but somehow they're tied in Ashes of the Singularity at 4K Extreme because Nvidia's DX12 performance didn't improve at all. This is pathetic.
    Last edited by Artorius; 2016-05-28 at 12:44 PM.

  17. #497
    Deleted
    Quote Originally Posted by Phaere View Post
    snip
    Out of curiosity, why do you equate mid-range with performance, and not with mainstream? (And consequently high-end with performance, and very high with enthusiast.)

    You basically group things into 100mm2 brackets, but then have no name for the 200-300 bracket (or the 150-250mm2 bracket if you prefer). And like you said, the enthusiast tier is a very small segment of the market, so wouldn't it make sense to give that a special name, like very high, rather than leaving mainstream with no name?

    Or alternatively, why isn't mainstream called low end, and entry very low end, or something?
    Last edited by mmoc982b0e8df8; 2016-05-28 at 01:01 PM.

  18. #498
    The Lightbringer Artorius
    Quote Originally Posted by Him of Many Faces View Post
    Out of curiosity, why do you equate mid-range with performance, and not with mainstream? (And consequently high-end with performance, and very high with enthusiast.)

    You basically group things into 100mm2 brackets, but then have no name for the 200-300 bracket (or the 150-250mm2 bracket if you prefer). And like you said, the enthusiast tier is a very small segment of the market, so wouldn't it make sense to give that a special name, like very high, rather than leaving mainstream with no name?

    Or alternatively, why isn't mainstream called low end, and entry very low end, or something?
    GPU =/= Video Card.

    They're selling a "high-end" video card with a mid-range GPU; it's not exactly hard to understand. If they had simply followed their normal evolution, this 1080 would have ended up as the 1060, and GP100/GP102 would have been the 1080/1070. But with the 600 series they decided to put their mid-range chip in their high-end line and create another bracket to sell the actual high-end chip, which became the original Titan.

    Realistically speaking, they just increased the prices of their products, since the chips are now named with higher numbers and those higher numbers sit in higher price brackets.

    But well, whatever.

  19. #499
    Quote Originally Posted by Him of Many Faces View Post
    Out of curiosity, why do you equate mid-range with performance, and not with mainstream? (And consequently high-end with performance, and very high with enthusiast.)

    You basically group things into 100mm2 brackets, but then have no name for the 200-300 bracket (or the 150-250mm2 bracket if you prefer). And like you said, the enthusiast tier is a very small segment of the market, so wouldn't it make sense to give that a special name, like very high, rather than leaving mainstream with no name?

    Or alternatively, why isn't mainstream called low end, and entry very low end, or something?
    Performance and Enthusiast are simply AMD's names for the mid-range and high-end markets, respectively.

    http://cdn.wccftech.com/wp-content/u...2014-20151.jpg

    Everything on the list basically got dropped down a tier during the 300 series with the release of the Fury cards. The 390 cards became the mid-range cards, the 380 became the mainstream cards, and Fury took its place as the high end. AMD was forced to do this, though, because of the release of Maxwell, which at the time spanked AMD's cards. It also helped Nvidia that a lot of the hyped games released before and shortly after the Maxwell launch were GameWorks games, and Nvidia has very quick driver support.

    The die sizes are more of an "around this area" kind of figure. In practice the chips usually end up a bit bigger (see the quick comparison after the numbers below):
    GM107 was 148mm^2 (GK107: 118mm^2)
    GM206 was 227mm^2 (GK106: 228mm^2)
    GM204 was 398mm^2 (GK104: 294mm^2)
    GM200 was 601mm^2 (GK110: 561mm^2)

    AMD 7770 was 123mm^2
    R 360 is 160mm^2
    R 370 is 212mm^2
    R 380 is 359mm^2
    R 390 is 438mm^2
    Fury is 596mm^2

    For the new cards:
    GP104 (the GTX 1080) is 314mm^2
    Polaris 11 should be about 110mm^2 or so
    Polaris 10 according to rumors should be about 232mm^2

    To put things in perspective for everyone: the predecessor to the GP104 in the GTX 1080 was the GF114 in the GTX 560 Ti, which cost $249 at launch and was 360mm^2. Artorius above me is spot on.
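    And just as a quick check on the "chips usually end up a bit bigger" point, here's a throwaway Python sketch over the Kepler-to-Maxwell sizes quoted above (only the figures from this post, nothing official):

        # Die growth from Kepler to Maxwell at the same position in the stack,
        # using the sizes listed above (mm^2).
        pairs = {
            "x07 (GK107 -> GM107)": (118, 148),
            "x06 (GK106 -> GM206)": (228, 227),
            "x04 (GK104 -> GM204)": (294, 398),
            "big chip (GK110 -> GM200)": (561, 601),
        }
        for position, (kepler, maxwell) in pairs.items():
            growth = (maxwell - kepler) / kepler * 100
            print(f"{position}: {kepler} -> {maxwell} mm^2 ({growth:+.0f}%)")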
    Last edited by Phaere; 2016-05-28 at 04:52 PM.

  20. #500
    Old God Vash The Stampede
    Quote Originally Posted by Cows For Life View Post
    Enthusiast forums like the aforementioned site will distort your view of reality if you listen to them too much. These are the types that call the GTX 1080 "midrange" and talk as if a $700 video card is affordable and will sell in huge numbers because everyone games at 4K or in VR.

    All I want is something that will run 1920x1080 at high-to-max settings while using less power and running at lower temperatures than the 750 Ti I have now. Passive cooling would be a plus, but I'm not holding out hope for that. I suspect the number of people looking to buy something like that is much greater than the market for the GTX 1080.
    I'm pragmatic, and while a GTX 1080 would be cool to have, it would also be overkill for what I need. I have a 1080p monitor, and that monitor is what I'll likely use for years to come. When and if it breaks, it'll most likely be replaced with another 1080p one. 4K is cool and all, but it has to be cost-effective. 4K is cheaper today, but not as cheap as 1080p. So for the moment, I don't care about a GPU like the 1080. Maybe when 4K monitors are as cheap as 1080p ones?

    Also, I've never bought a video card for more than $250. That's the most I've ever spent. The Radeon HD 7850 I currently use was bought used off eBay for $100 about a year ago, and it still works fine for my gaming. I don't need 200 fps in Doom 4 at 1080p. Since no game I play is suffering from my setup, I see no reason to upgrade. I'd be very tempted by the new Polaris stuff coming out, but something would have to blow my mind, like Half-Life 3 with ray tracing. VR is not something I'll buy for a while, because it's just ungodly expensive and there are no games.

    Quote Originally Posted by Him of Many Faces View Post
    Out of curiosity, why do you equate mid-range with performance, and not with mainstream? (And consequently high-end with performance, and very high with enthusiast.)

    You basically group things into 100mm2 brackets, but then have no name for the 200-300 bracket (or the 150-250mm2 bracket if you prefer). And like you said, the enthusiast tier is a very small segment of the market, so wouldn't it make sense to give that a special name, like very high, rather than leaving mainstream with no name?

    Or alternatively, why isn't mainstream called low end, and entry very low end, or something?
    AMD, Intel, and Nvidia need to know their audience, and they clearly don't. Intel's Skylake is a failure because it's not enough of an upgrade even for someone with a 2500K. The 1080 is fast, but an overclocked 980 Ti can match or beat it in performance. Overclock the Fury X, and it can give the 1080 a run for its money. If you want people to upgrade, it has to be justifiable.

    Too many people are playing with the numbers as well. Why use reference-clocked older cards? Why not overclock the older cards for a comparison? How much noise do the fans make when overclocked, compared to the 1080? People are starting to say it's a mid-range card priced like a high-end one because they're starting to see conditions cherry-picked to make the 1080 look more impressive.
