  1. #21
    Where is my chicken! moremana's Avatar
    15+ Year Old Account
    Join Date
    Dec 2008
    Location
    Florida
    Posts
    3,618
    @OP you got some facts wrong there sir and should probably correct them or your post is going to get slammed by everyone in the pc arena.

    OT, AMD did drop the ball on Vega. Really bad. But who knows, like the RX480, they may get better with bios fixes and drivers.

  2. #22
    Quote Originally Posted by moremana View Post
    @OP you got some facts wrong there sir and should probably correct them or your post is going to get slammed by everyone in the pc arena.

    OT, AMD did drop the ball on Vega. Really bad. But who knows, like the RX480, they may get better with bios fixes and drivers.
    AMD "may" have dropped the ball for gamers, but they still sell cards faster then they can make them atm with the mining craze, if this continues, they wont care that gamers don't buy as many of them as they will be sold out anyways.

    on the bright side, this should give them more means to devellop a succesor.

  3. #23
    Quote Originally Posted by Kagthul View Post
    I watched 6 different benchmarking videos today, from Linus to Paul to Bitwit and GamersNexus.

    The 56 performs better than the 1070, but does not "crush it", and even heavily overclocked (by Paul) doesn't come near the 1080.

    The 64, meanwhile, is about ~10% slower overall than the 1080, and beats it in very few games. For the same MSRP. And uses gobs more power.

    And the moment you OC the 1080, the 64 has no hope of catching up, and has no OC headroom to speak of.

    The 56 is a -viable- competitor to the 1070, particularly when OCed (which it seems to have headroom for that the 64 does not), but the 64 is a DOA product. It has no place in the market. It's beaten by its closest competitor, and is almost beaten by its little brother when OCed (the 56 OCed comes within like 5% of the 64).
    This is 100% accurate. So tired of biased reporting.

    - - - Updated - - -

    Quote Originally Posted by Denpepe View Post
    AMD "may" have dropped the ball for gamers, but they still sell cards faster then they can make them atm with the mining craze, if this continues, they wont care that gamers don't buy as many of them as they will be sold out anyways.

    on the bright side, this should give them more means to devellop a succesor.
    Then they should get out of the gaming business. Particularly their marketing department.

  4. #24
    The Lightbringer Shakadam's Avatar
    10+ Year Old Account
    Join Date
    Oct 2009
    Location
    Finland
    Posts
    3,300
    Quote Originally Posted by Kagthul View Post
    I watched 6 different benchmarking videos today, from Linus to Paul to Bitwit and GamersNexus.

    The 56 performs better than the 1070, but does not "crush it", and even heavily overclocked (by Paul) doesn't come near the 1080.
    Sweclockers (a reputable Swedish hardware site) got their OC'ed Vega 56 to trade blows with the stock Vega 64 and GTX 1080. Don't ask me how they did it, silicon lottery?

    I'd expect the Vega 56 to be a very good overclocker once we get AIB versions from Asus, MSI, etc. to get around the power throttling issues with the stock card.

  5. #25
    Quote Originally Posted by Shakadam View Post
    Sweclockers (a reputable Swedish hardware site) got their OC'ed Vega 56 to trade blows with the stock Vega 64 and GTX 1080. Don't ask me how they did it, silicon lottery?

    I'd expect the Vega 56 to be a very good overclocker once we get AIB versions from Asus, MSI, etc. to get around the power throttling issues with the stock card.
    Probably found a way to override the power limit. Most likely with hardware mods. AIB versions won't have a higher power limit, so nothing will change in that regard.
    R5 5600X | Thermalright Silver Arrow IB-E Extreme | MSI MAG B550 Tomahawk | 16GB Crucial Ballistix DDR4-3600/CL16 | MSI GTX 1070 Gaming X | Corsair RM650x | Cooler Master HAF X | Logitech G400s | DREVO Excalibur 84 | Kingston HyperX Cloud II | BenQ XL2411T + LG 24MK430H-B

  6. #26
    Deleted
    Of course AMD tries to make top cards that can compete with the 1080 Ti. They just don't have a clue. Their cards (both Vega 64 and RX 580) are maxed out. They can't be faster.

    After they saw they couldn't match it, they stated they didn't want to. Because they can't. I am pretty sure no one at AMD said: let's make slower cards.

  7. #27
    http://www.mmo-champion.com/threads/...cussion-thread

    Nvidia's next generation of Volta GPUs is expected to release in March 2018

    Today some rumours have emerged which state that Nvidia's gaming-oriented Volta series GPUs will release in March 2018
    https://www.overclock3d.net/news/gpu...n_march_2018/1

  8. #28
    Quote Originally Posted by Klatar View Post
    Of course AMD tries to make top cards that can compete with the 1080 Ti. They just don't have a clue. Their cards (both Vega 64 and RX 580) are maxed out. They can't be faster.
    Sometimes things just don't go their way; these things are in the pipeline long before their competitors are revealed, and both the 900 and 1000 series have been pretty awesome. AMD are having their Fermi moment.
    The whole problem with the world is that fools and fanatics are always so certain of themselves, but wiser people so full of doubts.

  9. #29
    Wasn't Fermi the top performer (or at least on par with AMD's flagship), but at the cost of a very high power draw?

  10. #30
    Warchief Zenny's Avatar
    10+ Year Old Account
    Join Date
    Oct 2011
    Location
    South Africa
    Posts
    2,171
    Nvidia's GTC is in March; that fits the rumour, Nvidia's own timetable, and GDDR6 availability.

  11. #31
    Quote Originally Posted by Life-Binder View Post
    Wasn't Fermi the top performer (or at least on par with AMD's flagship), but at the cost of a very high power draw?
    It depended on whether you could actually combat the 480's heat output. Considering that most manufacturers stuck to reference models back then and custom coolers were pretty rare and expensive (custom PCBs even more so), the 5870 had an advantage. But Fermi was specifically designed around DX11 and offered a huge advantage in tessellation performance (which Nvidia still holds to this day), which resulted in the GTX 480 getting better as time went by and new games were released.
    R5 5600X | Thermalright Silver Arrow IB-E Extreme | MSI MAG B550 Tomahawk | 16GB Crucial Ballistix DDR4-3600/CL16 | MSI GTX 1070 Gaming X | Corsair RM650x | Cooler Master HAF X | Logitech G400s | DREVO Excalibur 84 | Kingston HyperX Cloud II | BenQ XL2411T + LG 24MK430H-B

  12. #32
    Legendary!
    15+ Year Old Account
    Join Date
    Sep 2008
    Location
    Norway
    Posts
    6,380
    Fine by me. Just got a 1080ti and would love for it to last a couple of years.

  13. #33
    Deleted
    Quote Originally Posted by Afrospinach View Post
    Sometimes things just don't go their way; these things are in the pipeline long before their competitors are revealed, and both the 900 and 1000 series have been pretty awesome. AMD are having their Fermi moment.
    Well, you can call Nvidia's 900/1000 series great - or you can call AMD's recent architecture bad. So many claims were made about architecture improvements, yet Polaris was just a shrink, and Vega 64 so far just profits from higher clocks. Vega 64 at Fury X clocks is at the same performance level as a Fury X.

    AMD would definitely need a big jump in architecture. Considering the 1080 Ti has no HBM RAM, Nvidia is currently worlds ahead. AMD has much stronger RAM. In a perfect world, AMD would be 20-30% ahead of Nvidia, forcing them to use expensive HBM2 RAM instead of cheap GDDR5 RAM.

    Instead, Nvidia focuses on cheap production so they can maximize their profits. I don't think AMD wanted to sell their cards beneath current 1080 Ti prices; they just have to, because performance isn't quite on the expected level.

    And the Fury X was at least an opponent for the 980 Ti. Vega 64 isn't an opponent for the 1080 Ti. The same goes for the 290X/390X being opponents for the GTX 980, but the 580 is pretty far behind even a 1070, fighting with a GTX 1060...

    TL;DR: The gap between AMD and Nvidia has dramatically increased...
    Last edited by mmoc4ec7d51a68; 2017-08-17 at 08:44 AM.

  14. #34
    Quote Originally Posted by RuneDK View Post
    Please provide sources for these claims.
    Quote Originally Posted by Schattenlied View Post
    Have you read no reviews? Hell, the page YOU linked itself says as much; the 1080 Ti is the only thing they aren't competing with.
    Hahaha, owned. My thought is that some fanboy wanted to start an Internet "discussion" when they don't even understand what is going on.
    Quote Originally Posted by Nizah View Post
    why so mad bro

  15. #35
    From the Gamers Nexus video on the power consumption of Vega 56 at a competitive overclock: Vega consumes about $111 more in electricity per year than an overclocked 1070 when running at high load 24/7, at a cost of $0.12/kWh.

    Also note that Vega draws more power from the wall and Pascal pulls more power over the PCIe bus.
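
    As a rough sanity check (my own back-of-the-envelope math, not from the GN video), the $111/year 24/7 figure at $0.12/kWh works out to a bit over 100 W of extra average draw:

    # Back-of-the-envelope check of the quoted figures: $111/year extra at $0.12/kWh, 24/7 load.
    extra_cost_per_year = 111.0   # USD/year, quoted above
    price_per_kwh = 0.12          # USD/kWh, quoted above
    hours_per_year = 24 * 365

    extra_kwh = extra_cost_per_year / price_per_kwh     # ~925 kWh/year
    extra_watts = extra_kwh * 1000 / hours_per_year     # ~106 W of average extra draw

    print(f"Implied extra draw: {extra_watts:.0f} W ({extra_kwh:.0f} kWh/year)")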
    Last edited by Linkedblade; 2017-08-17 at 09:14 AM.

  16. #36
    Quote Originally Posted by Linkedblade View Post
    Vega draws more power from the wall
    You probably mean over 12V GPU power.
    R5 5600X | Thermalright Silver Arrow IB-E Extreme | MSI MAG B550 Tomahawk | 16GB Crucial Ballistix DDR4-3600/CL16 | MSI GTX 1070 Gaming X | Corsair RM650x | Cooler Master HAF X | Logitech G400s | DREVO Excalibur 84 | Kingston HyperX Cloud II | BenQ XL2411T + LG 24MK430H-B

  17. #37
    Immortal Schattenlied's Avatar
    10+ Year Old Account
    Join Date
    Aug 2010
    Location
    Washington State
    Posts
    7,475
    Quote Originally Posted by Linkedblade View Post
    From the Gamers Nexus video on the power consumption of Vega 56 at a competitive overclock: Vega consumes about $111 more in electricity per year than an overclocked 1070 when running at high load 24/7, at a cost of $0.12/kWh.

    Also note that Vega draws more power from the wall and Pascal pulls more power over the PCIe bus.
    And unless you're a cryptocurrency miner you aren't going to be running it at anywhere near high load 24/7 so... yeah.

    Even if you ran it at high load for 8 hours a day, 7 days a week (which is still an excessive amount of gaming, let's be honest here), you're looking at more like $37 a year, if that.
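
    For reference, that $37 is just the 24/7 figure scaled by hours of use; a quick sketch of the same math (assuming the $111/year figure quoted above):

    # Scale the quoted $111/year (24/7 load) figure down to a given number of gaming hours per day.
    COST_24_7 = 111.0  # USD/year at 24/7 load, from the Gamers Nexus numbers quoted above

    def yearly_extra_cost(hours_per_day):
        return COST_24_7 * hours_per_day / 24

    for hours in (2, 4, 8):
        print(f"{hours} h/day -> about ${yearly_extra_cost(hours):.0f}/year extra")
    # 2 h/day -> about $9, 4 h/day -> about $18, 8 h/day -> about $37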
    Last edited by Schattenlied; 2017-08-17 at 05:15 PM.
    A gun is like a parachute. If you need one, and don’t have one, you’ll probably never need one again.

  18. #38
    Quote Originally Posted by Schattenlied View Post
    And unless you're a cryptocurrency miner you aren't going to be running it at anywhere near high load 24/7 so... yeah.

    Even if you ran it at high load for 8 hours a day, 7 days a week (which is still an excessive amount of gaming, let's be honest here), you're looking at more like $37 a year, if that.
    So that's an extra $100 over the average lifetime of the card (they actually do a full breakdown of several use scenarios, including 2-4 hours a day), for a card that is $100 more expensive and performs worse.

    Wow, what a privilege to own an AMD card!
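
    The rough math behind that (the 4-year lifetime and 4 h/day usage here are my own assumptions; the $111/year and ~$100 price gap are from the posts above):

    # Hypothetical total-cost-of-ownership gap; the lifetime and daily hours are assumptions,
    # the $111/year (24/7) electricity figure and ~$100 price gap are quoted in the posts above.
    COST_24_7 = 111.0   # USD/year extra electricity at 24/7 load
    PRICE_GAP = 100.0   # USD extra up-front cost claimed above
    hours_per_day = 4
    lifetime_years = 4

    extra_power_cost = COST_24_7 * hours_per_day / 24 * lifetime_years   # ~$74
    print(f"Extra electricity over the card's life: ${extra_power_cost:.0f}")
    print(f"Total extra cost of ownership: ${PRICE_GAP + extra_power_cost:.0f}")   # ~$174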

  19. #39
    Quote Originally Posted by Schattenlied View Post
    And unless you're a cryptocurrency miner you aren't going to be running it at anywhere near high load 24/7 so... yeah.

    Even if you ran it at high load for 8 hours a day, 7 days a week (which is still an excessive amount of gaming, let's be honest here), you're looking at more like $37 a year, if that.
    Ya know, if it cost less than its competition, I could see this as an argument for the competition, because it would even the costs out. If it performed better than its competition, then I could see this being an argument. However, it already costs more and performs worse, and then you have to pay more on top of that.

  20. #40
    Quote Originally Posted by Lathais View Post
    Ya know, if it cost less than its competition, I could see this as an argument for the competition, because it would even the costs out. If it performed better than its competition, then I could see this being an argument. However, it already costs more and performs worse, and then you have to pay more on top of that.
    Well, the AMD-sourced cards may be rising in price. Who knows how the AIB partners' cards will be priced; they could be priced lower than the MSRP for reference cards. Not to mention FreeSync monitors being lower cost. They don't perform worse than their Nvidia counterparts. The 56 and 1070 are close, and the 64 and 1080 are close.
