  1. #521
    Quote Originally Posted by larix View Post
    Yes because we know SO MUCH about NaVi that we can just call how it will turn out now /s

    From what we do know, chances are Navi will be like Ryzen - small, cheap chips connected with some kind of "fabric" into bigger units - and that could very well be a game changer. But since we know nothing, and it is AMD we are talking about, it's best to wait till they actually release it.
    - Navi is 2019; by then NV will have its post-Volta ready to roll on 10/7nm
    - Nvidia are doing their own MCM R&D (and have been for a while) for their own GPU versions of Ryzen/Navi multi-chip interconnects. Who knows, they might even beat Navi to the game, or implement it better

  2. #522
    Quote Originally Posted by Evildeffy View Post
    I just checked it again just to be certain and you are incorrect: it required a modified GDDR5 memory controller with some minor changes, making it so that both can be used as long as the firmware is adjusted, since the signaling is identical.
    (Little bit of info here: Linketylink)

    That cannot be done with GDDR6, which brings me back to my original point: the cost of different IMCs is massive and requires validation for both GDDR5(X) and GDDR6 instead of just one type of memory.

    It's possible of course but highly unlikely with cost prohibitions... we'll see in early 2018 very likely.
    Your article proves my point: GDDR5X does require a new memory controller to achieve higher bandwidth, due to the changed addressing architecture. GDDR6 uses the same architecture as GDDR5X, the same supporting voltages on the same power architecture, and I also suspect the same signalling (although I didn't find any mention of it anywhere). Which leads us back to my original point: going from GDDR5X to GDDR6 probably requires fewer hardware changes than going from GDDR5 to GDDR5X. Nvidia did fine with GDDR5X and GDDR5 inside the same generation, so I don't see why they won't be able to do it when it's even easier.
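    (For context on the bandwidth side of this argument, here's a rough sketch of how per-pin data rate and bus width translate into memory bandwidth. The data rates below are typical published figures used purely for illustration, not the specs of any particular card.)
    Code:
    # Rough GDDR bandwidth math: bandwidth (GB/s) = per-pin data rate (Gb/s) * bus width (bits) / 8
    # Data rates are illustrative, typical published figures only.
    def mem_bandwidth_gbs(data_rate_gbps_per_pin: float, bus_width_bits: int) -> float:
        return data_rate_gbps_per_pin * bus_width_bits / 8

    configs = {
        "GDDR5  @  8 Gb/s, 256-bit": (8.0, 256),
        "GDDR5X @ 11 Gb/s, 256-bit": (11.0, 256),
        "GDDR6  @ 14 Gb/s, 256-bit": (14.0, 256),
    }

    for name, (rate, width) in configs.items():
        print(f"{name}: {mem_bandwidth_gbs(rate, width):.0f} GB/s")
    # GDDR5: 256 GB/s, GDDR5X: 352 GB/s, GDDR6: 448 GB/s with these assumed rates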

    Quote Originally Posted by Evildeffy View Post
    No they hypothesize several possible reasons, not why it actually is like that.
    Undervolting and adjustments can lower power draw without issue, like almost everyone stated it very much felt like Vega FE was rushed.
    You'd be surprised how much can be fixed via drivers.
    They found that for some applications they had to raise it back up to 1.140V (still down from 1.2V), whereas most were fine with 1.080V, which is a huge range. Also, they only tested synthetics and games, not touching professional applications (which are one of the major selling points of Vega FE). We have a long history of this stuff, and drivers have never slashed 20% power draw and added 20% performance at the same time.
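    (As a rough sanity check on the undervolting numbers: a sketch assuming dynamic power scales roughly with voltage squared at a fixed clock, ignoring leakage and board power. The percentages are illustrative estimates, not measurements.)
    Code:
    # Rough estimate of dynamic power savings from an undervolt at the same clock.
    # Assumes P_dynamic ~ V^2 with frequency held constant; leakage and board power ignored.
    def dynamic_power_scale(v_new: float, v_old: float) -> float:
        return (v_new / v_old) ** 2

    for v in (1.140, 1.080):
        saving = 1.0 - dynamic_power_scale(v, 1.200)
        print(f"1.200 V -> {v:.3f} V: ~{saving * 100:.0f}% less dynamic power")
    # ~10% at 1.140 V, ~19% at 1.080 V - in the ballpark of the claimed power reductions,
    # but it says nothing about also gaining performance at the same time.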

    Quote Originally Posted by Lathais View Post
    I'm not gonna comment on the other stuff because that's beyond me, but on this part, why does it need to compete with a 1080Ti? We don't know the price of this thing yet AFAIK, so competing with a 1080 could be fine. Granted, that means they have nothing competing at the top end, but that's fine. That's such a small segment of the market that I'm not sure it's worth the time to try and compete at that level. When the vast majority of people still run 1080p@60Hz, who needs something even as powerful as a 1080?
    Because Vega (the top silicon) has a huge die (almost exactly matching the 1080Ti/Titan Xp), a lot higher power consumption (requiring a stronger VRM), and uses a more expensive type of memory. It's going to be priced comparably to the 1080Ti, so it has to perform close to the 1080Ti. Vega has been designed (from a production cost standpoint) to be a high-end chip, so it has to perform like a high-end chip. AMD already has Polaris 20 in the midrange segment (it's fine, nothing wrong with it), and it's never a good idea to compete with your own products (ones that were released recently, moreover).

  3. #523
    Evildeffy
    Quote Originally Posted by Thunderball View Post
    Your article proves my point: GDDR5X does require a new memory controller to achieve higher bandwidth, due to the changed addressing architecture. GDDR6 uses the same architecture as GDDR5X, the same supporting voltages on the same power architecture, and I also suspect the same signalling (although I didn't find any mention of it anywhere). Which leads us back to my original point: going from GDDR5X to GDDR6 probably requires fewer hardware changes than going from GDDR5 to GDDR5X. Nvidia did fine with GDDR5X and GDDR5 inside the same generation, so I don't see why they won't be able to do it when it's even easier.
    It's funny how I read an entirely different thing out of it than you do.
    Especially with minimal modification allowing both to work at the same time as they are electrically compatible...

    But let's just leave this up to the actual release of said cards and see what we get out of it shall we?

    Quote Originally Posted by Thunderball View Post
    They found that for some applications they had to raise it back up to 1.140V (still down from 1.2V), whereas most were fine with 1.080V, which is a huge range. Also, they only tested synthetics and games, not touching professional applications (which are one of the major selling points of Vega FE). We have a long history of this stuff, and drivers have never slashed 20% power draw and added 20% performance at the same time.
    At the same time? No, not really.
    Over time? Yep, though far more rarely on the power draw side.
    It's still easily possible, and it was 1.12V according to the video, and that was specifically for For Honor, which behaves weirdly in general, with an OCed CPU being slower than a non-OCed one.

    Again... let's wait and see... RX Vega isn't too far off.

    Quote Originally Posted by Thunderball View Post
    Because Vega (the top silicon) has a huge die (almost exactly matching the 1080Ti/Titan Xp), a lot higher power consumption (requiring a stronger VRM), and uses a more expensive type of memory. It's going to be priced comparably to the 1080Ti, so it has to perform close to the 1080Ti. Vega has been designed (from a production cost standpoint) to be a high-end chip, so it has to perform like a high-end chip. AMD already has Polaris 20 in the midrange segment (it's fine, nothing wrong with it), and it's never a good idea to compete with your own products (ones that were released recently, moreover).
    Last I checked we know nothing yet about pricing, other than the AMD tour showing two otherwise identical systems, one being a GTX 1080 with G-Sync vs. the other being RX Vega with FreeSync, and the latter combo being 300 USD cheaper.
    Unless you know something we don't, in which case a link is in order.

  4. #524
    Deleted
    Meh, since I had a GTX 1080 I went ahead and bought an X34A Predator as my go-to monitor.

    I do wanna do a Ryzen + Vega build later this year for my second house, so it depends on the stats and how it goes with pricing.

  5. #525
    Deleted
    Quote Originally Posted by Life-Binder View Post
    - Navi is 2019; by then NV will have its post-Volta ready to roll on 10/7nm
    - Nvidia are doing their own MCM R&D (and have been for a while) for their own GPU versions of Ryzen/Navi multi-chip interconnects. Who knows, they might even beat Navi to the game, or implement it better
    Yes, that is all true. But the point is, at the moment we know nothing and shouldn't assume the worst or the best. And that's how I read his post: "Vega is shit, they should abandon it and focus harder on next gen while there is still time cuz it looks shit as well!"

  6. #526
    Quote Originally Posted by Evildeffy View Post
    It's funny how I read an entirely different thing out of it than you do.
    Especially with minimal modification allowing both to work at the same time as they are electrically compatible...

    But let's just leave this up to the actual release of said cards and see what we get out of it shall we?
    GDDR5X is not electrically compatible with GDDR5: GDDR5X (same as GDDR6 and HBM2) has an external VPP, which requires separate power circuitry on the board. The result is lower operating voltage and better power efficiency. Signaling and memory training are unchanged across GDDR5, GDDR5X and GDDR6. You have been arguing that it's the memory controller that is the problem; where is that argument now?

    Quote Originally Posted by Evildeffy View Post
    At the same time? No, not really.
    Over time? Yep, though far more rarely on the power draw side.
    It's still easily possible, and it was 1.12V according to the video, and that was specifically for For Honor, which behaves weirdly in general, with an OCed CPU being slower than a non-OCed one.

    Again... let's wait and see... RX Vega isn't too far off.
    Except this has never happened before, with any release I remember. Also, Vega is not a completely new architecture; it's still GCN, and there is literally no way to get that much from a driver update on a refresh (unless AMD programmers have been withholding driver updates for years).

    We've seen core specs of the top RX Vega cards, it's literally Vega FE with higher clocks.

    Quote Originally Posted by Evildeffy View Post
    Last I checked we know nothing yet about pricing, other than the AMD tour showing two otherwise identical systems, one being a GTX 1080 with G-Sync vs. the other being RX Vega with FreeSync, and the latter combo being 300 USD cheaper.
    Unless you know something we don't, in which case a link is in order.
    I wouldn't trust anything AMD marketing has to say. They managed to fall short of all their marketing claims about Ryzen despite it being a very successful product; there was no reason to lie about it.

    - - - Updated - - -

    Quote Originally Posted by larix View Post
    Yes, that is all true. But the point is, at the moment we know nothing and shouldn't assume the worst or the best. And that's how I read his post: "Vega is shit, they should abandon it and focus harder on next gen while there is still time cuz it looks shit as well!"
    They absolutely should've gone for a brand new architecture right away. Yes, they might not have had anything to compete with Pascal's high end (they might still not have it), but they might have released a competitor to Volta, and would've had more time for it to mature.

  7. #527
    Evildeffy
    Quote Originally Posted by Thunderball View Post
    GDDR5X is not electrically compatible with GDDR5: GDDR5X (same as GDDR6 and HBM2) has an external VPP, which requires separate power circuitry on the board. The result is lower operating voltage and better power efficiency. Signaling and memory training are unchanged across GDDR5, GDDR5X and GDDR6. You have been arguing that it's the memory controller that is the problem; where is that argument now?
    Still the same one: I'm stating that GDDR5X can still be used with the same controller, you state it can't, and you and I are reading the same thing entirely differently from Anand's explanation, for example.
    Hence why I said let's wait and see. You can write it down, and when Volta is out you can point me to it: if they still use GDDR5(X) on an actual Volta die (note: NOT a rebranded Pascal die) I will concede to you; if not, you will to me.
    Otherwise this discussion will never end, with things we read differently every time.

    Quote Originally Posted by Thunderball View Post
    Except this has never happened before, with any release I remember. Also, Vega is not a completely new architecture; it's still GCN, and there is literally no way to get that much from a driver update on a refresh (unless AMD programmers have been withholding driver updates for years).
    GCN is a collective name for architectures which are based upon a certain principle of uArch design.
    Vega is a completely new uArch, and saying it's not is rather stupid, because if that were the case they could run it off the same drivers, whereas drivers are exactly what has been holding them back for months now.
    Also, drivers over time, and ESPECIALLY the Crimson drivers, say hello to "never having gotten huge boosts".

    Quote Originally Posted by Thunderball View Post
    We've seen core specs of the top RX Vega cards, it's literally Vega FE with higher clocks.
    No you have not; all you've seen are rumours. You've never actually seen the specs, no-one outside AMD has.
    Is 2 weeks really too much for you to wait and see rather than already assuming stuff you really shouldn't... even if it has a logical basis?

    Quote Originally Posted by Thunderball View Post
    I wouldn't trust anything AMD marketing has to say. They managed to fall short of all their marketing claims about Ryzen despite it being a very successful product; there was no reason to lie about it.
    AMD's marketing plain sucks compared to nVidia's, NQA ... but what have they lied about regarding Ryzen exactly?
    Because 99% of all the marketing hype junk we've seen so far is users pulling stuff out of context and then blaming AMD for it; see Polaris for an example.

    Quote Originally Posted by Thunderball View Post
    They absolutely should've gone for a brand new architecture right away. Yes, they might not have had anything to compete with Pascal's high end (they might still not have it), but they might have released a competitor to Volta, and would've had more time for it to mature.
    Vega IS an entirely new architecture; again, do not confuse it being a GCN derivative with it being identical to older GCN cards.
    In fact ... as far as I remember, the Vega architecture is Raja Koduri's first brainchild since he came back to AMD, Navi being the 2nd.

    Stating it's the same architecture is the same as saying that Pascal is the same as Fermi.

  8. #528
    Quote Originally Posted by Evildeffy View Post
    Still the same one: I'm stating that GDDR5X can still be used with the same controller, you state it can't, and you and I are reading the same thing entirely differently from Anand's explanation, for example.
    Hence why I said let's wait and see. You can write it down, and when Volta is out you can point me to it: if they still use GDDR5(X) on an actual Volta die (note: NOT a rebranded Pascal die) I will concede to you; if not, you will to me.
    Otherwise this discussion will never end, with things we read differently every time.
    It is in your article: you can run GDDR5X in basically GDDR5 mode (high speed mode on the image), but you need an entirely new controller to support ultra speed mode. It's obviously stupid to use GDDR5X without unlocking its full potential.

    Quote Originally Posted by Evildeffy View Post
    GCN is a collective name for architectures which are based upon a certain principle of uArch design.
    Vega is a completely new uArch, and saying it's not is rather stupid, because if that were the case they could run it off the same drivers, whereas drivers are exactly what has been holding them back for months now.
    Also, drivers over time, and ESPECIALLY the Crimson drivers, say hello to "never having gotten huge boosts".
    I wonder why we've seen nothing but rebrands from AMD from the HD 7000 series up to Polaris. Also, GamersNexus made a video about that as well: there is literally no difference between Vega FE and Fury X at the same clocks, the only difference being better tessellation performance due to the introduction of a small-primitive discarder. Either the current drivers are completely unaware of any architectural improvements, or they simply do not exist.

    Quote Originally Posted by Evildeffy View Post
    No you have not; all you've seen are rumours. You've never actually seen the specs, no-one outside AMD has.
    Is 2 weeks really too much for you to wait and see rather than already assuming stuff you really shouldn't... even if it has a logical basis?
    Well, if you operate on logic, there is plenty of information already. I just really hope they make the top Vega reference cards watercooled.

    Quote Originally Posted by Evildeffy View Post
    AMD's marketing plain sucks compared to nVidia's, NQA ... but what have they lied about regarding Ryzen exactly?
    Because 99% of all the marketing hype junk we've seen so far is users pulling stuff out of context and then blaming AMD for it; see Polaris for an example.
    Being on par with Intel in gaming performance, beating Broadwell-E in synthetics/professional applications... Actually, I'm wrong, they did say that Zen is about a 60% architectural improvement over the FX series, which is probably true. AMD hypes up their products too much, and almost nothing ever lives up to the hype. I dunno, you can argue that it's the fans' fault, but it's AMD themselves who constantly fuel that hype.

    Quote Originally Posted by Evildeffy View Post
    Vega IS an entirely new architecture; again, do not confuse it being a GCN derivative with it being identical to older GCN cards.
    In fact ... as far as I remember, the Vega architecture is Raja Koduri's first brainchild since he came back to AMD, Navi being the 2nd.

    Stating it's the same architecture is the same as saying that Pascal is the same as Fermi.
    It's a very bad new architecture if it's new. Looks like a Fury X refresh right now.

    Nvidia are not known for introducing entirely new architectures; every generation builds upon the previous one, but the gains are consistently significant, for the most part. The thing is, AMD is too far behind right now; they need something completely new, and incremental improvements are not going to cut it here.

  9. #529
    Quote Originally Posted by Thunderball View Post
    Being on par with Intel in gaming performance, beating Broadwell-E in synthetics/professional applications... Actually, I'm wrong, they did say that Zen is about a 60% architectural improvement over the FX series, which is probably true. AMD hypes up their products too much, and almost nothing ever lives up to the hype. I dunno, you can argue that it's the fans' fault, but it's AMD themselves who constantly fuel that hype.
    I don't really believe this to be true. AMD comes out and shows something, without quite all the info there: an FPS counter missing, clocks not shown, other things missing. Then the "fan base" hypes and hypes and hypes while AMD says nothing and just lets the hype build. So it's not really them. They really don't do anything worse than any other company out there, only showing the absolute best performance while leaving out certain info. It's the fans who add hype, not AMD.

  10. #530
    Evildeffy
    Quote Originally Posted by Thunderball View Post
    It is in your article: you can run GDDR5X in basically GDDR5 mode (high speed mode on the image), but you need an entirely new controller to support ultra speed mode. It's obviously stupid to use GDDR5X without unlocking its full potential.
    Like I said .. the point is that the (GDDR5) controller has been slightly modified, yes, but you can use that new controller to run both GDDR5 and GDDR5X; they really are (as they state in the article) electrically compatible.

    1 design, 1 validation, cheaper expenses and R&D... you know how this works.

    Quote Originally Posted by Thunderball View Post
    I wonder why we've seen nothing but rebrands from AMD from the HD 7000 series up to Polaris. Also, GamersNexus made a video about that as well: there is literally no difference between Vega FE and Fury X at the same clocks, the only difference being better tessellation performance due to the introduction of a small-primitive discarder. Either the current drivers are completely unaware of any architectural improvements, or they simply do not exist.
    Because rebrands happen, old stock gets cleared, no need to add more.
    nVidia did the exact same thing, even more so in the past (GTX 680 -> GTX 770, anyone, or how about the infamous G92 chip?).
    Just because there's no difference between a Vega FE and a Fury X at the same clocks doesn't mean they are the exact same.
    Of course they will share similarities, as they are based on the same specs, just with an updated uArch; Sandy Bridge through Kaby Lake is the same story, same base, different architecture.

    Quote Originally Posted by Thunderball View Post
    Well, if you operate on logic, there is plenty of information already. I just really hope they make the top Vega reference cards watercooled.
    Unconfirmed information. We can deduce things all we want, but as an example, the original GTX Titan (notice the GTX moniker) was the exact same die as the GTX 780Ti, the difference being DP limited artificially on the architecture... this was only found out later by people who bought the cheaper card for production purposes and were slapped with a huge performance penalty.
    Or how about the GTX 970 debacle? Same principles.
    The point I'm making is that until you have verified specifications, everything remains a rumour.

    Quote Originally Posted by Thunderball View Post
    Being on par with Intel in gaming performance, beating Broadwell-E in synthetics/professional applications... Actually, I'm wrong, they did say that Zen is about a 60% architectural improvement over the FX series, which is probably true. AMD hypes up their products too much, and almost nothing ever lives up to the hype. I dunno, you can argue that it's the fans' fault, but it's AMD themselves who constantly fuel that hype.
    None of those statements is really a lie; they never claimed superior performance to Kaby Lake, only to BW-E, and it does a good job at that.
    So ... if they show that information and people hype it up more... whose fault is that? AMD's, or the people assuming things?
    And of course they will say it's great... it's a new product they're bringing out, they can't really say "Well we suck at this, but we rock at this!" when promoting a product; they just focus on its strengths.. they haven't lied, and the same thing happened with the Polaris architecture, where people made it bigger than it was.
    Not AMD's fault.

    Quote Originally Posted by Thunderball View Post
    It's a very bad new architecture if it's new. Looks like a Fury X refresh right now.

    Nvidia are not known for introducing entirely new architectures; every generation builds upon the previous one, but the gains are consistently significant, for the most part. The thing is, AMD is too far behind right now; they need something completely new, and incremental improvements are not going to cut it here.
    That's what they claim Navi is for .. though I will keep it open with an "I'll wait and see" approach; if I need an upgrade I'll grab what is best for me right now.
    If in the future that changes, cool beans, if not... too bad.

    Just because certain products are behind does not necessarily mean it's bad.
    As long as the price is right... everything has a place.

  11. #531
    Quote Originally Posted by Evildeffy View Post
    Just because certain products are behind does not necessarily mean it's bad.
    As long as the price is right... everything has a place.
    Exactly. Like I said, who cares if it performs better than a 1080Ti if it's priced like a 1070? Just like everything else, you can speculate on the price because of the higher-cost memory and power draw and whatever else, but until they tell us and we actually see them for sale, we don't know what the price will be. We can speculate, but don't treat that speculation as fact, as too many people do too often.

  12. #532
    Quote Originally Posted by Evildeffy View Post
    Because rebrands happen, old stock gets cleared, no need to add more.
    nVidia did the exact same thing, even more so in the past (GTX 680 -> GTX 770, anyone, or how about the infamous G92 chip?).
    Just because there's no difference between a Vega FE and a Fury X at the same clocks doesn't mean they are the exact same.
    Of course they will share similarities, as they are based on the same specs, just with an updated uArch; Sandy Bridge through Kaby Lake is the same story, same base, different architecture.
    They do, but not 3 generations in a row. Sandy Bridge is vastly inferior to Kaby Lake in IPC, which is not the case with Vega and Fury.

    Quote Originally Posted by Evildeffy View Post
    Unconfirmed information. We can deduce things all we want, but as an example, the original GTX Titan (notice the GTX moniker) was the exact same die as the GTX 780Ti, the difference being DP limited artificially on the architecture... this was only found out later by people who bought the cheaper card for production purposes and were slapped with a huge performance penalty.
    Or how about the GTX 970 debacle? Same principles.
    The point I'm making is that until you have verified specifications, everything remains a rumour.
    Ok, so you're proposing that AMD is artificially holding Vega FE back? That's fanboyism to the max. They cannot afford that right now.

    Quote Originally Posted by Evildeffy View Post
    None of those statements is really a lie; they never claimed superior performance to Kaby Lake, only to BW-E, and it does a good job at that.
    So ... if they show that information and people hype it up more... whose fault is that? AMD's, or the people assuming things?
    And of course they will say it's great... it's a new product they're bringing out, they can't really say "Well we suck at this, but we rock at this!" when promoting a product; they just focus on its strengths.. they haven't lied, and the same thing happened with the Polaris architecture, where people made it bigger than it was.
    Not AMD's fault.
    They claimed that Zen is on par with Intel in gaming performance (they didn't specify which one; obviously people are going to be looking at Intel's mainstream parts, since that's what Ryzen competes with), and with Broadwell-E in Cinebench. Both false. That's a hype creator. All that jazz about "the more cores the better" is absolutely a fanbase creation, but that's not the problem. I wouldn't even open my mouth about gaming capabilities if I worked for AMD; I would focus on how many cores you can get for $350, which is obviously the strongest point of Ryzen. People would've figured out that some Ryzens are decent for gaming, and it would've worked in favor of AMD.

    I wouldn't say Polaris was overhyped; people just expected AMD to release more than mid-to-low range products with it as well, which resulted in people perceiving the RX 480 as a GTX 1080 competitor.
    Last edited by Thunderball; 2017-07-20 at 10:22 PM.

  13. #533
    Evildeffy
    Quote Originally Posted by Thunderball View Post
    They do, but not 3 generations in a row. Sandy Bridge is vastly inferior to Kaby Lake in IPC, which is not the case with Vega and Fury.
    You should check out how far the G92 chip went.. you'd be surprised.

    Quote Originally Posted by Thunderball View Post
    Ok, so you're proposing that AMD is artificially holding Vega FE back? That's fanboyism to the max. They cannot afford that right now.
    Where in the name of Zeus' butthole did I propose that?
    I stated that whilst we have rumours we have no solid information, and until we do nothing is a certainty.
    I'm literally the last person you want to accuse of being a fanboy, or would you like to see all my Intel builds?

    Quote Originally Posted by Thunderball View Post
    They claimed that Zen is on par with Intel in gaming performance (they didn't specify which one; obviously people are going to be looking at Intel's mainstream parts, since that's what Ryzen competes with), and with Broadwell-E in Cinebench. Both false. That's a hype creator. All that jazz about "the more cores the better" is absolutely a fanbase creation, but that's not the problem. I wouldn't even open my mouth about gaming capabilities if I worked for AMD; I would focus on how many cores you can get for $350, which is obviously the strongest point of Ryzen. People would've figured out that some Ryzens are decent for gaming, and it would've worked in favor of AMD.
    All the comparison info was given, and it was always BW-E they compared to in gaming, barring a few internal slides which compared it to a 7700K and showed the 7700K still being faster; they still didn't lie.
    It's not their fault if people can't, or are unwilling to, read the small asterisks detailing the system information.

    Quote Originally Posted by Thunderball View Post
    I wouldn't say Polaris was overhyped; people just expected AMD to release more than mid-to-low range products with it as well, which resulted in people perceiving the RX 480 as a GTX 1080 competitor.
    It was overhyped by the user base, A LOT... AMD only said a VR-capable card of at least R9 290/GTX 970 class would be in people's homes for 200 USD.
    People stretched that into higher performance claims... how is that AMD's fault?

  14. #534
    Quote Originally Posted by Evildeffy View Post
    You should check out how far the G92 chip went.. you'd be surprised.
    It was in the 8000 series somewhere, and was also in notebooks everywhere. It probably went into a lot of budget cards after that. Nothing new, honestly; GF108 is another example. Extreme budget cards were always like that. AMD just rebranded most of their product line, including the chips, two times, including high-end cards.

    Quote Originally Posted by Evildeffy View Post
    Where in the name of Zeus' butthole did I propose that?
    I stated that whilst we have rumours we have no solid information, and until we do nothing is a certainty.
    I'm literally the last person you want to accuse of being a fanboy, or would you like to see all my Intel builds?
    Vega FE is out, Apple released the specs for their Vega-based cards, and everything indicates that Vega FE is the top silicon they've got. The only thing they can improve at this point is clocks, and clock scaling is pretty easy to predict. We also have Fury X pricing; there is no reason to expect the top RX Vega to be significantly cheaper. We don't know the final level of performance, that's for sure, but we can tell for sure that it won't compete with the GTX 1080Ti.

    You are the first person on this forum I'd accuse of AMD fanboyism, don't fool yourself.

    Quote Originally Posted by Evildeffy View Post
    All the comparison info was given, and it was always BW-E they compared to in gaming, barring a few internal slides which compared it to a 7700K and showed the 7700K still being faster; they still didn't lie.
    It's not their fault if people can't, or are unwilling to, read the small asterisks detailing the system information.
    I'll remind you.

    http://cdn.wccftech.com/wp-content/u...MD-Ryzen_2.png - This is Cinebench. AMD ended up running dual-channel memory and downclocking the i7-6900K to get those results. There were a bunch of other similarly conducted "tests" in that presentation.

    http://cdn.wccftech.com/wp-content/u...erformance.jpg - Those are results against a 7700K, both at stock, at 1440p, using a GTX 1070. They also had a 6800K with a similar handicap as before. Basically bottlenecking the system to hide the difference in performance.

    If that's not lying, it's at least misleading. There was only one slide focusing on price to performance (misleading as well, showing the 6900K on par with the R7 1700 in "composite performance").

  15. #535
    The FreeSync vs G-Sync thing is a terrible comparison and just rationalizes poor price/performance ratios. Bottom line is, most people do not replace their monitor whenever they replace their GPU or even do a full system build, and anyone who wants one of those technologies probably already has it. If anything, using FreeSync is actually a deterrent to adoption, because anyone who wanted performance above the mainstream GTX 1060 level has been forced onto Nvidia cards for 2+ years now, meaning that they probably have G-Sync monitors that would lose functionality if they switched manufacturers.

    Not only that, but any perceived advantage in having the cheaper/open-source sync technology is more than nullified by the PSU and power usage requirements. If they want to use the monitor pricing thing as their rationalization, surely they should also factor in the price of the better PSU that Vega will need AND the extra electricity costs over the lifespan of the card.
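    (To put a rough number on the "extra electricity costs" point: a back-of-the-envelope sketch where the wattage delta, daily hours, ownership period and electricity price are all assumptions for illustration, not measured figures.)
    Code:
    # Back-of-the-envelope lifetime electricity cost of a higher-power card.
    # All inputs below are illustrative assumptions.
    extra_watts = 100      # assumed extra draw under load vs. a competing card
    hours_per_day = 3      # assumed gaming hours per day
    years = 3              # assumed ownership period
    price_per_kwh = 0.15   # assumed electricity price in USD

    extra_kwh = extra_watts * hours_per_day * 365 * years / 1000
    extra_cost = extra_kwh * price_per_kwh
    print(f"~{extra_kwh:.0f} kWh extra -> ~${extra_cost:.0f} over {years} years")
    # ~329 kWh -> ~$49 with these assumptions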

  16. #536
    Quote Originally Posted by Tiberria View Post
    The FreeSync vs G-Sync thing is a terrible comparison and just rationalizes poor price/performance ratios. Bottom line is, most people do not replace their monitor whenever they replace their GPU or even do a full system build, and anyone who wants one of those technologies probably already has it. If anything, using FreeSync is actually a deterrent to adoption, because anyone who wanted performance above the mainstream GTX 1060 level has been forced onto Nvidia cards for 2+ years now, meaning that they probably have G-Sync monitors that would lose functionality if they switched manufacturers.

    Not only that, but any perceived advantage in having the cheaper/open-source sync technology is more than nullified by the PSU and power usage requirements. If they want to use the monitor pricing thing as their rationalization, surely they should also factor in the price of the better PSU that Vega will need AND the extra electricity costs over the lifespan of the card.
    There are people that aren't in the 1080+ price range, let alone the extra GSync pricing. There is definitely a market for FreeSync. The market for Vega will be determined by the price and performance.

  17. #537
    Evildeffy
    Quote Originally Posted by Thunderball View Post
    It was in the 8000 series somewhere, and was also in notebooks everywhere. It probably went into a lot of budget cards after that. Nothing new, honestly; GF108 is another example. Extreme budget cards were always like that. AMD just rebranded most of their product line, including the chips, two times, including high-end cards.
    Very well .. humour me .. which flagship has been rebranded 2 times by AMD, according to you?
    Also, the G92 chip wasn't used in notebooks and was rebranded, I believe, 5 times in total, and that's an nVidia chip.
    Please don't act like both companies are saints; they are not, and rebranding does make sense from a financial and stock PoV.

    Quote Originally Posted by Thunderball View Post
    Vega FE is out, Apple released the specs for their Vega-based cards, and everything indicates that Vega FE is the top silicon they've got. The only thing they can improve at this point is clocks, and clock scaling is pretty easy to predict. We also have Fury X pricing; there is no reason to expect the top RX Vega to be significantly cheaper. We don't know the final level of performance, that's for sure, but we can tell for sure that it won't compete with the GTX 1080Ti.

    You are the first person on this forum I'd accuse of AMD fanboyism, don't fool yourself.
    So because we know Vega FE's specs and abilities we are 100% sure of what RX Vega will be?
    I just gave you a prime example of why that can change very quickly.

    We can assume things, but we can never know them until they are revealed, and the fact that I prefer to go along with the scientific way of cold hard facts rather than assumption makes me a fanboy? Get over yourself.

    Your method of life is "Oh look, that guy's brother is a terrorist, therefore we can logically assume he and the rest of his family are too!".
    I am SUCH an AMD Fanboy that I have an Intel Core i7-990X, Intel Core i7-4960X, Intel Core i7-3770K and SuperMicro X9SPV-M4 with Intel Core i7-3555LE (server) along with my self-built NAS system which has the ASUS E45M1I-Deluxe motherboard with an integrated AMD Fusion E450 CPU/GPU, what a HUGE AMD Fanboy I am.

    I also have a GTX 1080 (GigaByte G1 Gaming), an ASUS GTX 760 ITX, an MSI Radeon R9 390X Gaming and an integrated Intel HD graphics.

    Yes I am SUCH a huge AMD fanboy that I have a total of 2 products of theirs, 1 being a low power NAS/HTPC mobo and 1 graphics card vs. the 2 nVidia cards and 4 other Intel based systems.

    I correct people and get corrected if I'm wrong, which I've always admitted to in the past when I was wrong.
    I post what is rumour and what is fact; the fact that you are unwilling to wait for facts and would rather work on assumptions makes me a fanboy?

    That is arrogance to the first degree of stupidity. You may want to work that way and no-one will stop you, but don't come in here presenting things as fact when we aren't 100% sure they are correct in the first place.

    I'll ask you in the simplest of terms for you to understand:
    Do you know for a fact, and not as a rumour, what the EXACT specifications and drivers are for RX Vega?
    If the answer to that question is "No" or "Yes, because I know Vega FE", then the answer is still "No".

    Conjecture/Assumption != Facts.

    Quote Originally Posted by Thunderball View Post
    I'll remind you.

    http://cdn.wccftech.com/wp-content/u...MD-Ryzen_2.png - This is Cinebench. AMD ended up running dual-channel memory and downclocking the i7-6900K to get those results. There were a bunch of other similarly conducted "tests" in that presentation.
    Funny how at the Horizon event it was still winning against the 6900K with its superior clocks and quad-channel memory, and it still does.
    Cinebench doesn't care about dual vs. quad-channel memory; the results are still the same, so I don't see the point of you posting a score that is still correct.

    Quote Originally Posted by Thunderball View Post
    http://cdn.wccftech.com/wp-content/u...erformance.jpg - Those are results against a 7700K, both at stock, at 1440p, using a GTX 1070. They also had a 6800K with a similar handicap as before. Basically bottlenecking the system to hide the difference in performance.
    The funny thing is that was a press slide that was leaked; if you read the actual slide you'll see that they hid nothing and explicitly gave the system details to the press there.
    Also funny is how a 6900K was pitted against a Ryzen 7 1800X at the Horizon event with the same specs, the only difference being the CPU, and it still beat the 6900K without any hamstringing.
    You want to talk about leaks yet fail to include the launch event data as well, where there was no deception.
    Or were the results they showed (which you could test for yourself) entirely fake, even though the entire community corroborated them?

    Quote Originally Posted by Thunderball View Post
    If that's not lying, it's at least misleading. There was only one slide focusing on price to performance (misleading as well, showing the 6900K on par with the R7 1700 in "composite performance").
    It's showing the 6900K above the R7 1700 and R7 1700X and below the R7 1800X in performance, but not in value; there's nothing wrong with that slide, and if anything it's gotten better with the price cuts.
    Or are you telling me right now straight to everyone's faces that Intel has the better price/performance ratio?

    I'd like to know the answer to that question.

    Answer with logic and facts, no conjecture.
    And if you can't do that then don't answer at all.

  18. #538
    Quote Originally Posted by Gray_Matter View Post
    There are people that aren't in the 1080+ price range, let alone the extra GSync pricing. There is definitely a market for FreeSync. The market for Vega will be determined by the price and performance.
    I'd argue that at the price/performance range AMD has been sitting at recently (with the RX 480/580 only playing at the 1060 level), there is no market for a premium monitor feature set like FreeSync. They would need to sit at the 1080+ performance level and stay there for it to be all that relevant, and I don't know that coming in at that performance level now, a full year after it's been available through Nvidia cards, is all that compelling. You would have to think that almost anyone who wants 1080-level performance/G-Sync has probably already bought a 1080 or 1080 Ti and already bought, or is saving for, a G-Sync monitor.

  19. #539
    Quote Originally Posted by Tiberria View Post
    I'd argue that at the price/performance range AMD has been sitting at recently (with the RX 480/580 only playing at the 1060 level), there is no market for a premium monitor feature set like FreeSync. They would need to sit at the 1080+ performance level and stay there for it to be all that relevant, and I don't know that coming in at that performance level now, a full year after it's been available through Nvidia cards, is all that compelling. You would have to think that almost anyone who wants 1080-level performance/G-Sync has probably already bought a 1080 or 1080 Ti and already bought, or is saving for, a G-Sync monitor.
    Actually, adaptive sync is better for lower-end cards. If you have a higher-end card and are pushing past your monitor's refresh rate, adaptive sync does nothing for you. As a replacement for V-Sync, what it does do is prevent your dips below 60 from showing you 30 FPS even when you are at 55. So here's a PCGamer 16-game average:
    http://www.pcgamer.com/geforce-gtx-1060-review/

    With a 1080p 60Hz monitor with either G-Sync or FreeSync, the only cards that are even going to benefit from adaptive sync are the 380X, 380, 960 and 950. Adaptive sync would be doing absolutely nothing for the rest. If you step up to a 75Hz FreeSync monitor, it benefits the 480 but not the 1070/1080. Even if you had a 120Hz monitor, it's not doing anything at all for the 1080. You'd need a 1080p 144Hz G-Sync monitor before the G-Sync makes a difference on the 1080.


    For some reason, a whole lot of people seem to think adaptive sync is for top-end video cards. This is not true; top-end video cards blow past the FPS range where adaptive sync makes any sort of difference. The lower-end the card you have, the more adaptive sync helps you.
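    (A small sketch of the effect described above: with classic double-buffered V-Sync on a 60 Hz panel, a frame that misses a refresh waits for the next one, so a card averaging 55 FPS ends up displayed at 30 FPS, while adaptive sync just shows frames as they finish. Frame times are assumed perfectly even and triple buffering is ignored.)
    Code:
    # Effective display rate on a 60 Hz monitor: double-buffered V-Sync vs. adaptive sync.
    # Assumes perfectly even frame times and no triple buffering.
    import math

    REFRESH_HZ = 60
    REFRESH_MS = 1000 / REFRESH_HZ  # ~16.7 ms per refresh

    def displayed_fps_vsync(render_fps: float) -> float:
        frame_ms = 1000 / render_fps
        # With V-Sync a frame can only be shown on a refresh boundary,
        # so each frame occupies a whole number of refresh intervals.
        intervals = math.ceil(frame_ms / REFRESH_MS)
        return REFRESH_HZ / intervals

    def displayed_fps_adaptive(render_fps: float) -> float:
        # Adaptive sync shows frames as they finish, up to the panel's maximum.
        return min(render_fps, REFRESH_HZ)

    for fps in (75, 60, 55, 40, 25):
        print(f"render {fps:>3} FPS -> vsync {displayed_fps_vsync(fps):>4.1f} FPS, "
              f"adaptive {displayed_fps_adaptive(fps):>4.1f} FPS")
    # e.g. 55 FPS rendered -> 30 FPS displayed with V-Sync, 55 FPS with adaptive sync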

  20. #540
    Quote Originally Posted by Lathais View Post
    Actually, adaptive sync is better for lower-end cards. If you have a higher-end card and are pushing past your monitor's refresh rate, adaptive sync does nothing for you. As a replacement for V-Sync, what it does do is prevent your dips below 60 from showing you 30 FPS even when you are at 55. So here's a PCGamer 16-game average:
    http://www.pcgamer.com/geforce-gtx-1060-review/

    With a 1080p 60Hz monitor with either G-Sync or FreeSync, the only cards that are even going to benefit from adaptive sync are the 380X, 380, 960 and 950. Adaptive sync would be doing absolutely nothing for the rest. If you step up to a 75Hz FreeSync monitor, it benefits the 480 but not the 1070/1080. Even if you had a 120Hz monitor, it's not doing anything at all for the 1080. You'd need a 1080p 144Hz G-Sync monitor before the G-Sync makes a difference on the 1080.


    For some reason, a whole lot of people seem to think adaptive sync is for top-end video cards. This is not true; top-end video cards blow past the FPS range where adaptive sync makes any sort of difference. The lower-end the card you have, the more adaptive sync helps you.
    That's because in the budget range where you are considering lower-end cards, you probably don't have the extra budget to invest in a premium monitor upgrade. And if you do have that extra $100-$200, most people would rather just put it into a better GPU or CPU than into the monitor.
