  1. #541
    Quote Originally Posted by Tiberria View Post
    That's because at the budget range where you are considering lower end cards, you probably don't have the extra budget to invest in a premium monitor upgrade. And, if you do have that extra $100-$200 price premium, most people would rather just put it into a better GPU or CPU than the monitor.
    Yeah, which is why, IMO, adaptive sync really makes no sense at all yet. It only benefits those with lower-end graphics cards, and the money spent on it would be better put toward a better graphics card.

  2. #542
    There really isn't a premium on G-Sync if you are looking at high-end monitors (which people should be putting ahead of the GPU on their priority list).
    https://pcpartpicker.com/products/mo...t=price&page=1

    The cheapest 1440p 144 Hz panel has G-Sync. The prices are crazy on the 1080p G-Sync models, but who buys a 1080p monitor for a PC that costs north of a grand?

  3. #543
    Deleted
    Adaptive sync will still help prevent tearing with high-end GPUs too; you just have to cap your frame rate slightly below what your panel can do.
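    As a rough illustration (a minimal sketch, not an official formula; the 3 FPS margin below refresh is just a common rule of thumb people use to stay inside the adaptive-sync window):

        # Minimal sketch: pick an FPS cap a little under the panel's refresh rate
        # so frame delivery stays inside the adaptive-sync (G-Sync/FreeSync) range.
        # The default 3 FPS margin is an assumption, not a vendor-specified value.
        def suggested_fps_cap(refresh_hz: int, margin: int = 3) -> int:
            return refresh_hz - margin

        for hz in (144, 165, 240):
            print(f"{hz} Hz panel -> cap at ~{suggested_fps_cap(hz)} FPS")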

  4. #544
    Well, yeah, adaptive sync is good for any strength of card, but the true benefit of the technology is that it allows you to spend less on your graphics card. Anything over ~80-90 FPS is the same to me as 165 FPS; there is no possible way I could differentiate those framerates from each other. If I bought a 1080 or Ti down the road, all that would allow me to do is increase graphics settings, which in most games are hardly distinguishable between high and medium, with ultra being a setting I will never use in any game because of its poor optimization.

    A 1060 and a G-Sync monitor is the best combo people can get into, a way better idea than buying a 1080 Ti for some crappy 1080p monitor.

  5. #545
    Quote Originally Posted by Evildeffy View Post
    Very well... humour me... which flagship has been rebranded 2 times by AMD, according to you?
    Also, the G92 chip wasn't used in notebooks and was rebranded, I believe, 5 times in total; that's an nVidia chip.
    Please don't act like both companies are saints; they are not, and rebranding does make sense from a financial and stock PoV.
    G92 was used in the 8800M and 9800M; both suffered from die separation, and people who repair notebooks order them in bulk.

    AMD example: HD 7970 > HD 8970 > R9 280X. To be fair, the 8000 series was a rebrand as a whole, but it's still unacceptable in my opinion. Nvidia's 600-700 series trick was pretty disgusting as well, just for reference. I'd personally like to see stuff similar to Nvidia's 900 and 1000 series, where (almost) every card is based on a current-generation GPU. Just keep selling older GPUs as they were originally released if you think they still have a place on the market.

    Quote Originally Posted by Evildeffy View Post
    So because we know Vega FE's specs and abilities we are 100% sure of what RX Vega will be?
    I just gave you a prime example of why that can change very quickly.

    We can assume things, but we can never know them until they are revealed, and because I prefer to go along with the scientific way of cold hard facts rather than assumption, that makes me a fanboy? Get over yourself.

    Your method of life is "Oh look, that guy's brother is a terrorist, therefore we can logically assume that he and the rest of his family are too!".
    I am SUCH an AMD fanboy that I have an Intel Core i7-990X, an Intel Core i7-4960X, an Intel Core i7-3770K and a SuperMicro X9SPV-M4 with an Intel Core i7-3555LE (server), along with my self-built NAS system, which has the ASUS E45M1I-Deluxe motherboard with an integrated AMD Fusion E450 CPU/GPU. What a HUGE AMD fanboy I am.

    I also have a GTX 1080 (Gigabyte G1 Gaming), an ASUS GTX 760 ITX, an MSI Radeon R9 390X Gaming and integrated Intel HD graphics.

    Yes, I am SUCH a huge AMD fanboy that I have a total of 2 of their products, 1 being a low-power NAS/HTPC mobo and 1 a graphics card, vs. the 2 nVidia cards and 4 other Intel-based systems.

    I correct people and get corrected if I'm wrong, and I've admitted it in the past whenever I was wrong.
    I post what are rumours and what are facts; because you are unwilling to wait for facts and would rather work on assumption, that makes me a fanboy?

    That is arrogance to the first degree of stupidity. You may want to work that way and no-one will stop you, but don't come in here presenting things as fact when we aren't 100% sure they are correct in the first place.

    I'll ask you in the simplest of terms for you to understand:
    Do you know for a fact, and not as a rumour, what the EXACT specifications and drivers are for RX Vega?
    If the answer to that question is "No" or "Yes, because I know Vega FE", then the answer is still "No".

    Conjecture/Assumption != Facts.
    We have enough to make an educated guess. You don't need to be completely right here to gauge RX Vega's market position: it's not like Nvidia has a shitton of cards with small intervals in performance between them. I'm personally not interested in anything Nvidia or AMD releases in the near future, as I already have a Pascal card. I want to see how things play out, and the success of the architecture very strongly influences the development cycle that follows. Obviously I want competition, I want cards to cost less, but as it is right now AMD isn't helping themselves or the market at all.

    The things you assume I'd call you an AMD fanboy for are the opposite of why I would do that. That's called twisted logic. If one's logic is like that, there is no use to it: you'd be drawing useless conclusions all the time. There is no way I can form a scientific opinion of RX Vega, because I won't be getting a card for myself. Sure, I might get a chance to play with it, but that's hardly enough to draw any conclusions. What you're doing here is manipulating information to fit your personal opinion, which is why I'm calling you an AMD fanboy. The whole memory type discussion is a prime example of that.

    I wouldn't call a person who buys inferior products due to brand loyalty a fanboy. I would call him an idiot. You're obviously not. That's why you probably know that you're wrong, and that's why you end up trying to pinpoint minor flaws in my statements instead of discussing the matter. If you want to get a card and test it for yourself, go on, and don't participate in the discussion. That's your only way of getting the facts.


    Quote Originally Posted by Evildeffy View Post
    Funny how in the Horizon event it was still winning against the 6900K with its superior clocks and quad-channel memory, and it still does.
    Cinebench cares not for dual/quad-channel memory; the results are still the same, so I don't see the point of you posting a correct score still.
    ? The 1800X (the 1700 boosts higher as well) is higher clocked than a stock 6900K. An OC'd 6900K smokes an OC'd 1800X/1700.

    Quote Originally Posted by Evildeffy View Post
    Funny thing is that was a press slide that was leaked; if you read the actual slide you'll see that they hid nothing and explicitly gave system details to the press there.
    Also funny is how a 6900K was pitted against a Ryzen 7 1800X in the Horizon event with the same specs, the only difference being the CPU, and it still beat the 6900K without hamstrings.
    You want to talk about leaks yet fail to include the launch event data as well, where there was no deception.
    Or were the results (which you could test for yourself) that they showed entirely fake, even though the entire community supported them?
    The configuration was found out after the press event; those configuration slides didn't list anything. No one could actually touch any hardware at the launch event. Everyone said that their testing methodology was bogus and that they focused too much on Cinebench, which is useless.

    Quote Originally Posted by Evildeffy View Post
    It's showing the 6900K above the R7 1700 and R7 1700X and below the R7 1800X in performance but not in value; there's nothing wrong with that slide, and if anything it's gotten better with price cuts.
    Or are you telling me right now straight to everyone's faces that Intel has the better price/performance ratio?

    I'd like to know the answer to that question.

    Answer with logic and facts, no conjecture.
    And if you can't do that then don't answer at all.
    Yes, it's showing the 6900K on par with the R7 1700 in performance, which is stupidly misleading. The 6900K is expensive, but it smokes any R7. I'm not saying shit about Ryzen's price/performance, it's good, but their presentation should've been only about that.

    You wanted to know why there is hype. That's why. No one cares about performance differences in Cinebench, and no one plays Doom or Ashes of the Singularity. People need to know how products perform in the use case the product is designed for.
    Last edited by Thunderball; 2017-07-21 at 10:43 PM.

  6. #546
    Evildeffy
    Quote Originally Posted by Thunderball View Post
    G92 was used in the 8800M and 9800M; both suffered from die separation, and people who repair notebooks order them in bulk.
    Not the same G92 chip; it is a heavily neutered and smaller die.
    It was derived from G92 rather than actually being G92, which btw was rebranded 4 times (I checked).

    Quote Originally Posted by Thunderball View Post
    AMD example: HD 7970 > HD 8970 > R9 280X. To be fair, the 8000 series was a rebrand as a whole, but it's still unacceptable in my opinion. Nvidia's 600-700 series trick was pretty disgusting as well, just for reference. I'd personally like to see stuff similar to Nvidia's 900 and 1000 series, where (almost) every card is based on a current-generation GPU. Just keep selling older GPUs as they were originally released if you think they still have a place on the market.
    You are bringing up an Asia-only example: the HD 8970 was brought in at the same time for OEMs ONLY, as you could never buy these in stores.
    You can only find these on eBay, if you're lucky, or in mobile MXM3 format (again... OEM only, no consumer can buy it from a store).

    For every normal consumer there was only the HD 7970 or the R9 280X... so try again.

    Also "technically" the entire 900 series was a rebrand at a whole because the original Maxwell was the GTX 750Ti.

    Quote Originally Posted by Thunderball View Post
    We have enough to make an educated guess. You don't need to be completely right here to gauge RX Vega's market position: it's not like Nvidia has a shitton of cards with small intervals in performance between them. I'm personally not interested in anything Nvidia or AMD releases in the near future, as I already have a Pascal card. I want to see how things play out, and the success of the architecture very strongly influences the development cycle that follows. Obviously I want competition, I want cards to cost less, but as it is right now AMD isn't helping themselves or the market at all.
    Guessing != Knowing.
    Did you know/guess that the original GTX Titan would be flat-out worse than a GTX 780 Ti? Or that both cards would be artificially neutered to disable DP for developers?
    Same principle.

    Educated guesses are fine and all, but like "Casey Ryback" said... assumption is the mother of all fuck-ups.

    Quote Originally Posted by Thunderball View Post
    The things you assume I'd call you an AMD fanboy for are the opposite of why I would do that. That's called twisted logic. If one's logic is like that, there is no use to it: you'd be drawing useless conclusions all the time. There is no way I can form a scientific opinion of RX Vega, because I won't be getting a card for myself. Sure, I might get a chance to play with it, but that's hardly enough to draw any conclusions. What you're doing here is manipulating information to fit your personal opinion, which is why I'm calling you an AMD fanboy. The whole memory type discussion is a prime example of that.

    I wouldn't call a person who buys inferior products due to brand loyalty a fanboy. I would call him an idiot. You're obviously not. That's why you probably know that you're wrong, and that's why you end up trying to pinpoint minor flaws in my statements instead of discussing the matter. If you want to get a card and test it for yourself, go on, and don't participate in the discussion. That's your only way of getting the facts.
    Right... I link articles with info, and you state that you read something entirely different from them and nothing else.
    Yes, a memory controller discussion has oh so much to do with AMD (the hell?)... I've answered your questions with links that you read entirely differently, and all you come up with is a "No, your article says so"... how about you turn around and provide me with specifics?
    Hell, I've even stated that if any nVidia Volta card (not a rebrand) uses GDDR5(X), I will come out and admit my fault, if you were willing to do the same.
    That, however, doesn't change the fact that a memory controller, regardless of brand, has NOTHING WHATSOEVER to do with being a so-called fanboy.

    However... you are allowed to believe whatever you want; it doesn't make it a fact... also, you accuse me of using twisted logic yet fail to look at yourself, as you're using insane logic to call someone a fanboy over an explanation that is perfectly logical (GDDR6 availability and the official announcement from SK Hynix during Computex, with nVidia's Volta not being available till 2018).

    Good job there... really, I feel like I'm talking to Life-Binder all over again, and it's funny how my estimations in those discussions turned out to be true.

    Quote Originally Posted by Thunderball View Post
    ? The 1800X (the 1700 boosts higher as well) is higher clocked than a stock 6900K. An OC'd 6900K smokes an OC'd 1800X/1700.
    Except for the following: since I know you've neither watched nor read about the event, they let the i7-6900K rip fully with all Turbo Boost modes enabled vs. a fixed clock of 3.4 GHz for the Ryzen 7 chip, no boosting present.
    Yes, a heavily OC'd 6900K will beat the 1800X... barely, and it requires 300-400 MHz above equal clocks to do so.
    Does that change the fact that the actual stock performance is correct as you saw it? No, it does not.
    Also no... the R7 1700 does NOT boost higher than an i7-6900K; in fact, if Turbo Boost 3.0 is enabled (which it was), the 6900K boosts 1 core to 4.0 GHz and the rest to 3.7 GHz... so try again on that as well.

    Quote Originally Posted by Thunderball View Post
    The configuration was found out after the press event; those configuration slides didn't list anything. No one could actually touch any hardware at the launch event. Everyone said that their testing methodology was bogus and that they focused too much on Cinebench, which is useless.
    Funny how people who actually use programmes like Cinebench, Blender and HandBrake (till the AVX extensions were enabled), such as the tech press, actually corroborated that story and testing, but I guess the prosumers (hate that word) don't matter and only gamers do, right?
    However, that information was still not meant for the public, and it was known to the press there.
    Look again at the picture with the gaming benchmarks you so nicely linked, and look properly at what it says bottom right.

    No public event has had this "normalization"... only the NDA press events; wonder why that is.

    Quote Originally Posted by Thunderball View Post
    Yes, it's showing the 6900K on par with the R7 1700 in performance, which is stupidly misleading. The 6900K is expensive, but it smokes any R7. I'm not saying shit about Ryzen's price/performance, it's good, but their presentation should've been only about that.

    You wanted to know why there is hype. That's why.
    Look again CAREFULLY: the i7-6900K is positioned ABOVE both the R7 1700 and R7 1700X and below the R7 1800X in composite performance.
    Would you like me to magnify it for you so you can see it properly? It's really not that difficult.

    The text throughout the slide is from the guy who leaked it; the people who sat there obviously didn't see that text.

    Also, the presentation should only have been about performance? What company has ever done that?
    Hell, Intel has even started shit-slinging towards AMD's EPYC platform ("4 glued-together desktop dies", anyone?), and they always bring up price/performance, especially since the FX series, which is very funny since some of their current HCC dies use the same technology, as their Core 2 Quad architecture did.

    Every presentation has that information, and as for not including your strengths in a presentation... well... have you ever applied for a highly contested job before?

    The overhype is created by people; AMD started the original hype, and whilst they didn't "stop" it, they never lied about it.
    And in all honesty, neither AMD, nVidia nor Intel would stop overhype, nor should they, as it increases the chance of sales; it is not their responsibility if the consumers are idiots, just like BMW drivers themselves are responsible if they kill people in a crash, not BMW.

    I am dead serious BTW, I will magnify it for you and show it to you clearly if you so desire.
    I would've linked a "clean" version (without the name)... unfortunately I couldn't find any.
    Last edited by Evildeffy; 2017-07-22 at 12:02 AM.

  7. #547
    Quote Originally Posted by Evildeffy View Post
    Not the same G92 chip; it is a heavily neutered and smaller die.
    It was derived from G92 rather than actually being G92, which btw was rebranded 4 times (I checked).
    What's your point? That it's a record? It's not: GF108, Cape Verde and Cedar have been rebranded 5 times, and for Cape Verde it seems like it could be more. It's completely normal to rebrand low-end GPUs. Also, G92 and G92b are quite different: Nvidia improved temperatures while fixing the die separation problem.

    Quote Originally Posted by Evildeffy View Post
    Also "technically" the entire 900 series was a rebrand at a whole because the original Maxwell was the GTX 750Ti.
    Original Maxwell was GM100.

    Quote Originally Posted by Evildeffy View Post
    Guessing != Knowing.
    Did you know/guess that the original GTX Titan would be flat-out worse than a GTX 780 Ti? Or that both cards would be artificially neutered to disable DP for developers?
    Same principle.

    Educated guesses are fine and all, but like "Casey Ryback" said... assumption is the mother of all fuck-ups.
    There is logic behind the original Titan move. There is no logic in artificially crippling Vega FE, unless AMD is trying to help Nvidia sell their cards.

    Even if there is a scenario like that, it only makes my assumption the best-case scenario.

    Quote Originally Posted by Evildeffy View Post
    Right... I link articles with info, and you state that you read something entirely different from them and nothing else.
    Yes, a memory controller discussion has oh so much to do with AMD (the hell?)... I've answered your questions with links that you read entirely differently, and all you come up with is a "No, your article says so"... how about you turn around and provide me with specifics?
    Hell, I've even stated that if any nVidia Volta card (not a rebrand) uses GDDR5(X), I will come out and admit my fault, if you were willing to do the same.
    That, however, doesn't change the fact that a memory controller, regardless of brand, has NOTHING WHATSOEVER to do with being a so-called fanboy.

    However... you are allowed to believe whatever you want; it doesn't make it a fact... also, you accuse me of using twisted logic yet fail to look at yourself, as you're using insane logic to call someone a fanboy over an explanation that is perfectly logical (GDDR6 availability and the official announcement from SK Hynix during Computex, with nVidia's Volta not being available till 2018).

    Good job there... really, I feel like I'm talking to Life-Binder all over again, and it's funny how my estimations in those discussions turned out to be true.
    Stop twisting arguments. The argument was about whether Nvidia can actually use GDDR5X instead of GDDR6 if they need to, and about the economic side of it. Everything you linked and said suggested that it was possible and plausible. We were not discussing Volta or GDDR6 release dates; we cannot assume anything about that, simply because the people making them probably don't know themselves.

    SK Hynix said NOTHING about Volta; you're assuming here. Nvidia has not used Hynix memory with Pascal; they used Micron's GDDR5X and GDDR5 from Micron and Samsung. Micron is also launching their GDDR6 later this year/early next year.

    Quote Originally Posted by Evildeffy View Post
    Except for the following: since I know you've neither watched nor read about the event, they let the i7-6900K rip fully with all Turbo Boost modes enabled vs. a fixed clock of 3.4 GHz for the Ryzen 7 chip, no boosting present.
    Yes, a heavily OC'd 6900K will beat the 1800X... barely, and it requires 300-400 MHz above equal clocks to do so.
    Does that change the fact that the actual stock performance is correct as you saw it? No, it does not.
    Also no... the R7 1700 does NOT boost higher than an i7-6900K; in fact, if Turbo Boost 3.0 is enabled (which it was), the 6900K boosts 1 core to 4.0 GHz and the rest to 3.7 GHz... so try again on that as well.
    We both know how valid those "tests" are. There are plenty of reviews. Is the stock 1800X faster than a 6900K in Cinebench? It is, but that's not what AMD's slides implied. Again, you wanted to know where the hype comes from. We are NOT discussing Ryzen performance here.

    Quote Originally Posted by Evildeffy View Post
    Look again CAREFULLY: the i7-6900K is positioned ABOVE both the R7 1700 and R7 1700X and below the R7 1800X in composite performance.
    Would you like me to magnify it for you so you can see it properly? It's really not that difficult.

    The text throughout the slide is from the guy who leaked it; the people who sat there obviously didn't see that text.
    I don't care about the 1700 and 1700X; my point is that the 6900K is positioned lower than the 1800X. It is a stronger CPU than the 1800X. Or is it that only Cinebench matters? Or is AMD creating hype?

    Quote Originally Posted by Evildeffy View Post
    Also, the presentation should only have been about performance? What company has ever done that?
    Hell, Intel has even started shit-slinging towards AMD's EPYC platform ("4 glued-together desktop dies", anyone?), and they always bring up price/performance, especially since the FX series, which is very funny since some of their current HCC dies use the same technology, as their Core 2 Quad architecture did.

    Every presentation has that information, and as for not including your strengths in a presentation... well... have you ever applied for a highly contested job before?

    The overhype is created by people; AMD started the original hype, and whilst they didn't "stop" it, they never lied about it.
    And in all honesty, neither AMD, nVidia nor Intel would stop overhype, nor should they, as it increases the chance of sales; it is not their responsibility if the consumers are idiots, just like BMW drivers themselves are responsible if they kill people in a crash, not BMW.

    I am dead serious BTW, I will magnify it for you and show it to you clearly if you so desire.
    I would've linked a "clean" version (without the name)... unfortunately I couldn't find any.
    We are not discussing the other parts of it; no one ever hypes the technical stuff. We are also not discussing Intel's stupid release about EPYC (the name is also incredibly dumb, especially for a server chip).

    As an example: have you noticed any hype about the X299 release? I didn't, and that's probably because Intel didn't say SHIT, and they are expected not to say SHIT. That's why that EPYC release is very surprising to people. There is some hype about Coffee Lake, though, but I guess that's very much linked to the Ryzen hype.

    Vega is unfortunately already destined to fail the hype expectations.
    Last edited by Thunderball; 2017-07-22 at 01:22 AM.

  8. #548
    Evildeffy
    Quote Originally Posted by Thunderball View Post
    What's your point? That it's a record? It's not: GF108, Cape Verde and Cedar have been rebranded 5 times, and for Cape Verde it seems like it could be more. It's completely normal to rebrand low-end GPUs. Also, G92 and G92b are quite different: Nvidia improved temperatures while fixing the die separation problem.
    Their "High-end rebrand" which you so fault AMD for as an example, you want to blame one but not the other?
    Funny how you described the G92 as low-end since it only became that with the last jump to GT250 and not before.

    Quote Originally Posted by Thunderball View Post
    Original Maxwell was GM100.
    Wrong again: it was GM107 and GM108, which were identified as the GTX 750 and GTX 750 Ti.
    There was never a commercial GM100 chip; Maxwell was prototyped as GM107/108 only.

    Quote Originally Posted by Thunderball View Post
    There is logic behind the original Titan move. There is no logic in artificially crippling Vega FE, unless AMD is trying to help Nvidia sell their cards.

    Even if there is a scenario like that, it only makes my assumption the best-case scenario.
    Sure, the logic is there: nVidia wanted more money if you were using DP; there was no technical reason for it.
    Also, no-one, at any point in time, suggested a crippled Vega FE; what was suggested was a rushed launch, broken drivers, firmware immaturity, etc.
    None of that is uncommon, and neither of those points conflicts with the call to wait and see what they have in under 2 weeks instead of assuming it's 100% shit, regardless of how much of an educated guess you may throw at it.

    Quote Originally Posted by Thunderball View Post
    Stop twisting arguments. The argument was about whether Nvidia can actually use GDDR5X instead of GDDR6 if they need to, and about the economic side of it. Everything you linked and said suggested that it was possible and plausible. We were not discussing Volta or GDDR6 release dates; we cannot assume anything about that, simply because the people making them probably don't know themselves.

    SK Hynix said NOTHING about Volta; you're assuming here. Nvidia has not used Hynix memory with Pascal; they used Micron's GDDR5X and GDDR5 from Micron and Samsung. Micron is also launching their GDDR6 later this year/early next year.
    Granted, I did assume, because they named a major graphics card company (there are only 2, technically 3 counting Intel, but they don't use GDDR), and of those 2, who's more likely to use GDDR6?
    AMD, who's using HBM2 and is actively developing for it? Or nVidia, who's actively developing for GDDR5(X) and higher?
    However, no argument twisting is to be had here: I claimed it would be highly expensive (and admittedly stated that MULTIPLE TIMES) but not impossible for them to use GDDR5(X), and I did state it'd be either GDDR5(X) or GDDR6, not both.
    This still has 0 to do with being an "AMD fanboy", so in regards to twisted logic... "Hypocrisy, thy name is Thunderball".

    Also, nVidia (or better said, their AIBs) used GDDR5 from all 3 vendors on their GTX 10 series: Samsung, Micron and SK Hynix.
    In fact, the SK Hynix chips were known to absolutely suck balls in mining scenarios, where both Samsung and Micron did better.

    nVidia does not dictate which GDDR5 to use on partner cards; the partners themselves do. nVidia can only decide the memory vendors on their own reference design cards... oh wait, sorry... Founder's Edition cards.

    As far as GDDR5X goes... well, it's kinda hard considering that GDDR5X is produced by only 1 player, Micron, and no-one else.
    Look at that... another reason for moving to GDDR6 and not GDDR5(X): competition and prices for nVidia.

    Quote Originally Posted by Thunderball View Post
    We both know how valid those "tests" are. There are plenty of reviews. Is the stock 1800X faster than a 6900K in Cinebench? It is, but that's not what AMD's slides implied. Again, you wanted to know where the hype comes from. We are NOT discussing Ryzen performance here.
    OK, so... leaked slides that should never have reached the public are AMD's creation of hype, whereas actually showing it at their actual events, which are meant for the public, is not, and is by no means an indicator of their performance?
    Including by releasing the same "test" you so abhor into the wild for people to do their own math and confirm it.
    It's funny how dismissive this has all of a sudden become now that your arguments are being refuted.

    Quote Originally Posted by Thunderball View Post
    I don't care about the 1700 and 1700X; my point is that the 6900K is positioned lower than the 1800X. It is a stronger CPU than the 1800X. Or is it that only Cinebench matters? Or is AMD creating hype?
    Really? You were heavily using it as one of your arguments for hype creation before, and now it's no longer cared about?
    The 6900K is positioned as slightly slower because, in actuality, it is; Cinebench is but 1 result, and there are others.
    In fact, the 1800X is potent enough to encroach on 6950X territory at some points; there's no lie or fake hype about something that is actually the truth.

    Quote Originally Posted by Thunderball View Post
    We are not discussing the other parts of it; no one ever hypes the technical stuff. We are also not discussing Intel's stupid release about EPYC (the name is also incredibly dumb, especially for a server chip).
    Really? Intel seems to love hyping their new "Mesh" core communication design... or the fact that they have finally built a monolithic die for their 28C behemoth... but I guess those aren't technical details...
    However on the name... I do agree.

    Quote Originally Posted by Thunderball View Post
    As an example: have you noticed any hype about the X299 release? I didn't, and that's probably because Intel didn't say SHIT, and they are expected not to say SHIT. That's why that EPYC release is very surprising to people. There is some hype about Coffee Lake, though, but I guess that's very much linked to the Ryzen hype.
    Actually, yes, there was quite a bit of hype, and as you could see, a lot of controversy and failure around the X299 launch, not to mention it being drowned out entirely by the showing of the X399 boards for Threadripper.
    Also, X299 cannot be compared to EPYC (Naples), as Intel's counterparts to it (the Xeon chips) use LGA 3647 and not LGA 2066 like X299 does.

    The hype Intel tried to create was utterly destroyed and drowned out by what AMD is set to release, which in a way worked out better for them: more attention on the failed power delivery design of X299, or on the insanely high power draw of their CPUs (Intel has actually classed their own CPUs at a lower TDP than they factually put out), would be far more devastating than it already is.

    Quote Originally Posted by Thunderball View Post
    Vega is unfortunately already destined to fail the hype expectations.
    Very likely, yes; that doesn't change the fact that it'd be a better idea to wait and see for 1.5 weeks more instead of arguing about it, since by that point all the facts will be revealed.
    Last edited by Evildeffy; 2017-07-22 at 06:32 AM. Reason: Linkfix

  9. #549
    And the wall-of-text argument continues... who will win...

    Answer: The moderator who closes this shit thread...

  10. #550
    Gaidax
    Don't know; they're probably arguing about whether Vega is a flop or a mega-flop.

    The usual - "look what happened back in the ancient times nobody cares about anymore!" wall of text bickering.

  11. #551
    I wonder if Vega will affect Navi.

    AFAIK, Navi itself isn't a major new arch/big redesign; it was supposed to combine Vega chips (or tweaked Vega chips) into MCMs on Infinity Fabric, à la Ryzen, to get around having to keep making huge, expensive monolithic dies.

    But for that to succeed, the chips that they "glue" together have to be good, and preferably efficient too.

  12. #552
    Quote Originally Posted by Life-Binder View Post
    I wonder if Vega will affect Navi.

    AFAIK, Navi itself isn't a major new arch/big redesign; it was supposed to combine Vega chips (or tweaked Vega chips) into MCMs on Infinity Fabric, à la Ryzen, to get around having to keep making huge, expensive monolithic dies.

    But for that to succeed, the chips that they "glue" together have to be good, and preferably efficient too.
    Depends on the success of Vega. If it fails, who knows, maybe AMD will drop out of the PC gaming card market altogether and focus on mining cards and console graphics?

  13. #553
    moremana
    Quote Originally Posted by Bigvizz View Post
    Depends on the success of Vega. If it fails, who knows, maybe AMD will drop out of the PC gaming card market altogether and focus on mining cards and console graphics?


    Made me smile.

  14. #554
    Gaidax
    Quote Originally Posted by Bigvizz View Post
    Depends on the success of Vega. If it fails, who knows, maybe AMD will drop out of the PC gaming card market altogether and focus on mining cards and console graphics?
    That's not happening. AMD does have a decent share of the market and can contend well for the more budget-conscious spot; besides, they have their gaming-focused semi-custom shit going on with consoles, so they've got to develop that technology regardless.

    It's just that I don't see them competing with Nvidia in the high end - Nvidia is just on a whole other level now - but AMD will do just fine staying in the mid-range/low-end, especially empowered by their constant "rebels" bullshit and throwaway prices.
    Last edited by Gaidax; 2017-07-23 at 06:25 PM.

  15. #555
    Quote Originally Posted by Gaidax View Post
    That's not happening. AMD does have a decent share of the market and can contend well for the more budget-conscious spot; besides, they have their gaming-focused semi-custom shit going on with consoles, so they've got to develop that technology regardless.

    It's just that I don't see them competing with Nvidia in the high end - Nvidia is just on a whole other level now - but AMD will do just fine staying in the mid-range/low-end, especially empowered by their constant "rebels" bullshit and throwaway prices.
    Shrug, if you say so.

    Last edited by Bigvizz; 2017-07-24 at 04:23 AM.

  16. #556
    Quote Originally Posted by Bigvizz View Post
    Shrug, if you say so.
    consoles / mobile

    vs tiny pc market.

  17. #557
    Quote Originally Posted by Pickynerd View Post
    consoles / mobile

    vs tiny pc market.
    Like I said, AMD will stick to that market; nothing you said is any different from what I said.

  18. #558
    Quote Originally Posted by Bigvizz View Post
    Like I said AMD will stick to that market, nothing you said is any different from what I said.
    I have a buzz, everything I say is different

  19. #559
    Gaidax
    Quote Originally Posted by Bigvizz View Post
    I think you don't realize that much of that is because Nvidia simply has a shitton more supply.

    AMD simply has no resources to put up the same stock as Nvidia; after all, it's no secret that Polaris is sold out.

  20. #560
    Zenny
    Quote Originally Posted by Pickynerd View Post
    consoles / mobile

    vs tiny pc market.
    Tiny PC market? In hardware terms? I think Nvidia has shipped more GPUs than PS4s have been sold in the 3.5 years since the console's launch.

    - - - Updated - - -

    Quote Originally Posted by Gaidax View Post
    I think you don't realize that much of that is because Nvidia simply has a shitton more supply.

    AMD simply has no resources to put up the same stock as Nvidia; after all, it's no secret that Polaris is sold out.
    The picture wasn't much different before the latest mining craze, which has hit Nvidia pretty hard as well.
