  1. #1321
    I am curious, though, what AMD has done differently compared to Intel that lets them make their product so much cheaper. Right now the best case is the same level of performance but at a much lower cost.

    Intel has some high profit margins in certain areas, that's for sure, but even so those are usually focused on non-consumer markets.

  2. #1322
    Quote Originally Posted by ati87 View Post
    I am curious, though, what AMD has done differently compared to Intel that lets them make their product so much cheaper. Right now the best case is the same level of performance but at a much lower cost.

    Intel has some high profit margins in certain areas, that's for sure, but even so those are usually focused on non-consumer markets.
    AMD decided not to gouge us because they don't have a monopoly. It's pretty much that simple. For the past several years Intel has basically had a monopoly, so they have slowly but surely been raising their prices to increase profits. AMD is probably spending about the same amount as Intel on R&D and production; they simply are not as greedy. That's really all there is to it.

  3. #1323
    The Lightbringer Evildeffy's Avatar
    15+ Year Old Account
    Join Date
    Jan 2009
    Location
    Nieuwegein, Netherlands
    Posts
    3,772
    Quote Originally Posted by Lathais View Post
    AMD decided not to gouge us because they don't have a monopoly. It's pretty much that simple. For the past several years Intel has basically had a monopoly, so they have slowly but surely been raising their prices to increase profits. AMD is probably spending about the same amount as Intel on R&D and production; they simply are not as greedy. That's really all there is to it.
    You're forgetting that the cores are smaller than Intel's, and that there's no integrated GPU, which cuts the die down immensely.
    Intel's GPU on the die is about HALF the entire die size on Kaby Lake.

    These factors contribute immensely to production costs: since the dies are inherently smaller, you can fit roughly double the number of dies on the same wafer, which means lower production costs and more profit.
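    To put rough numbers on that (a back-of-the-envelope sketch in Python; the wafer cost and the two die areas are my own assumptions, not published Intel/AMD figures):

    Code:
    # Rough dies-per-wafer and cost-per-die estimate.
    # Wafer cost and die areas are illustrative assumptions only.
    import math

    def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
        # Classic approximation: gross dies on a round wafer,
        # minus the dies lost along the circular edge.
        d = wafer_diameter_mm
        return int(math.pi * (d / 2) ** 2 / die_area_mm2
                   - math.pi * d / math.sqrt(2 * die_area_mm2))

    WAFER_COST = 5000  # assumed USD per 300 mm wafer
    for name, area in [("die with iGPU", 160), ("die without iGPU", 80)]:
        n = dies_per_wafer(300, area)
        print(f"{name}: {n} dies/wafer -> ~${WAFER_COST / n:.2f} per die")

    Halve the die area and you get slightly more than double the dies per wafer (the edge losses shrink too), which is the "smaller die = cheaper chip" effect in action.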

    This is exactly why I believe (as I've said from the start) that Ryzen 3 will be the first "real" quad-core of the Ryzen range, rather than Ryzen 5.
    It's entirely plausible that Ryzen 3's die will also be used for Ryzen 5's quad-core models, as it'd be cheaper to produce.

    This is also why Ryzen 3's production is so much later than Ryzen 5's: it's actually a different die, and (my personal read of things) it will be used both for Ryzen 3's 4C/4T models and for Ryzen 5's 4C/8T models... called Ryzen 5 1501X or something.
    The reason for this choice is the APUs that will be coming out at pretty much the same time: AMD knows that anyone who buys an APU doesn't care about top-of-the-line graphics or CPU power, so the APUs are limited to 4C/8T with a Vega-based integrated GPU.

  4. #1324
    Old God Vash The Stampede's Avatar
    10+ Year Old Account
    Join Date
    Sep 2010
    Location
    Better part of NJ
    Posts
    10,939
    Quote Originally Posted by Lathais View Post
    AMD decided not to gouge us because they don't have a monopoly. It's pretty much that simple. For the past several years Intel has basically had a monopoly, so they have slowly but surely been raising their prices to increase profits. AMD is probably spending about the same amount as Intel on R&D and production; they simply are not as greedy. That's really all there is to it.
    Not so much that AMD decided not to gouge us, as that AMD has a huge hill to climb to win back consumers. If AMD's chips were priced the same as Intel's, who would buy them? Especially when X99 has had all its problems fixed, while AM4 is just beginning to resolve some of its own. Then you have someone like Life-Binder who thinks Intel is the best thing since sliced bread, and AMD has an uphill battle on its hands.

    Actually, if you think about the price of the 1700 vs the FX-8350, AMD's new chips are very overpriced compared to their own products. The FX-8150 and 8350 were released at $200, while the 1700 is $329. Bulldozer was just as hyped, and had just as long a development time as Ryzen. Technically they were meant to be server CPUs, but were instead sold to consumers; in fact, you can use ECC RAM with both Bulldozer and Ryzen chips. AMD's pricing is higher than usual because Intel had pretty much set up a sweet market that AMD can now also gouge.

    So long as Intel keeps their prices high, so will AMD. Not until Intel drops their prices will AMD even consider dropping theirs.

    Quote Originally Posted by Evildeffy View Post
    You're forgetting that the cores are smaller than Intel's, and that there's no integrated GPU, which cuts the die down immensely.
    Intel's GPU on the die is about HALF the entire die size on Kaby Lake.
    You know, it's weird that Intel keeps putting GPUs in their chips when nobody asks for them. Seriously, who wants Intel graphics? Granted, I've put together a 6600K machine using only Intel graphics and was very surprised with the results, but I could easily buy a $100 graphics card that'll do a better job.

  5. #1325
    Then you have someone like Life-Binder who thinks Intel is the best thing since sliced bread
    only for gaming


    for workstation your precious Ryzen rules supreme

  6. #1326
    The Lightbringer Artorius's Avatar
    10+ Year Old Account
    Join Date
    Dec 2012
    Location
    Natal, Brazil
    Posts
    3,781
    AMD doesn't spend nearly as much on R&D as Intel. Most of Intel's R&D goes to their foundries to improve their manufacturing processes. Intel also tries to compete in multiple other markets, even if they almost always fail, which makes them spend even more money.

    AMD is a design-only company nowadays; they don't need to spend as much as Intel to compete, as long as the design is good and there's a competent foundry available in the market to manufacture it.

  7. #1327
    The Lightbringer Evildeffy's Avatar
    15+ Year Old Account
    Join Date
    Jan 2009
    Location
    Nieuwegein, Netherlands
    Posts
    3,772
    Quote Originally Posted by Dukenukemx View Post
    Actually, if you think about the price of the 1700 vs the FX-8350, AMD's new chips are very overpriced compared to their own products. The FX-8150 and 8350 were released at $200, while the 1700 is $329. Bulldozer was just as hyped, and had just as long a development time as Ryzen. Technically they were meant to be server CPUs, but were instead sold to consumers; in fact, you can use ECC RAM with both Bulldozer and Ryzen chips. AMD's pricing is higher than usual because Intel had pretty much set up a sweet market that AMD can now also gouge.

    So long as Intel keeps their prices high, so will AMD. Not until Intel drops their prices will AMD even consider dropping theirs.
    That's because R&D costs have gone up; that much is actually true. But having said that... over time Intel went from 200 USD for the first Core i7 to 350 - 400 USD now for the consumer Core i7.

    Prices go up, especially if you have no competitor... but there is a real cost attached to developing an architecture, which companies do need to earn back.

    So I wouldn't call it overpriced at all... more a consequence of technological advancement in an era where the economy has stagnated.

    Also, they weren't just meant as server CPUs; AMD has supported ECC RAM for a long time.
    The Athlon64 CPUs were already capable of it (possibly it was enabled in earlier Athlon CPUs as well, I'm not sure), because AMD believed that using ECC RAM should be the choice of the user rather than the choice of the CPU builder.

    Unregistered ECC RAM was never meant to be server-only; it just ended up that way because of how Intel fucked the market up as it was rising.

    Also, due to issues, Bulldozer went through several development cycles BTW; the architecture was redesigned multiple times, and in the end it was built the easier way. I remember reading an article at the time in which an engineer who had been fired from AMD said some automated tool was used to lay out the Bulldozer architecture, which could have been 20% smaller, cooler and more power-efficient had it been designed manually instead.
    But the CEO at the time refused to work like that; this is also the same CEO that made AMD fall even further from its already shitty position.
    (No idea what exactly he meant by this, and I no longer have the source; I went looking for it about half a year ago but never found it... this was about two years before the launch of Bulldozer.)

    Quote Originally Posted by Dukenukemx View Post
    You know, it's weird that Intel keeps putting GPUs in their chips when nobody asks for them. Seriously, who wants Intel graphics? Granted, I've put together a 6600K machine using only Intel graphics and was very surprised with the results, but I could easily buy a $100 graphics card that'll do a better job.
    There are definite benefits to having an iGPU for PC diagnostics, but you can easily manage without one.

    That said, I do believe Intel should separate it the way AMD does.
    An i7-7700K without an iGPU would be less complex, draw less power, be cheaper and probably overclock further.

    That said... Intel does this for a multitude of reasons:
    1. They want at least a bite of the GPU market, which reduces income for AMD and nVidia.
    2. They want to control the "eco-system" they set up.*
    3. It's easier and cheaper to develop one line than to segregate and develop two lines, which adds both development and validation time.
    4. The absolute vast majority of consumers are idiots and have no clue what any of it would mean, so to keep things simple Intel chose this path.
    5. etc. etc.

    *: The reason you cannot overclock baseclocks on Intel is that it affects the GPU on the die.
    SuperMicro first exposed a loophole that made it possible to completely shut off the GPU part of the die and overclock locked Skylake i3s/i5s.
    Intel found out and forced a mandatory BIOS update on all board partners for their chipsets, so now users NEED a K chip to overclock.
    This forces consumers to choose up front what they want, gives Intel an easier time with service, and lets them charge a metric fuckton more for the K-series models.
    (Sidenote: baseclock overclocking was also the original way of overclocking CPUs in the first place.)

    Think about it for a minute... all Intel consumer CPUs are based on a single die, with cores/caches/HyperThreading disabled as needed.
    Pentium, Core i3, i5 and i7 (consumer variants) are all the exact same die.
    (Inherent defects or tolerances determine which "model" of CPU a die ends up as - a simple microcode change, or laser-cutting parts of the die.)

    Why is the Core i7-7700K worth 100+ USD more than the i5-7600K? Simply because it has HyperThreading.
    A feature that is physically present on the i5 series too, just entirely disabled in the microcode/"firmware" on the CPU itself.
    Both have (as far as the silicon lottery allows) identical potential, power draw etc. etc.

    Intel has made quite a bit of money this way, but again, they also paid for the research, development and validation of not just the CPU itself but also the chipsets.
    We can always go back to VIA and nForce chipsets... anyone remember what a nightmare that was?

    There are always multiple sides to a story.

  8. #1328
    The Unstoppable Force Gaidax's Avatar
    10+ Year Old Account
    Join Date
    Sep 2013
    Location
    Israel
    Posts
    20,780
    In your gaming mindset you miss the point of the GPU on the die: it's there for all those PCs that are used for things other than gaming - actual work. So yeah, you may see no value in an Intel CPU having a GPU, but in a workplace that runs tens to thousands of PCs of various specifications, that iGPU drives their displays just fine.

    Additionally, as a side bonus, this has virtually eliminated entry-level dedicated GPUs and made Nvidia/AMD up their game a notch.

    So for me, I could not care less about the iGPU at home, but at work? I don't need a dGPU, simply because Intel has me sorted.

    They give you this GPU with the 7700K likely because it is so tightly integrated into the architecture that it would probably be more of a pain in the ass for them to do a CPU without it at this point. That is besides the fact that it helps fill professional needs, which often involve staring at a bunch of text and shit on multiple monitors, which this iGPU handles just fine.
    Last edited by Gaidax; 2017-03-21 at 06:59 PM.

  9. #1329
    The Lightbringer Evildeffy's Avatar
    15+ Year Old Account
    Join Date
    Jan 2009
    Location
    Nieuwegein, Netherlands
    Posts
    3,772
    Quote Originally Posted by Gaidax View Post
    In your gaming mindset you miss the point of the GPU on the die: it's there for all those PCs that are used for things other than gaming - actual work. So yeah, you may see no value in an Intel CPU having a GPU, but in a workplace that runs tens to thousands of PCs of various specifications, that iGPU drives their displays just fine.

    Additionally, as a side bonus, this has virtually eliminated entry-level dedicated GPUs and made Nvidia/AMD up their game a notch.

    So for me, I could not care less about the iGPU at home, but at work? I don't need a dGPU, simply because Intel has me sorted.

    They give you this GPU with the 7700K likely because it is so tightly integrated into the architecture that it would probably be more of a pain in the ass for them to do a CPU without it at this point. That is besides the fact that it helps fill professional needs, which often involve staring at a bunch of text and shit on multiple monitors, which this iGPU handles just fine.
    Pretty much what I said... also, the exception to the rule is stock brokers and other news-related outlets that require 3+ monitors to function!

    As a general note before someone comments on that:
    HTPC/self-built router/self-built NAS users are not really in question here, since they aren't in the budget class we're discussing.
    We're talking about the medium to high-end CPU categories.

  10. #1330
    Deleted
    Jayztwocents is streaming and playing using a Ryzen 1800X, so you can see what it can/can't do.
    Although admittedly he's also using a Titan X.
    https://www.youtube.com/watch?v=VL2TAJvfiQU

  11. #1331
    Deleted
    Quote Originally Posted by Gaidax View Post
    In your gaming mindset you miss the point of the GPU on the die: it's there for all those PCs that are used for things other than gaming - actual work. So yeah, you may see no value in an Intel CPU having a GPU, but in a workplace that runs tens to thousands of PCs of various specifications, that iGPU drives their displays just fine.

    Additionally, as a side bonus, this has virtually eliminated entry-level dedicated GPUs and made Nvidia/AMD up their game a notch.

    So for me, I could not care less about the iGPU at home, but at work? I don't need a dGPU, simply because Intel has me sorted.

    They give you this GPU with the 7700K likely because it is so tightly integrated into the architecture that it would probably be more of a pain in the ass for them to do a CPU without it at this point. That is besides the fact that it helps fill professional needs, which often involve staring at a bunch of text and shit on multiple monitors, which this iGPU handles just fine.
    Errrr, in a professional workspace they would be using the X99 platform, which doesn't have an iGPU....

    You also realise AMD do APUs aimed at the segment you are referring to, right? Basic office work runs on AMD or Intel just fine, and the GPU on the K series is pointless. And where is your evidence that the GPU is tightly integrated, considering the X99 platform in its entirety doesn't have an iGPU, even for the likes of Broadwell-E, while the desktop Broadwell had an additional level 4 cache for the GPU on the chip?

  12. #1332
    The Lightbringer Evildeffy's Avatar
    15+ Year Old Account
    Join Date
    Jan 2009
    Location
    Nieuwegein, Netherlands
    Posts
    3,772
    Quote Originally Posted by Thorianrage View Post
    Errrr, in a professional workspace they would be using the X99 platform, which doesn't have an iGPU....

    You also realise AMD do APUs aimed at the segment you are referring to, right? Basic office work runs on AMD or Intel just fine, and the GPU on the K series is pointless. And where is your evidence that the GPU is tightly integrated, considering the X99 platform in its entirety doesn't have an iGPU, even for the likes of Broadwell-E, while the desktop Broadwell had an additional level 4 cache for the GPU on the chip?
    The vast majority of office users do not use LGA2011 mobos and chips, but rather LGA115X mobos and chips; that's what he's referring to.
    And yes, AMD APUs serve this function perfectly fine as well.

    In general I agree that iGPUs are pointless on K models, but there are still benefits, and it's easier and cheaper to make a "one size fits all" mould.

  13. #1333
    Deleted
    Quote Originally Posted by Evildeffy View Post
    The vast majority of office users do not use LGA2011 mobos and chips, but rather LGA115X mobos and chips; that's what he's referring to.
    And yes, AMD APUs serve this function perfectly fine as well.

    In general I agree that iGPUs are pointless on K models, but there are still benefits, and it's easier and cheaper to make a "one size fits all" mould.
    Office systems vary massively. I've seen companies use cloud-based systems or really old hardware, but I've yet to see an office use Sandy Bridge or above, because those systems are not cheap and tend to be overkill for the job. Offices go for the cheapest solution; sometimes an old CPU + a cheap dedicated GPU is still the cheaper option.

  14. #1334
    The Lightbringer Evildeffy's Avatar
    15+ Year Old Account
    Join Date
    Jan 2009
    Location
    Nieuwegein, Netherlands
    Posts
    3,772
    Quote Originally Posted by Thorianrage View Post
    Office systems vary massively. I've seen companies use cloud-based systems or really old hardware, but I've yet to see an office use Sandy Bridge or above, because those systems are not cheap and tend to be overkill for the job. Offices go for the cheapest solution; sometimes an old CPU + a cheap dedicated GPU is still the cheaper option.
    I've come across plenty, including full-blown i7-4790Ks and 6700Ks using their iGPUs (and earlier gens, of course)... doing nothing but basic office work in Word, Excel and Outlook.
    Highly infuriating and wasteful (and a little enviable), but it's the company's choice.

    But no, nowadays it's not cheaper to have it separate - only if the hardware they are using is ancient and they want to extend its lifetime.

  15. #1335
    Deleted
    Quote Originally Posted by Evildeffy View Post
    I've come across plenty, including full-blown i7-4790Ks and 6700Ks using their iGPUs (and earlier gens, of course)... doing nothing but basic office work in Word, Excel and Outlook.
    Highly infuriating and wasteful (and a little enviable), but it's the company's choice.

    But no, nowadays it's not cheaper to have it separate - only if the hardware they are using is ancient and they want to extend its lifetime.
    This is what my statement refers to: they vary massively. I've yet to see an office run anything Sandy Bridge or above. And you need to ask yourself: cheaper for whom - the manufacturer to produce, or the office to buy?

    Offices just do not throw away hardware; they save money where they can. If buying a new system with an iGPU is cheaper, so be it; if adding a low-end discrete GPU to their existing systems is the cheaper option, they'll take that. Sometimes offices need additional I/O, or to move to HDMI because they want newer monitors, which makes a bigger difference to office productivity than a new CPU.

    If a working environment needs more powerful hardware and core count, there's the X99 platform, which doesn't have an iGPU. And we have seen that Intel can in fact separate the iGPU from the cores - the Broadwell platform shows this, and I would dare say the GPU on that is more tightly integrated than Kaby Lake's, given that the CPU has a level 4 cache just for the GPU.

  16. #1336
    The Lightbringer Evildeffy's Avatar
    15+ Year Old Account
    Join Date
    Jan 2009
    Location
    Nieuwegein, Netherlands
    Posts
    3,772
    Quote Originally Posted by Thorianrage View Post
    This is what my statement refers to: they vary massively. I've yet to see an office run anything Sandy Bridge or above. And you need to ask yourself: cheaper for whom - the manufacturer to produce, or the office to buy?

    Offices just do not throw away hardware; they save money where they can. If buying a new system with an iGPU is cheaper, so be it; if adding a low-end discrete GPU to their existing systems is the cheaper option, they'll take that. Sometimes offices need additional I/O, or to move to HDMI because they want newer monitors, which makes a bigger difference to office productivity than a new CPU.

    If a working environment needs more powerful hardware and core count, there's the X99 platform, which doesn't have an iGPU. And we have seen that Intel can in fact separate the iGPU from the cores - the Broadwell platform shows this, and I would dare say the GPU on that is more tightly integrated than Kaby Lake's, given that the CPU has a level 4 cache just for the GPU.
    Yeah, but again we come back to the earlier point of the conversation... the majority do not use X99.

    The GPU is tightly integrated into the process because they only produce one die; whether we feel it's useless or not on K models, it's there because of the one-die production line.
    It is useful in the office variants, and whilst it's indeed overkill... it's still done.

  17. #1337
    Deleted
    Quote Originally Posted by Evildeffy View Post
    Yeah, but again we come back to the earlier point of the conversation... the majority do not use X99.

    The GPU is tightly integrated into the process because they only produce one die; whether we feel it's useless or not on K models, it's there because of the one-die production line.
    It is useful in the office variants, and whilst it's indeed overkill... it's still done.
    I have to disagree on the tightly integrated aspect. The cores on Broadwell-E and desktop Broadwell are not different; one has more cores, while the other has fewer cores plus a level 4 cache and an iGPU. Intel has shown they can develop CPUs like this.

    Also, you haven't distinguished between cheaper for the office and cheaper on the production line. Something may be cheaper to produce, but that doesn't mean it's reflected in the retail price. Offices want things as cheap as possible when it comes to non-specialised software; they will find whatever is cheapest for the job, whether it has an iGPU or not.

    Also, I never said the majority used X99; it's those that need the power, really. And Ryzen is pitted against the X99 platform right now, a platform which doesn't have an iGPU, so it's a very close comparison in this sense and the iGPU argument can be thrown out here.

  18. #1338
    The Unstoppable Force Gaidax's Avatar
    10+ Year Old Account
    Join Date
    Sep 2013
    Location
    Israel
    Posts
    20,780
    Quote Originally Posted by Thorianrage View Post
    Errrr, in a professional workspace, they would be using the X99 platform, which doesn't have a IGPU....

    You also realise AMD also do APUs which is aimed at the segment you are referring to right? Basic office work can run on AMD or Intel just fine, the GPU on the K series is pointless, and where is your evidence that the GPU is tightly integrated considering the X99 platform in its entirety doesn't have a IGPU even for the likes of Broadwell - E which the broadwell desktop had additional level 4 cache for the GPU on the chip.
    No, "professional" is a very very broad definition and X99 is certainly not widespread at all.

    Running plain old quad core I7s in high tech development companies is a standard for developers, as it offers more than enough juice for everything they need and if they need anything more than that - then a dedicated server is used for such tasks.

    Non-K I7s are extremely popular for these things as they are both powerful, reliable and budget-friendly, same goes for I5s as well - even more so.

  19. #1339
    The Lightbringer Evildeffy's Avatar
    15+ Year Old Account
    Join Date
    Jan 2009
    Location
    Nieuwegein, Netherlands
    Posts
    3,772
    Quote Originally Posted by Thorianrage View Post
    I have to disagree on the tightly integrated aspect. The cores on Broadwell-E and desktop Broadwell are not different; one has more cores, while the other has fewer cores plus a level 4 cache and an iGPU. Intel has shown they can develop CPUs like this.
    They can, but it's different markets.
    As I said, it's cheaper to have one mould that fits all - in essence a single die with things disabled/cut for the cheaper models - versus a different die for every CPU model.
    For the consumer line the iGPU is tightly integrated and takes up about half the entire die on the LGA115X parts.
    LGA2011 serves an entirely different audience and is quite different in die build-up; the principle is the same, but it has its own development and validation.
    Adding a GPU to a die that large would also be problematic to fit in the LGA2011 socket.

    You're missing something important here though:
    A Haswell-E CPU will have 8 cores present, of which up to 2 are disabled depending on model.
    A Broadwell-E CPU will have 10 cores present, of which up to 4 are disabled depending on model.
    A Skylake/Kaby Lake CPU will have 4 cores present, of which up to 2 are disabled along with HyperThreading (no physical difference) depending on model, but it will ALWAYS contain a GPU for that generation of the consumer line.

    Because the vast majority of offices use cheap CPUs from the Sandy Bridge -> Kaby Lake line - Pentium or i3, sometimes i5 for Excel accounting systems (don't kid yourself here, Excel is used in big enterprises as actual accounting software... it is more capable than most people know) - it is cheaper to build the entire line from one mould and disable what is broken, selling it as a cheaper lower model, rather than wasting time and money developing a separate die without a GPU that has different "firmware" and requires separate validation.

    This is why the current Ryzen 7 and Ryzen 5 are the exact same die, and why Ryzen 5 could be introduced so quickly after Ryzen 7.
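    As a toy illustration of that harvesting logic (the 5% per-core defect rate and the bin thresholds are made-up numbers, purely to show the mechanism):

    Code:
    # Toy defect-harvesting model: every die starts as an 8-core part;
    # broken cores get fused off and the die sells as a lower SKU.
    # The 5% per-core defect chance is an arbitrary assumption.
    import random

    random.seed(1)
    P_CORE_DEFECT = 0.05

    def bin_die():
        good = sum(random.random() > P_CORE_DEFECT for _ in range(8))
        if good == 8: return "8C/16T (sold as Ryzen 7)"
        if good >= 6: return "6C/12T (2 cores fused off)"
        if good >= 4: return "4C/8T (4 cores fused off)"
        return "scrap"

    counts = {}
    for _ in range(100_000):
        sku = bin_die()
        counts[sku] = counts.get(sku, 0) + 1
    for sku, n in sorted(counts.items(), key=lambda kv: -kv[1]):
        print(f"{sku:28s} {100 * n / 100_000:5.1f}%")

    Almost nothing gets thrown away: dies that would be waste in a "perfect dies only" world become the cheaper SKUs instead.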

    uArch development is not easy at this lithography and below, and it costs a lot of money and time to create a new die.
    It really isn't that easy, and it's definitely not correct to say "the cores on Broadwell-E and the consumer line of CPUs are not different".

    What they CAN and CANNOT develop are two separate things, but they won't do it for the consumer line: gamers are a vast minority versus office users being a vast majority, and office users will use the iGPU where gamers will not.
    It's easier and cheaper to make one CPU template that has both and serves both markets than it is to maintain multiple templates.
    It's hardware, not software - one small mistake and it'll be EXTREMELY costly.

    Quote Originally Posted by Thorianrage View Post
    Also, you haven't distinguished between cheaper for the office and cheaper on the production line. Something may be cheaper to produce, but that doesn't mean it's reflected in the retail price. Offices want things as cheap as possible when it comes to non-specialised software; they will find whatever is cheapest for the job, whether it has an iGPU or not.
    Both. In production, definitely, right now for Intel (because they would have to invest in a separate production line); for offices it depends on the situation.
    If the office has a C2D E8400 system they don't want to get rid of but the GPU is dead, it's as simple as replacing the GPU and moving on.
    A new system, however, is cheaper from the get-go without an added graphics card, running on the iGPU.
    Broadwell (not Broadwell-E) has a more powerful GPU than Kaby Lake, but it was too expensive and difficult to produce and its existence was short-lived.

    Again: if Intel were to create a GPU-less consumer CPU model, they would have to re-design it, prototype it, validate it and develop separate "firmware" for it. All of that is avoided with a unit that has everything on board and only requires laser cuts and microcode ID changes, while keeping the same "firmware" throughout.

    This all adds cost, where it's simpler to just have one die and have it tested and tailored into sub-models.

    Quote Originally Posted by Thorianrage View Post
    Also, I never said the majority used X99; it's those that need the power, really. And Ryzen is pitted against the X99 platform right now, a platform which doesn't have an iGPU, so it's a very close comparison in this sense and the iGPU argument can be thrown out here.
    You kind of did with the following statement:
    Quote Originally Posted by Thorianrage
    Errrr, in a professional workspace they would be using the X99 platform, which doesn't have an iGPU....
    Regardless, I feel like you're missing some context here: that discussion started with Dukenukemx's question of why the consumer line of K-model CPUs has an iGPU.
    The answer was given; whether we feel it's irrelevant or not, those are the reasons why.

    Ryzen has no iGPU, and neither do Haswell-E/Broadwell-E/Skylake-X/Kaby Lake-X, because they are fundamentally different designs aimed at completely different markets than office users.

  20. #1340
    Apparently RAM speed can be a big deal with Ryzen down the road due to the interconnects. Think I'm gonna return my 3200 RAM for some 4000 just in case.
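    For what it's worth, my understanding is that Zen's Infinity Fabric between the two CCXes runs at the memory clock, i.e. half the DDR transfer rate, so faster RAM speeds up the interconnect directly. A quick sketch of that relationship (pure arithmetic, no claims about real-world gains):

    Code:
    # Zen's Infinity Fabric clock tracks the memory clock (DDR rate / 2),
    # so faster RAM speeds up the CCX-to-CCX interconnect directly.
    for ddr_rate in (2133, 2666, 3200, 4000):
        fabric_mhz = ddr_rate // 2
        gain = (ddr_rate / 2133 - 1) * 100
        print(f"DDR4-{ddr_rate}: fabric ~{fabric_mhz} MHz "
              f"({gain:+.0f}% vs DDR4-2133)")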
