  1. #41
    OP: go for the graphics card. Your computer is good otherwise.

    Everyone else: start your own thread or quiet down already.

  2. #42
    Quote Originally Posted by Artorius View Post
    Nvidia can very well brute force through it even if they don't have proper support.
    There is no way to brute-force serial scheduling hard enough to outperform asynchronous compute on a graphics card. AMD might trail Nvidia a lot of the time in raw power, but not by enough that Nvidia would beat it in DX12 when comparing two comparable cards.

  3. #43
    The Lightbringer Artorius | Join Date: Dec 2012 | Location: Natal, Brazil | Posts: 3,781
    Quote Originally Posted by Gorgodeus View Post
    There is no way to brute-force serial scheduling hard enough to outperform asynchronous compute on a graphics card. AMD might trail Nvidia a lot of the time in raw power, but not by enough that Nvidia would beat it in DX12 when comparing two comparable cards.
    That was exactly my point: if they could brute-force through it and get comparable performance, it would be an okay design choice. But they can't; AMD is not sleeping.
    Judging by previous architectures, GCN4 will most likely be on par with or stronger than Pascal, and then the async compute advantage makes things even worse for Nvidia. (A rough API-level sketch of what async compute means is below.)
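    For anyone wondering what "async compute" actually refers to at the API level, here is a minimal, illustrative sketch. It assumes an already-created D3D12 device; the function name is made up and error handling is omitted. DX12 lets an engine submit work to a separate compute queue alongside the graphics queue, which hardware with independent schedulers (like GCN's ACEs) can overlap, while hardware that serializes the queues gains nothing.
    Code:
    #include <d3d12.h>
    #include <wrl/client.h>
    using Microsoft::WRL::ComPtr;

    // Sketch only: `device` is assumed to be an already-created ID3D12Device;
    // HRESULT checks are omitted and names are illustrative, not engine code.
    void CreateQueues(ID3D12Device* device,
                      ComPtr<ID3D12CommandQueue>& graphicsQueue,
                      ComPtr<ID3D12CommandQueue>& computeQueue)
    {
        D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
        gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;   // graphics + compute + copy
        device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&graphicsQueue));

        D3D12_COMMAND_QUEUE_DESC compDesc = {};
        compDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE; // compute + copy only
        device->CreateCommandQueue(&compDesc, IID_PPV_ARGS(&computeQueue));

        // Command lists submitted to computeQueue can run concurrently with
        // graphics work on hardware that schedules the queues independently;
        // hardware that serializes the queues still runs correctly, just
        // without the overlap.
    }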

  4. #44
    Quote Originally Posted by Butthurt Beluga View Post
    ... this thread is waaay off topic, but there's really no reason to upgrade your whole computer right now.
    Intel are a bunch of lazy hacks and haven't had a meaningful CPU upgrade over the previous generation since Sandy Bridge.
    Five years later and we're still stuck on quad-core chips where a third of the die is a fuckin' iGPU.

    I would wait not only for GTX 1070/1080 real-world benchmarks but also for Polaris 10/11, even though they're aimed at the 'mainstream' market (so will probably cost no more than $300 USD).
    Not really Intel's fault. There just isn't any need at the moment for anything beyond quad cores for the vast majority of PC users. So why waste resources developing something very few people need or are willing to pay for?

  5. #45
    Quote Originally Posted by Butthurt Beluga View Post
    So you're saying that it's not Intel's fault their products have stagnated in terms of performance for half a decade?
    Yeah, it sort of is - and if they would actually release a compelling product then maybe I'd buy one.

    I understand why Intel has pretty much dumped the R&D budget for their desktop CPUs: they have no competitors.
    Their only competition is AMD, a company roughly 100x smaller than they are - and really only half of AMD at that, because it's split between the CPU division and RTG (Radeon Technologies Group), the GPU division.
    Intel doesn't care about the desktop market; they're trying as hard as they can to break into the mobile market with low-power/small-die chips, and failing miserably at it.
    Yet you keep ignoring the simple fact that 99.9% of PC users have no need for a supercomputer on their desk. Intel has been routinely releasing quad cores that are 10-20% improvements over their predecessors. Why would any business in their right mind develop and try to market something there's no market for?

    Intel has made very powerful chips. Hell, in 2010 they showed off a 32-core chip with 4 threads per core (the Larrabee-derived design that went on to become Xeon Phi).

    Fact of the matter is, the vast majority of PC users out there do not need anything more powerful than what is available.

  6. #46
    The Lightbringer Artorius | Join Date: Dec 2012 | Location: Natal, Brazil | Posts: 3,781
    Quote Originally Posted by Gorgodeus View Post
    Not really Intel's fault. There just isn't any need at the moment for anything beyond quad cores for the vast majority of PC users. So why waste resources developing something very few people need or are willing to pay for?
    Intel can make bigger quad-cores. They can make a quad-core that performs way better than their current offerings, but that isn't as profitable as cutting more chips per wafer and selling them at the same high price. If they had to push performance they would, but they don't need to. (A quick dies-per-wafer estimate is sketched at the end of this post.)

    More than 50% of Skylake's die goes to the GPU; they're really not taking the consumer CPU game seriously anymore because they have no competition from AMD.
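    To put rough numbers on the "more chips per wafer" point, here is a small sketch using the common back-of-the-envelope dies-per-wafer approximation. The die areas are illustrative guesses, not official figures, and yield is ignored.
    Code:
    #include <cmath>
    #include <cstdio>

    // Common approximation for candidate dies on a round wafer:
    // floor(pi*r^2 / A - pi*d / sqrt(2*A)), with d = wafer diameter, A = die area.
    int DiesPerWafer(double dieAreaMm2, double waferDiameterMm = 300.0)
    {
        const double kPi = 3.14159265358979323846;
        const double r = waferDiameterMm / 2.0;
        return static_cast<int>(kPi * r * r / dieAreaMm2
                              - kPi * waferDiameterMm / std::sqrt(2.0 * dieAreaMm2));
    }

    int main()
    {
        // Illustrative die areas only: a small mainstream quad-core with iGPU
        // versus a hypothetical "bigger quad-core" with wider cores / more cache.
        std::printf("~122 mm^2 die: about %d dies per 300 mm wafer\n", DiesPerWafer(122.0));
        std::printf("~250 mm^2 die: about %d dies per 300 mm wafer\n", DiesPerWafer(250.0));
        return 0;
    }
    Roughly doubling the die area roughly halves the candidate dies per wafer (and usually hurts yield on top), which is why shrinking the CPU portion and selling at the same price is the more profitable move.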

  7. #47
    The Patient Kufell | Join Date: Jan 2010 | Location: United Kingdom | Posts: 239
    Your computer is pretty much identical to mine, the differences being that you have more memory and I had the HD 7950 rather than the 7970. A year ago I upgraded to the GTX 980 and haven't had any performance issues; the 3770K is getting on in age but is still a decent contender. So you could probably get away with extending your computer's lifespan with a GPU upgrade.

    You will likely find, though, that once you upgrade your graphics card, that's as far as you can go upgrade-wise without replacing the motherboard - that's where I'm at currently, so I personally plan on getting a new motherboard and an i7 6700K. So it probably boils down to that for you: upgrade the GPU for the time being and replace the motherboard/CPU/RAM later, or just fork out the cash for a complete new rig.
    Last edited by Kufell; 2016-05-11 at 03:12 AM.

  8. #48
    Quote Originally Posted by Artorius View Post
    Intel can make bigger quad-cores. They can make a quad-core that performs way better than their current offerings, but that isn't as profitable as cutting more chips per wafer and selling them at the same high price. If they had to push performance they would, but they don't need to.

    More than 50% of Skylake's die goes to the GPU; they're really not taking the consumer CPU game seriously anymore because they have no competition from AMD.
    I think it has more to do with the declining number of PC users and the fact that PCs last longer now than before, because the applications used on them no longer require the same level of upgrades from year to year. Fewer PCs were sold in 2015 than in 2008. This is why you see both AMD and Intel transitioning away from PCs and toward the mobile and server segments.

  9. #49
    The Lightbringer Artorius | Join Date: Dec 2012 | Location: Natal, Brazil | Posts: 3,781
    Quote Originally Posted by Gorgodeus View Post
    I think it has more to do with the declining number of PC users and the fact that PCs last longer now than before, because the applications used on them no longer require the same level of upgrades from year to year. Fewer PCs were sold in 2015 than in 2008. This is why you see both AMD and Intel transitioning away from PCs and toward the mobile and server segments.
    Things can always get faster. Sure, it gets to a point where the same % difference won't impact your real-world performance much anymore, but it still matters whenever your CPU actually has to work (compressing/decompressing things, running some crazy JS, and so on). But it really doesn't matter to Intel: they can sell CPUs with a 5% improvement over the previous gen because nobody is challenging them or their market share. (Rough numbers on how those per-gen gains compound are sketched at the end of this post.)

    They keep making the die smaller and smaller while selling it for the same price or more, so Intel basically increases their profit each gen. Could they make a 165W chip the size of the 5960X with 4 bigger cores that perform like true champs? Yes, but with low yields.

    But what you said is also true: the mobile market is far more profitable than the desktop one nowadays. People are buying more laptops than desktops, that's undeniable, and performance/watt has never been more important.
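    For a sense of how those per-generation gains stack up, here is a tiny sketch. The percentages are illustrative round numbers, not measured IPC figures.
    Code:
    #include <cmath>
    #include <cstdio>

    int main()
    {
        // Compound a fixed per-generation uplift over several generations,
        // e.g. five generational steps since Sandy Bridge (illustrative).
        const int generations = 5;
        const double rates[] = {0.05, 0.10, 0.20};   // 5%, 10%, 20% per generation
        for (double perGen : rates)
        {
            const double cumulative = std::pow(1.0 + perGen, generations) - 1.0;
            std::printf("%2.0f%% per gen over %d gens -> about %.0f%% cumulative\n",
                        perGen * 100.0, generations, cumulative * 100.0);
        }
        return 0;
    }
    A 5% bump per gen compounds to only about 28% over five generations, while 10-20% per gen would have compounded to roughly 61-149%; that gap is what people mean when they say desktop performance has stagnated.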

  10. #50
    Quote Originally Posted by Lemons View Post
    Alright...I guess I will just get the card. I don't really care for AMD cards though...I'm more of an Nvidia guy, should I still wait in that case?
    "im more of an NVDIA guy, dont like AMD".... has an AMD gpu currently
