  1. #361
    To be fair, photoshopping that kind of picture wouldn't exactly be hard... so I'll be a tad wary of screenshots.

  2. #362
    Yeah, that could have been "photoshopped" in MS Paint.

  3. #363
    Deleted
    Apparently it is a possibility, if they use a "clamshell" design. But I am wary of anything at the moment regarding the 3xx series and the Fury.

    Just 1-2 more weeks

  4. #364
    Deleted
    Quote Originally Posted by Remilia View Post
    I'm guessing the Tonga ones, but I don't know, leaked pics have it at 1x8 pin with semi-passive cooling from Gigabyte.

    All over the place, wheeeeeee.
    Which is why I mentioned the 'apparently,' because if that's true, then it's a max 225 W TDP (75 W from the PCIe slot, 150 W from the 8-pin), which is still considerably better than the 290.

  5. #365
    Oh... Please no.

    http://www.hardwareluxx.com/index.ph...tx-980-ti.html

    So... Apparently the Fiji XT is slower than the 980 Ti. Internally still at the $850 price bracket, though apparently they may be forced to change that because of the 980Ti's pricing. It's also pretty much confirmed that it'll only have 4gb of HBM and there is apparently a cut down version of it coming too.

    Not sure what to think now. At least it's still called Fury.

  6. #366
    The Lightbringer Evildeffy's Avatar · 15+ Year Old Account · Join Date: Jan 2009 · Location: Nieuwegein, Netherlands · Posts: 3,772
    Quote Originally Posted by Shinzai View Post
    Oh... Please no.

    http://www.hardwareluxx.com/index.ph...tx-980-ti.html

    So... Apparently the Fiji XT is slower than the 980 Ti. Internally still at the $850 price bracket, though apparently they may be forced to change that because of the 980Ti's pricing. It's also pretty much confirmed that it'll only have 4gb of HBM and there is apparently a cut down version of it coming too.

    Not sure what to think now. At least it's still called Fury.
    If that's true ... then that sucks for competition and prices.

    However, I still recommend we wait and see what we get from reviewers.

    If it's true, however, I know my next upgrade will be a Titan X... if I manage to land a fking job somewhere that doesn't suck.

  7. #367
    Deleted
    Quote Originally Posted by Shinzai View Post
    Oh... Please no.

    http://www.hardwareluxx.com/index.ph...tx-980-ti.html

    So... Apparently the Fiji XT is slower than the 980 Ti. Internally still at the $850 price bracket, though apparently they may be forced to change that because of the 980Ti's pricing. It's also pretty much confirmed that it'll only have 4gb of HBM and there is apparently a cut down version of it coming too.

    Not sure what to think now. At least it's still called Fury.
    From the article:

    Clock speeds of the GPU and memory were sadly not revealed. The cards also can't run in their current form, as no BIOS is present; they can be powered on, but no image will appear on screen.
    AMD internally calls the card Fury X, yet it remains to be seen if it will hit the market under that name, too.
    The partner did hint at the performance: apparently the Radeon Fury X is supposed to be slower than the GeForce GTX 980 Ti.

    So let's not jump to conclusions based on a leak with missing detailed specs and supposedly worse performance at a higher price point. The card wasn't even able to run...

  8. #368
    Old God Vash The Stampede's Avatar · 10+ Year Old Account · Join Date: Sep 2010 · Location: Better part of NJ · Posts: 10,939
    AMD has released info about multi-GPU setups and DX12.

    http://wccftech.com/amd-sheds-more-l...in-new-slides/

  9. #369
    Quote Originally Posted by Dukenukemx View Post
    AMD has released info about multi-GPU setups and DX12.

    http://wccftech.com/amd-sheds-more-l...in-new-slides/
    So they're paper launching a bandage?
    (advertising Crossfire with APU as something new and groundbreaking in DX12 except that it has been possible for years already in DX11)

  10. #370
    Quote Originally Posted by fixx View Post
    So they're paper launching a bandage?
    (advertising Crossfire with APU as something new and groundbreaking in DX12 except that it has been possible for years already in DX11)
    Except it didn't work with an Intel CPU. In DX12 you can pair a dGPU from AMD/Nvidia with the iGPU on an Intel CPU.
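    For anyone curious what that pairing looks like from the application side, here's a minimal sketch of DX12-style adapter enumeration, assuming the Windows 10 SDK with d3d12.lib and dxgi.lib linked. It only lists every hardware adapter (iGPU and dGPU alike) and creates a D3D12 device on each; the actual cross-adapter work sharing is up to the engine.
    Code:
    // Minimal sketch: enumerate every GPU (iGPU + dGPU) and create a D3D12 device on each.
    // Assumes the Windows 10 SDK; link with d3d12.lib and dxgi.lib.
    #include <d3d12.h>
    #include <dxgi1_4.h>
    #include <wrl/client.h>
    #include <cstdio>
    #include <vector>

    using Microsoft::WRL::ComPtr;

    int main()
    {
        ComPtr<IDXGIFactory4> factory;
        if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
            return 1;

        std::vector<ComPtr<ID3D12Device>> devices;
        ComPtr<IDXGIAdapter1> adapter;

        for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
        {
            DXGI_ADAPTER_DESC1 desc;
            adapter->GetDesc1(&desc);
            if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
                continue; // skip the software (WARP) adapter

            // Each hardware adapter (Intel iGPU, AMD/Nvidia dGPU) gets its own device.
            ComPtr<ID3D12Device> device;
            if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                            IID_PPV_ARGS(&device))))
            {
                wprintf(L"Adapter %u: %s (%zu MB dedicated VRAM)\n",
                        i, desc.Description, desc.DedicatedVideoMemory >> 20);
                devices.push_back(device);
            }
        }

        // With separate devices in hand, an engine could, e.g., run post-processing on the
        // iGPU and the main render pass on the dGPU, sharing results via cross-adapter heaps.
        return 0;
    }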

  11. #371
    Old God Vash The Stampede's Avatar · 10+ Year Old Account · Join Date: Sep 2010 · Location: Better part of NJ · Posts: 10,939
    Intel just released the Core i7-5775C and i5-5675C with Iris Pro 6200 graphics, and it's kicking AMD's APUs to the side, though at 3x the cost. Looks like Nvidia and AMD have more competition.

  12. #372
    Quote Originally Posted by Dukenukemx View Post
    Intel just released the Core i7-5775C and i5-5675C with Iris Pro 6200 graphics, and it's kicking AMD's APUs to the side, though at 3x the cost. Looks like Nvidia and AMD have more competition.
    Not really. The advantage it has is 14nm when all desktop AMD APUs are 28nm. AMD is expecting that their 400 series next year is going to double performance per watt; that's how much of a difference the jump is going to make. Intel is not interested in the dGPU market. Also, how big is that GPU compared to an AMD GPU on an APU? What's the performance-per-size ratio? It wouldn't be much if it manages that much more performance but needs double the transistor count.

    There is a lot to consider here.

  13. #373
    Deleted
    Quote Originally Posted by Dukenukemx View Post
    Intel just released the Core i7-5775C and i5-5675C with Iris Pro 6200 graphics, and it's kicking AMD's APUs to the side, though at 3x the cost. Looks like Nvidia and AMD have more competition.
    ...well, shit. That's actually pretty interesting. Now, if Carrizo lives up to the hype behind it...well, things could be good for budget (sub-$500) gaming rigs.

  14. #374
    Quote Originally Posted by Noctifer616 View Post
    Not really. The advantage it has is 14nm when all desktop AMD APUs are 28nm.
    Thanks for pointing out the state of things, because that *is* the state of things, and anyone in the market for an APU right now has those options. They will compete on this next year (maybe)? Sounds like nearly as good a window as any hardware has ever had, so I am not sure why you are trying to downplay it.

  15. #375
    Quote Originally Posted by Afrospinach View Post
    Thanks for pointing out the state of things, because that *is* the state of things, and anyone in the market for an APU right now has those options. They will compete on this next year (maybe)? Sounds like nearly as good a window as any hardware has ever had, so I am not sure why you are trying to downplay it.
    It's not an impressive GPU, that's all I am saying.

  16. #376
    Quote Originally Posted by Noctifer616 View Post
    It's not an impressive GPU, that's all I am saying.
    Intel's not interested in selling discrete graphics cards. They are aiming for office computers, laptops, and mobile markets with an integrated CPU+GPU package, and in that segment it's pretty impressive.

  17. #377
    Quote Originally Posted by fixx View Post
    Intel's not interested in selling discrete graphics cards. They are aiming for office computers, laptops, and mobile markets with an integrated CPU+GPU package, and in that segment it's pretty impressive.
    Not really. You want an office computer for regular stuff? Almost anything will do. You want something for apps like Maya, Photoshop, etc.? You get a workstation-grade GPU or a dGPU with some serious compute performance.

    They might be really good for gaming if devs actually exploit the multi-adapter functionality, but we'll have to see how that pans out.

  18. #378
    Old God Vash The Stampede's Avatar · 10+ Year Old Account · Join Date: Sep 2010 · Location: Better part of NJ · Posts: 10,939
    Quote Originally Posted by Noctifer616 View Post
    Not really. The advantage it has is 14nm when all desktop AMD APUs are 28nm. AMD is expecting that their 400 series next year is going to double performance per watt; that's how much of a difference the jump is going to make.
    Iris Pro graphics have been around for some time, and Intel just joined the desktop market with it. The only real difference between Iris Pro and HD 4600 is eDRAM, something AMD has been doing in consoles since the Xbox 360, yet it's absent on desktops and laptops. With AMD's HBM memory, I don't see a reason for them not to do the same for APUs. But by doing so, AMD and Intel would destroy the sub-$200 graphics card market. Especially with DX12 allowing multiple graphics cards from different vendors to work together, this could be a game changer.

    Quote Originally Posted by Noctifer616 View Post
    Intel is not interested in the dGPU market.
    Intel was working on the Larrabee project for a while, so they had some interest in graphics. Iris Pro graphics was actually forced onto them by Apple, because they were the ones who wanted it.
    Quote Originally Posted by Noctifer616 View Post
    Also, how big is that GPU compared to an AMD GPU on an APU? What's the performance-per-size ratio? It wouldn't be much if it manages that much more performance but needs double the transistor count.
    Like I said, the big difference was the 128 MB of eDRAM; otherwise it's the same GPU as the HD 4600. They literally just put the eDRAM as a separate chip next to the CPU. I'm sure these new chips from Intel have something better than the HD 4600, but the old Iris Pros were just HD 4600s with eDRAM. AMD has spoken of doing something similar to eDRAM, but with HBM I would figure they'd forget that idea and go straight for HBM memory.

    It's nearly 100% faster than AMD's 7850K in some benchmarks. AMD Carrizo is too far away for the desktop market, so this is going to hurt AMD again. But the price of these chips is over $100 more than AMD's 7850K. Nothing for AMD to worry about now, but if Intel made it standard to offer Iris Pro, then AMD and Nvidia would lose market share. With DX12 there's no reason a gamer wouldn't want one of these.

    Quote Originally Posted by Noctifer616 View Post
    It's not an impressive GPU, that's all I am saying.
    Not for the price, but it shows Intel can do graphics. And when DX12 and Vulkan get released, this may be the "go-to" chip for those who want maximum FPS.
    Last edited by Vash The Stampede; 2015-06-02 at 06:17 PM.

  19. #379
    The Lightbringer · 10+ Year Old Account · Join Date: Oct 2010 · Location: Europe · Posts: 3,745
    Yeah, checking the prices of the 980 Tis. Ehem.
    Can we stop having Nvidia pushing the prices up every fucking year? Sure, there's inflation in the world, but these price increases? Jeez. What's next?
    AMD better take care of this shit and push the prices down.

  20. #380
    Quote Originally Posted by Kezotar View Post
    Yeah, checking the prices of the 980 Tis. Ehem.
    Can we stop having Nvidia pushing the prices up every fucking year? Sure, there's inflation in the world, but these price increases? Jeez. What's next?
    AMD better take care of this shit and push the prices down.
    When the GTX 590 released, it cost less than $600. Just saying.
