  1. #1261
    Deleted
    One thing stands out: a lot of the reviews screaming about the RX 480 going way over its power budget are also running games at 4K.
    The 480 wasn't made for 4K, was it?

    Anyhow, the Reddit thread about the power issue was a good read.

  2. #1262
    Fluffy Kitten Remilia's Avatar
    10+ Year Old Account
    Join Date
    Apr 2011
    Location
    Avatar: Momoco
    Posts
    15,160
    Quote Originally Posted by coprax View Post
    Only 6GB?...but why?
    I'm guessing a cut-down GP104 at 1280 CC, i.e. half of GP104 with a GPC removed. I imagine Nvidia doesn't want another 970 memory disaster, so they go with a straight 6GB instead of a 6+2GB kind of thing. Cutting a GPC (SM cluster) also removes 16 ROPs, and since Nvidia's ROPs are tied directly to the memory bus, that means 64 bits of the bus go with it, which in turn means 192-bit instead of 256-bit.

  3. #1263
    Old God Vash The Stampede's Avatar
    10+ Year Old Account
    Join Date
    Sep 2010
    Location
    Better part of NJ
    Posts
    10,939
    Quote Originally Posted by Svinoi Banana View Post
    One thing stands out: a lot of the reviews screaming about the RX 480 going way over its power budget are also running games at 4K.
    The 480 wasn't made for 4K, was it?

    Anyhow, the Reddit thread about the power issue was a good read.
    Shouldn't matter if you run the game at 4K. Yes, it'll draw more power, but the question is whether it will break something, and the answer is no. If you overclock your graphics card, you're already pulling more power than the 75W slot spec allows. And really, it's over that spec by 2W-5W, which is within the margin of error.

    Quote Originally Posted by Remilia View Post
    I'm guessing a cut-down GP104 at 1280 CC, i.e. half of GP104 with a GPC removed. I imagine Nvidia doesn't want another 970 memory disaster, so they go with a straight 6GB instead of a 6+2GB kind of thing. Cutting a GPC (SM cluster) also removes 16 ROPs, and since Nvidia's ROPs are tied directly to the memory bus, that means 64 bits of the bus go with it, which in turn means 192-bit instead of 256-bit.
    It has to do with the number of memory chips. The RX 480 has eight 32-bit memory chips, which combine into a 256-bit bus, and if each of those eight chips is 1GB, that adds up to 8GB. The rumor says the 1060 is 192-bit, which makes sense: the 1060 would have six 32-bit chips, which adds up to 192-bit. The 6GB cards would use 1GB chips, while the 3GB ones would use 512MB chips.
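    Quick back-of-the-envelope sketch of that chip math (the 1060 figures here are just the rumored configuration, nothing confirmed):

[CODE]
# Rough GDDR5 configuration arithmetic (the 1060 numbers are rumored, not confirmed).
def gddr5_config(num_chips, chip_capacity_gb, chip_bus_bits=32):
    """Total bus width (bits) and capacity (GB) from per-chip figures."""
    return num_chips * chip_bus_bits, num_chips * chip_capacity_gb

print(gddr5_config(8, 1.0))   # RX 480:        (256, 8.0) -> 256-bit, 8 GB
print(gddr5_config(6, 1.0))   # 1060 6GB (?):  (192, 6.0) -> 192-bit, 6 GB
print(gddr5_config(6, 0.5))   # 1060 3GB (?):  (192, 3.0) -> 192-bit, 3 GB
[/CODE]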

    Not sure how many people would want to buy a 3GB 1060 for $250 for roughly 15% more performance. At GTX 980 performance levels you really need at least 4GB of memory.


  4. #1264
    Quote Originally Posted by Dukenukemx View Post
    That's why I want to see some testing done. Some say the RX 480 pulls 2W more over the PCIe slot than it should, which is probably within spec. Some say it pulls way more. It's a concern, but most likely people are experiencing issues because their machines weren't in great condition to begin with and failed once the new card agitated an already weak system. And what kind of system is getting a $200 graphics card? Probably an old piece of crap.

    There needs to be more testing, and with other cards as well. For all I know, it's Nvidia's way of hurting sales until the 1060 is released.

    - - - Updated - - -

    AdoredTV had a take on this subject, and it doesn't look like the RX 480 is alone. He also mentioned how useless the 8GB of memory is on this card. Compared to a lot of other cards, the RX 480 is doing pretty well on PCIe power draw.

    I wonder if the tests were done on cheaply made motherboards, decent mid-range ones (gaming or non-gaming), or expensive high-end boards.
    I'm a Kitsune! Not a cat, or a mutt!

  5. #1265
    Fluffy Kitten Remilia's Avatar
    10+ Year Old Account
    Join Date
    Apr 2011
    Location
    Avatar: Momoco
    Posts
    15,160
    Quote Originally Posted by Dukenukemx View Post
    It has to do with the number of memory chips. The RX 480 has eight 32-bit memory chips, which combine into a 256-bit bus, and if each of those eight chips is 1GB, that adds up to 8GB. The rumor says the 1060 is 192-bit, which makes sense: the 1060 would have six 32-bit chips, which adds up to 192-bit. The 6GB cards would use 1GB chips, while the 3GB ones would use 512MB chips.
    Not sure how many people would want to buy a 3GB 1060 for $250 for roughly 15% more performance. At GTX 980 performance levels you really need at least 4GB of memory.
    [IMG]http://cdn.wccftech.com/wp-content/uploads/2016/06/AMD-Radeon-RX-480-PCB-Polaris-10-10.jpg[/IMG]
    I wasn't talking about the bus width per chip; I'm well aware of that. It's the reasoning behind 192-bit. Nvidia's bus width is directly correlated with ROPs, which is why the 224/256-bit memory bus and 56/64 ROP split was a thing with the 970. If GP106 is 3 GPCs, that would fit 192-bit, but it would also mean a full-spec GP106 is 1920 CC at 48 ROPs, which doesn't fit the rumored 1280 CC. So my reasoning for it being 6GB/3GB at 192-bit is that it's more than likely a very heavily harvested GP104.
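    Laying that speculation out as arithmetic (purely a sketch, using the per-GPC and per-memory-partition figures assumed in these posts, none of which are confirmed specs):

[CODE]
# Speculative Pascal arithmetic, following the assumptions in the posts above:
# 5 SMs per GPC, 128 CUDA cores per SM, 16 ROPs per 64-bit memory partition.
CC_PER_SM, SM_PER_GPC, ROPS_PER_64BIT = 128, 5, 16

def config(gpcs, bus_bits):
    cores = gpcs * SM_PER_GPC * CC_PER_SM
    rops = (bus_bits // 64) * ROPS_PER_64BIT
    return cores, rops, bus_bits

print(config(4, 256))  # full GP104:              (2560, 64, 256)
print(config(3, 192))  # hypothetical 3-GPC chip: (1920, 48, 192) -- doesn't match 1280 CC
print(config(2, 192))  # heavily cut GP104 guess: (1280, 48, 192) -- fits the 1060 rumor
[/CODE]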

    - - - Updated - - -

    And to put the whole power draw thing into perspective... did anyone ever think it was strange that an R9 295X2 was drawing ~600W on 2x 8-pin? 2x 8-pin = 300W, plus 75W from the PCI-E slot, yet no one freaks out about that, whereas everyone freaks the fuck out over 10W above the 150W spec...
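    For perspective, the raw percentages (using only the board figures quoted above, not measurements of my own):

[CODE]
# Over-spec comparison using the figures quoted in this thread.
def pct_over_spec(draw_w, spec_w):
    return 100.0 * (draw_w - spec_w) / spec_w

# R9 295X2: 2x 8-pin (150 W each) + 75 W slot = 375 W rated, ~600 W reported draw.
print(pct_over_spec(600, 375))  # ~60% over its connector/slot rating
# RX 480: 6-pin (75 W) + 75 W slot = 150 W rated, ~160 W reported worst case.
print(pct_over_spec(160, 150))  # ~6.7% over
[/CODE]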

  6. #1266
    Old God Vash The Stampede's Avatar
    10+ Year Old Account
    Join Date
    Sep 2010
    Location
    Better part of NJ
    Posts
    10,939
    Quote Originally Posted by Shadow Fox View Post
    I wonder if the tests were done on cheaply made motherboards, decent mid-range ones (gaming or non-gaming), or expensive high-end boards.
    To give you an idea, this Genius YouTuber did a test with the RX 480 on an AM2 Foxconn motherboard. Foxconn! AM2! Two things that shouldn't be in your gaming PC.



    And this one, well, his motherboard has yellow gunk on it. Looks like something was spilled? The expansion slots in the back of the case were never covered, so stuff could have gotten inside, including a spilled drink splashing into the machine. Wouldn't be surprised if it's Mountain Dew.

    Quote Originally Posted by Remilia View Post
    And to put the whole power draw thing into perspective... did anyone ever think it was strange that an R9 295X2 was drawing ~600W on 2x 8-pin? 2x 8-pin = 300W, plus 75W from the PCI-E slot, yet no one freaks out about that, whereas everyone freaks the fuck out over 10W above the 150W spec...
    Not even 10W; it's more like 3W. This is what everyone is freaking out over, and it's about 10W at 4K. Even JayzTwoCents is ignoring this; it amazes me that he's not involved in this mess.



    But most games look like this.


  7. #1267

  8. #1268
    I came across this reddit topic

    https://www.reddit.com/r/nvidia/comm...hat_are_1060s/

    Is this a common thing in GPU manufacturing? Sounds like a shitty way to standardize circuitry.

  9. #1269
    Hopefully more games will use DX12, since this new 480 outperforms even Titans in DX12, which is amazing. Nvidia dropped the ball by not having proper async compute hardware and using a shitty software version instead.

  10. #1270
    Where is my chicken! moremana's Avatar
    15+ Year Old Account
    Join Date
    Dec 2008
    Location
    Florida
    Posts
    3,618
    Gamers on a budget do this all the time, because they have to. Most games are GPU bound, so when an affordable beast comes out they buy it expecting better performance until they can upgrade their PCs; not everyone can afford a new build every 2-3 years.

    AMD should have made sure this wouldn't be an issue; these cards should have been tested on motherboards that are, say, 5-7 years old. There's a shit ton of people still rocking Core 2s and Phenoms, which is where most of these cards are going to land.

    I'm waiting for the AIBs to release their own versions before I replace my wife's GTX 950. I will say no one should ever buy reference cards; it makes no sense, as performance goes up and cost goes down after a month or so.

  11. #1271
    Quote Originally Posted by Kuntantee View Post
    Is this a common thing in GPU manufacturing?
    It's been standard practice for a long, long time.

    Not doing it is simply not an option. Putting it simply, the larger your chip, the more likely it is to contain a manufacturing defect, roughly exponentially with area, and with chips being fairly large nowadays, too many of them would be thrown away if we only ever used fully working versions of the chip.
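    A common way to picture that is a simple Poisson-style yield model, where the chance a die is defect-free falls off exponentially with its area (the defect density below is an arbitrary number purely for illustration):

[CODE]
import math

# Toy Poisson yield model: P(die is defect-free) = exp(-D * A)
# D = defects per mm^2 (made-up illustrative value), A = die area in mm^2.
D = 0.002

for area_mm2 in (100, 200, 300, 400, 600):
    yield_frac = math.exp(-D * area_mm2)
    print(f"{area_mm2:>3} mm^2 die -> ~{yield_frac:.0%} of dies fully working")

# Bigger dies lose working-chip fraction quickly, which is why partially
# defective dies get sold as cut-down parts instead of being scrapped.
[/CODE]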

  12. #1272
    Where is my chicken! moremana's Avatar
    15+ Year Old Account
    Join Date
    Dec 2008
    Location
    Florida
    Posts
    3,618
    Quote Originally Posted by Kuntantee View Post
    I came across this reddit topic

    https://www.reddit.com/r/nvidia/comm...hat_are_1060s/

    Is this a common thing in GPU manufacturing? Sounds like a shitty way to standardize circuitry.
    Standard practice for years.

  13. #1273
    Quote Originally Posted by Dorfadin View Post
    Hopefully more games will use DX12
    Well, the Microsoft first-party exclusives will use DX12, so we just need them not to screw those up with UWP and the like.

  14. #1274
    Link to a detailed power draw test article:

    http://www.pcper.com/reviews/Graphic...-Radeon-RX-480

    I would avoid overclocking until drivers with a fix are out.

  15. #1275
    Old God Vash The Stampede's Avatar
    10+ Year Old Account
    Join Date
    Sep 2010
    Location
    Better part of NJ
    Posts
    10,939
    Quote Originally Posted by Kuntantee View Post
    I came across this reddit topic

    https://www.reddit.com/r/nvidia/comm...hat_are_1060s/

    Is this a common thing in GPU manufacturing? Sounds like a shitty way to standardize circuitry.
    How do you think people like me turned Radeon 9500s into 9700 Pros? Also, the GTX 970 is a failed 980. But because people like me would take cheap graphics cards and turn them into $500+ cards with a simple software tweak, they now laser-cut the disabled sections off. Cards like the 970 are in much higher demand than the 980, and a lot of the time both AMD and Nvidia will take perfectly capable high-end chips and sell them as lower-end cards purely because of demand.

    Makes you feel good about capitalism.

  16. #1276
    The Unstoppable Force Gaidax's Avatar
    10+ Year Old Account
    Join Date
    Sep 2013
    Location
    Israel
    Posts
    20,863
    Quote Originally Posted by Dukenukemx View Post
    How do you think people like me turned Radeon 9500s into 9700 Pros? Also, the GTX 970 is a failed 980. But because people like me would take cheap graphics cards and turn them into $500+ cards with a simple software tweak, they now laser-cut the disabled sections off. Cards like the 970 are in much higher demand than the 980, and a lot of the time both AMD and Nvidia will take perfectly capable high-end chips and sell them as lower-end cards purely because of demand.

    Makes you feel good about capitalism.
    a. You would not have these things at all if not for capitalism.
    b. Dropping one cluster does not turn a high-end card into a low-end card.
    c. Without this practice, you would pay twice the price for a full chip, simply because anything less than perfect would be thrown out, making the whole production run much less profitable.


    Basically, thanks to "capitalism" people got the wonderful 970, even if some of those 970s could potentially have been 980s.
    Last edited by Gaidax; 2016-07-02 at 02:59 PM.

  17. #1277
    Deleted
    Hmm, make of it what you will.
    I understand he used a budget board (why would you put an ROG board in a mining rig?), but still.

    https://bitcointalk.org/index.php?to...55#msg15438155

    And

    https://community.amd.com/thread/202410
    Last edited by mmoc2d1a1cc4ba; 2016-07-02 at 03:10 PM.

  18. #1278
    @Dukenukemx, I don't really follow your logic. Why would a company artificially lower its yield? If they want to sell cheaper, less capable chips, why not have a separate, cheaper production line for them?

  19. #1279
    The Unstoppable Force Gaidax's Avatar
    10+ Year Old Account
    Join Date
    Sep 2013
    Location
    Israel
    Posts
    20,863
    Quote Originally Posted by Kondik View Post
    Hmm, make of it what you will.
    I understand he used a budget board (why would you put an ROG board in a mining rig?), but still.

    https://bitcointalk.org/index.php?to...55#msg15438155

    And

    https://community.amd.com/thread/202410
    OK, that's more concerning, yeah.

    One thing I wonder is how AMD didn't figure this out... I understand (sort of) the reasoning for using a 6-pin on the reference RX 480, but then they should have checked 500 times that it stays within the frikkin' power limit. How did they let such shit pass?

    It's an utter failure on their part. Sure, they'll remedy it with drivers, but seriously, shipping this out of the gate is simply weak.

  20. #1280
    Quote Originally Posted by dadev View Post
    @Dukenukemx, I don't really follow your logic. Why would a company artificially lower its yield? If they want to sell cheaper, less capable chips, why not have a separate, cheaper production line for them?
    The silicon wafer costs the same regardless of which chips you make from it. Manufacturers will try to use the partially defective dies (there are always partially and fully defective dies on a wafer) to minimize wastage.
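    A rough illustration with made-up numbers: the wafer price is a fixed cost, so salvaging partly defective dies as cheaper SKUs spreads that cost over more sellable chips.

[CODE]
# Made-up numbers, purely to show why salvaging partial dies lowers cost per chip.
wafer_cost = 6000.0   # fixed cost per wafer (illustrative)
fully_good = 120      # dies with no defects
salvageable = 50      # dies with a defect confined to a block that can be disabled

cost_full_only = wafer_cost / fully_good
cost_with_binning = wafer_cost / (fully_good + salvageable)
print(f"cost per sellable die, full chips only:    ${cost_full_only:.2f}")
print(f"cost per sellable die, with cut-down SKUs: ${cost_with_binning:.2f}")
[/CODE]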
