  1. #21
    Herald of the Titans Saithes's Avatar
    10+ Year Old Account
    Join Date
    Feb 2011
    Location
    Mun
    Posts
    2,719
    Here we go...

    My Intel HD w/ QuickSync, both GTX 460s, and my i7 2600K, all at max load.


  2. #22
    The efficiency of the PSU won't be very high at 90%+ load, though.

  3. #23
    OP, I think you're misinformed about power supplies. You size for max load, so let's calculate:
    CPU: 100-130W; this shouldn't need explaining.
    GPU: 200-300W depending on model (a GTX 460 at default clocks is 160W, per Nvidia's specs).
    RAM: 100-150W. I'm not sure what DDR3 draws, but SDRAM was 8W per 128MB, so 1GB would draw around 64W.
    That's over 500W before hard drives, fans, LEDs, USB devices, PCI cards, and DVD/CD drives.

    Also, when buying a good power supply it's not about the watts, it's about the amps. A GTX 460 can pull up to 13A, and most power supply +12V rails are 24A. If you're doing SLI or RAID or any other high-end config, you can run into problems under load.

    I can't read the graph, but if it's trying to say that an AMD Phenom II X6 1100T and an Nvidia GTX 580 use less than 300W, that's a complete farce. I have an 1100T and a GTX 460, and I easily pull over 300W from the wall with just a web browser open.
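
    A minimal Python sketch of that max-load tally, assuming the figures quoted in this thread; the RAM line uses the ~10W-per-stick number from the posts below, since the 64W-per-GB SDRAM extrapolation doesn't hold for DDR3, and the drives/fans lump sum and 80% load ceiling are assumptions:

        # Rough version of the max-load tally above; per-component numbers are
        # the ones quoted in this thread, except RAM, which uses the
        # ~10W-per-stick figure from later posts instead of 64W per GB.
        components_w = {
            "cpu": 130,              # high end of the 100-130W range
            "gpu": 160,              # GTX 460 at default clocks, per Nvidia specs
            "ram": 20,               # two sticks at ~10W each
            "drives_fans_misc": 60,  # assumed lump sum for drives, fans, LEDs, USB
        }
        total_w = sum(components_w.values())      # 370W worst case
        psu_w = total_w / 0.8                     # keep the PSU under ~80% load
        print(total_w, round(psu_w))              # 370 462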

  4. #24
    I am Murloc! Fuzzykins's Avatar
    10+ Year Old Account
    Join Date
    Feb 2011
    Location
    South Korea
    Posts
    5,222
    RAM wattage doesn't scale linearly as you add more memory.

  5. #25
    Herald of the Titans Saithes's Avatar
    10+ Year Old Account
    Join Date
    Feb 2011
    Location
    Mun
    Posts
    2,719
    For the most part, RAM uses 10-12 watts per stick under load. At least, that's what I saw when I added a second 4GB stick to my first.
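
    A quick arithmetic check of the two RAM estimates in this thread; the two-stick, 4GB-per-stick configuration is just Saithes's example, and the point is how far apart the per-stick and per-GB figures land:

        # Per-stick estimate (this post) vs. the per-GB SDRAM extrapolation (post #23)
        sticks, gb_per_stick = 2, 4
        per_stick_w = 11 * sticks                  # ~10-12W per stick -> ~22W
        per_gb_w = 64 * sticks * gb_per_stick      # 64W/GB would mean 512W (!)
        print(per_stick_w, per_gb_w)               # 22 512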

  6. #26
    Quote Originally Posted by RustyMetal View Post
    Also, when buying a good power supply it's not about the watts, it's about the amps.
    I don't think you realize what you have just written. Think about it for a while.

  7. #27
    I know exactly what I wrote, and I know it reads as a contradiction from an electrical engineering standpoint. From a computer hardware standpoint it's logical: many low-cost power supplies carry a high wattage rating but can't deliver high current on any one rail, or split the amps across several rails. It's called cutting corners, and every OEM does it depending on the model and price point.

    Go to Newegg and look at the $20 500W power supplies. Think you're going to run a gaming card on one of those? Think again. Most of these low-cost models have low amp ratings, such as a 12V rail limited to 15A.

    Back to the electrical engineering: the amp rating can matter more than the watt rating because the label wattage is the total across all rails, while the GPU draws almost entirely from the 12V rail. You can have all the watts in the world and it does you no good if the 12V rail can't deliver the 200W the GPU needs, when it needs it.
    Last edited by RustyMetal; 2011-05-14 at 11:47 PM.

  8. #28
    Deleted
    Expandability. I'm using a 750W at the moment, so I know I have the headroom to keep adding parts to my machine as I need to. The next purchase will probably be a couple of 3TB hard drives to add to the five I already have installed.

    Also, higher-wattage PSUs tend to come from more reputable manufacturers. Having some cheapo 500W PSU fall over and bluescreen my machine because it isn't putting out what it should is just not acceptable.

  9. #29
    Quote Originally Posted by RustyMetal View Post
    I know exactly what I wrote, and I know it reads as a contradiction from an electrical engineering standpoint. From a computer hardware standpoint it's logical: many low-cost power supplies carry a high wattage rating but can't deliver high current on any one rail, or split the amps across several rails. It's called cutting corners, and every OEM does it depending on the model and price point.

    Go to Newegg and look at the $20 500W power supplies. Think you're going to run a gaming card on one of those? Think again. Most of these low-cost models have low amp ratings, such as a 12V rail limited to 15A.

    Back to the electrical engineering: the amp rating can matter more than the watt rating because the label wattage is the total across all rails, while the GPU draws almost entirely from the 12V rail. You can have all the watts in the world and it does you no good if the 12V rail can't deliver the 200W the GPU needs, when it needs it.
    Put that way, it can just confuse people. Just tell them to look at the wattage on the 12V rail. If the PSU is 500W with 200W on the 12V rail, it sucks. If it's 500W with 480W on the 12V rail, then it's fine.
    There are only a few reliable manufacturers on the market, and everyone should stick to them.
    Last edited by haxartus; 2011-05-14 at 11:57 PM.
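
    A minimal sketch of the 12V rail check the last two posts describe, using the numbers quoted in this thread (a 13A GPU peak, a 15A budget rail, and haxartus's 480W "ok" example); watts on a rail are just volts times amps:

        # watts = volts * amps: a "500W" label with a single 15A 12V rail can
        # only deliver 12 * 15 = 180W where a gaming GPU actually draws.
        def rail_watts(volts, amps):
            return volts * amps

        gtx460_peak_w = rail_watts(12, 13)  # ~156W peak, per the 13A figure above

        for label, amps in [("$20 special", 15), ("480W-on-12V unit", 40)]:
            w = rail_watts(12, amps)
            verdict = "ok" if w >= 2 * gtx460_peak_w else "risky"
            print(f"{label}: {w}W on the 12V rail -> {verdict} for GTX 460 SLI")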

  10. #30
    Epic!
    15+ Year Old Account
    Join Date
    Mar 2009
    Location
    Hillsborough, CA
    Posts
    1,745
    Quote Originally Posted by Saithes View Post
    Yeah, Channel Well Technology and Seasonic make all of Corsair's PSUs.
    Almost. The AX1200 is a one-off from Flextronics. And to answer Fuzzykins, Silverstone has been using Enhance a lot lately, which is just fine... The Strider Plus Silver and Gold series are great.

    The vast majority of good PSUs are OEM'd by Seasonic and CWT. Behind that it is a mish-mash of Enhance, Delta, SuperFlower, Enermax...

    And then there is Sirfa/Sirtec, Cougar/HEC, FSP, AcBel, Seventeam, Great Wall, etc...

  11. #31
    Herald of the Titans Saithes's Avatar
    10+ Year Old Account
    Join Date
    Feb 2011
    Location
    Mun
    Posts
    2,719
    Quote Originally Posted by kidsafe View Post
    Almost. The AX1200 is a one-off from Flextronics. And to answer Fuzzykins, Silverstone has been using Enhance a lot lately, which is just fine... The Strider Plus Silver and Gold series are great.

    The vast majority of good PSUs are OEM'd by Seasonic and CWT. Behind that it is a mish-mash of Enhance, Delta, SuperFlower, Enermax...

    And then there is Sirfa/Sirtec, Cougar/HEC, FSP, AcBel, Seventeam, Great Wall, etc...
    Ah, that's true! I forgot about the AX1200, lol.

  12. #32
    Quote Originally Posted by haxartus View Post
    Put that way, it can just confuse people. Just tell them to look at the wattage on the 12V rail. If the PSU is 500W with 200W on the 12V rail, it sucks. If it's 500W with 480W on the 12V rail, then it's fine.
    There are only a few reliable manufacturers on the market, and everyone should stick to them.
    That's exactly what happens; it's basically bait and switch, the switch being much lower performance than expected. You make a great point: some manufacturers are better than others.

    Tom's Hardware just posted a great in-depth review of power supply manufacturing and cost-cutting. One of the takeaways is that a power supply can have a well-known brand name on it yet really be a shoddy part from low-cost manufacturing.

    tomshardware.com/reviews/power-supply-oem-manufacturer,2913.html

  13. #33
    Mechagnome Auralian's Avatar
    15+ Year Old Account
    Join Date
    Aug 2008
    Location
    Chicago,Ill
    Posts
    581
    Back when both my wife and I raided 3-4 nights a week, I would see my power bill go up by about $35 a month. For comparison, my 35,000 BTU, 30-year-old Chrysler AC adds about $41 in the summer. So how much power do gaming rigs draw? Enough to notice when paying the bills.
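
    Working that bill increase backwards as a sanity check; the $0.20/kWh utility rate and the ~800W combined draw for two rigs are assumptions, not from the post:

        # $35/month at an assumed $0.20/kWh implies ~175 kWh of extra usage.
        monthly_increase = 35.00                        # dollars, from the post
        rate_per_kwh = 0.20                             # assumed utility rate
        kwh_used = monthly_increase / rate_per_kwh      # 175.0 kWh
        combined_draw_kw = 0.8                          # assumed: two rigs under load
        hours_per_month = kwh_used / combined_draw_kw   # ~219 hours
        print(round(kwh_used), round(hours_per_month))  # 175 219

    That works out to roughly seven hours a day between two machines, so everyday use outside raid nights (or higher rates and draw) is clearly part of the bill too.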

  14. #34
    Quote Originally Posted by Saithes View Post
    Here we go...

    My Intel HD w/ QuickSync, both GTX 460s, and my i7 2600K, all at max load.

    Here's a test that shows 2x GTX 460 = 433W.
    Not even 2x GTX 580 reaches your numbers (719W).

    http://www.guru3d.com/article/geforc...-sli-review/14

  15. #35
    I am Murloc! Cyanotical's Avatar
    10+ Year Old Account
    Join Date
    Feb 2011
    Location
    Colorado
    Posts
    5,553
    The OP is misinformed, and a lot of posters in this thread would do well to read up on electrical theory.

    If all you're running is a CPU and RAM, like on a VM server, then you don't need a big PSU, but in an average gaming computer the CPU is only a portion of the total power drawn from the PSU.
    Last edited by Cyanotical; 2011-05-15 at 12:54 AM.

  16. #36
    Herald of the Titans Saithes's Avatar
    10+ Year Old Account
    Join Date
    Feb 2011
    Location
    Mun
    Posts
    2,719
    Quote Originally Posted by Doylez View Post
    Here's a test that shows 2x GTX 460 = 433W.
    Not even 2x GTX 580 reaches your numbers (719W).

    http://www.guru3d.com/article/geforc...-sli-review/14
    The math adds up to what I'm getting:

    180W for the CPU at 5GHz
    160W per GTX 460 (320W total)
    50W for the mobo
    40W for the iGPU
    25W for the RAM
    25W for my pump
    30W for all my fans

    That comes out to 670W delivered by the PSU. At 80% efficiency, that's about 838W from the wall (670 / 0.8), so my addition isn't far off. Also keep in mind those tests only put the GPUs on load; add another ~160W for the CPU on load and you'd get around 600W. Their system also isn't overclocked as high, and the test doesn't include my five hard drives, my SSD, or my pump.

    If you're going to scrutinize someone's post, at least be smart enough to run the numbers yourself instead of looking dumb.
    Last edited by Saithes; 2011-05-15 at 01:00 AM.
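
    The same tally as straight Python, with the efficiency step done by dividing the DC load by the efficiency; the component figures are the ones listed above, not measurements:

        # Sum the listed component draws, then convert to wall draw at 80% efficiency.
        build_w = {
            "cpu_5ghz": 180, "gtx460_pair": 320, "mobo": 50,
            "igpu": 40, "ram": 25, "pump": 25, "fans": 30,
        }
        dc_load_w = sum(build_w.values())   # 670W delivered by the PSU
        wall_w = dc_load_w / 0.80           # ~838W pulled from the outlet
        print(dc_load_w, round(wall_w))     # 670 838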

  17. #37
    I'm guessing the onboard GPU is making up the difference. There are also a lot of reviews scrutinizing multi-GPU power versus performance, and the trend seems to be that two GPUs' power consumption doesn't scale with performance: more power is drawn for less performance gain than a single GPU. IMO, multi-GPU power tests are heavily dependent on driver/system/settings; testing two rigs with the exact same components but different software will give different power results.

  18. #38
    I am Murloc! Cyanotical's Avatar
    10+ Year Old Account
    Join Date
    Feb 2011
    Location
    Colorado
    Posts
    5,553
    Quote Originally Posted by RustyMetal View Post
    I'm guessing the onboard GPU is making up the difference. There are also a lot of reviews scrutinizing multi-GPU power versus performance, and the trend seems to be that two GPUs' power consumption doesn't scale with performance: more power is drawn for less performance gain than a single GPU. IMO, multi-GPU power tests are heavily dependent on driver/system/settings; testing two rigs with the exact same components but different software will give different power results.
    Multi-GPU setups are not as efficient in power draw versus performance, but people running multi-GPU don't care. Power versus performance really only matters in server management, not gaming rigs, and even there it's mostly applied to virtualization.
