  1. #81
    Quote Originally Posted by Skelly View Post
    Half right. Of course the number is an average. But saying "it's the average power draw that would occur in normal usage" makes it sound like, if my CPU has a TDP of 100W, I should be drawing an average of 100W right now, which is obviously false.

    What people mean by "maximum power draw" is the average power the component uses under maximum stress from "real world applications" (Furmark, Prime95, etc.).

    http://www.xbitlabs.com/articles/gra...n_2.html#sect0
    The site above even goes so far as to label TDP as "peak power consumption", because most of the time they're the same damn thing.

    http://www.anandtech.com/show/6774/n...nce-unveiled/2
    Above is an AnandTech article on the Nvidia Titan. Now, the Titan could theoretically pull 300W from the PSU because it uses 8-pin + 6-pin connectors (150W + 75W, plus 75W from the PCIe slot, = 300W). The card itself has a specified "250W TDP". This means that at full load in something like Furmark, it will pull an average of 250W from the PSU. The Titan also has a "TDP limit" of 265W. This limit has nothing to do with how much heat the cooler can dissipate; it's the firmware limiting how much power the card can draw from the PSU.
    Um, in the BIOS it's described as a board power limit or target, not TDP. People using the term TDP wrong doesn't make it so.

    Here, copied from an Intel datasheet (starting from page 9):

    "TDP is the recommended design point for the thermal solution's power dissipation and is based on running worst-case, real-world applications and benchmarks at maximum component temperature. The thermal solution must ensure that the processor junction temperature limit is never exceeded under TDP conditions."

    It clearly says that the cooling must be adequate under such loads; it doesn't say anything about peak consumption.
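
    To put a number on "the thermal solution must ensure the junction limit is never exceeded", here is a back-of-envelope Python sketch of sizing a cooler from TDP. The temperature figures are assumed round numbers for illustration, not Intel's spec values:

        # Sizing a cooler from TDP (illustrative, assumed numbers).
        # TDP is the heat the thermal solution must move under worst-case
        # real-world load; it is not a statement about peak power draw.

        tdp_watts = 100.0      # the hypothetical 100W TDP CPU from this thread
        t_junction_max = 95.0  # assumed junction temperature limit (deg C)
        t_ambient = 35.0       # assumed worst-case ambient (deg C)

        # Maximum allowed junction-to-ambient thermal resistance (deg C per W):
        theta_ja = (t_junction_max - t_ambient) / tdp_watts
        print(f"Cooler must achieve <= {theta_ja:.2f} C/W")  # 0.60 C/W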

  2. #82
    Quote Originally Posted by Majesticii View Post
    At the least, we can argue the power draw of the FX-9000 is beyond ridiculous.
    This is just an FX-8350 on steroids.
    Indeed. Which is why the power draw is really low, and obviously part of a cherry-picking/binning process.
     

  3. #83
    Cyanotical (I am Murloc!) | Join Date: Feb 2011 | Location: Colorado | Posts: 5,553
    Quote Originally Posted by yurano View Post
    I've been using Faithh's definition, where TDP is "the maximum thermal power that it would draw when running 'real applications'". As I showed with some graphs earlier, this is a reasonable assumption: a 77W TDP Intel CPU pushing 120W or so at the wall, with 43W going to PSU efficiency losses, motherboard consumption, and GPU idling, sounds fair, right?
    Which is really the same thing I am saying: the "real applications" part means average use.

    TDP is used for buying a heatsink, i.e., you don't buy a 75W heatsink for a 100W TDP CPU (not that anyone here ever comes up short on the heatsink).

    Quote Originally Posted by yurano View Post

    This is the part you're getting wrong.

    First, define output.

    By "output" I'm taking a rather absolute view of it. I brought up the car engine earlier because there seems to be confusion about where the energy goes: in a car engine most of it ends up kinetic, in a CPU most of it ends up thermal, but not all of it. If a CPU took 100% of the electricity and converted it to heat, then you could use TDP to judge the power draw on an average day.

    But the problem is conservation of energy. If all of your electricity were converted to heat in the process of "work", then there would be nothing coming out the other end of the pipe, since all of it was lost as heat. However, since we get an electrical signal at the other end of the pipe, we know for a fact (a very, very hard fact) that not all of it is lost.

    This plays into TDP significantly. Since not all of the electrical power that runs through a CPU is converted to heat (we have electrical output going to the rest of the board to confirm it), TDP will always be less than the power draw; even if it's only by 1W, it will always be less.

    Quote Originally Posted by yurano View Post
    Not to be disrespectful or anything, but this really falls into the category of engineering (specifically electrical) and not straight physics.

    There's also the factor of how you presented the problem to them. What the CPU does isn't actually work in any physical sense, which is why I've been writing "work" (in quotation marks) to refer to the ordering of electrons into something meaningful.
    Yeah, but somebody had to bring up "I haz degree, I is better", so why not put the question "is all electrical energy that flows through a computer processor converted to thermal energy?" to people I know who have the degree that was brought up? The result was a very fast "no" from both of them.

    the "basically" i got from my sisters fiance (BS in physics and engineering) was that the resistance in the materials used and gate design is that generates the heat, thermal energy generation is not the goal, but a byproduct, he said that the confusion might be coming is as the the type of conversion, 100% of converted energy is thermal, but not 100% of electricity is converted to thermal energy, not all energy is converted, but when conversion happens it's only to thermal

  4. #84
    Quote Originally Posted by Cyanotical View Post
    If a CPU took 100% of the electricity and converted it to heat, then you could use TDP to judge the power draw on an average day.
    It does.

    Quote Originally Posted by Cyanotical View Post
    If all of your electricity were converted to heat in the process of "work", then there would be nothing coming out the other end of the pipe, since all of it was lost as heat.
    There is no real work being done by a CPU.

    Quote Originally Posted by Cyanotical View Post
    However, since we get an electrical signal at the other end of the pipe, we know for a fact (a very, very hard fact) that not all of it is lost.

    Since not all of the electrical power that runs through a CPU is converted to heat (we have electrical output going to the rest of the board to confirm it), TDP will always be less than the power draw; even if it's only by 1W, it will always be less.
    This was addressed earlier. The power of electrical signals (data) is negligible; negligible means that under TDP-defining conditions, TDP ≈ electrical draw (or, from an engineer's viewpoint, TDP = electrical draw).

    If you're talking about current coming out of the CPU's negative leads, I would point out that the voltage at this point is 0V (ground). Just because there is current doesn't mean there is power: power requires both current AND a voltage drop (e.g., across a resistor).
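
    To put a rough scale on "negligible", here is a Python sketch estimating the power carried off-chip by data signals, using the standard C·V²·f capacitive switching model. Every figure below (line capacitance, voltage swing, toggle rate, line count) is an assumed order-of-magnitude value, not a measured spec:

        # Order-of-magnitude power leaving a CPU as electrical signals,
        # modelled as capacitive switching: P = C * V^2 * f per line.

        c_line = 10e-12    # assumed ~10 pF load per signal line
        v_swing = 1.2      # assumed ~1.2 V signal swing
        f_toggle = 500e6   # assumed ~500 MHz average toggle rate
        n_lines = 200      # assumed ~200 lines switching at once

        p_signal = c_line * v_swing ** 2 * f_toggle * n_lines
        print(f"Signal power ~ {p_signal:.2f} W")           # ~1.44 W
        print(f"Share of a 77 W TDP: {p_signal / 77:.1%}")  # ~1.9%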

    Quote Originally Posted by Cyanotical View Post
    Yeah, but somebody had to bring up "I haz degree, I is better", so why not put the question
    I'm pretty sure you were the first to bring it up, albeit indirectly.

    Quote Originally Posted by Cyanotical View Post
    you may want to go back to high school physics

  5. #85
    Skelly (Epic!) | Join Date: Apr 2011 | Location: Haligonia, NS, Canada | Posts: 1,676
    Quote Originally Posted by Cyanotical View Post
    But the problem is conservation of energy. If all of your electricity were converted to heat in the process of "work", then there would be nothing coming out the other end of the pipe, since all of it was lost as heat. However, since we get an electrical signal at the other end of the pipe, we know for a fact (a very, very hard fact) that not all of it is lost.
    I get it now. You're making the pedantic argument that electrons, even at a very, very low potential, still have energy, and this is obviously true. It's analogous to saying that something at the deepest point in the ocean still has some gravitational potential energy. No one is arguing that these things aren't true; it's that no one cares (except you), because this energy isn't useful.

    Going back to this "pipe" argument: obviously there are electrons moving along the ground wire back to the power supply, but they can't do anything. They're at ground. As far as anyone is concerned, they don't have any usable energy.

    Furthermore, when we say a battery (or PSU) supplies 100W of power, we mean that the difference between the power going out and the power coming back in is 100W. You measure the difference in voltage between the input and output (voltages which, by the way, are only defined up to an arbitrary additive constant) and multiply it by the current, which has to be the same at both points (Kirchhoff's current law).
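
    That arbitrary constant is easy to check numerically; a minimal Python sketch with invented values:

        # Power depends on the voltage DROP times the current, so shifting
        # both node voltages by the same constant (moving "ground") changes
        # nothing about the power delivered.

        def power(v_in, v_out, current):
            return (v_in - v_out) * current

        i = 8.33                       # assumed load current in amps
        print(power(12.0, 0.0, i))     # 12 V rail to ground: ~100 W
        print(power(112.0, 100.0, i))  # same circuit, reference +100 V: ~100 W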
    i7 930 @ 4.0Ghz | Sapphire HD5970 w/ Accelero Xtreme | ASUS P6X58D Premium | 32GB Kingston DDR3-1600
    Xonar Essence STX | 128GB Vertex 4 | AX750 | Xigmatek Elysium
    Laing D5 | XSPC RX 360mm | Koolance RP-452X2 | EK-Supreme HF
    Dell 3007WFP-HC | Samsung BX2350 | Das Keyboard Model S Ultimate | Razer Naga Molten | Sennheiser HD650

  6. #86
    Intel i5-3570K @ 4.7GHz | MSI Z77 Mpower | Noctua NH-D14 | Corsair Vengeance LP White 1.35V 8GB 1600MHz
    Gigabyte GTX 670 OC Windforce 3X @ 1372/7604MHz | Corsair Force GT 120GB | Silverstone Fortress FT02 | Corsair VX450

  7. #87
    Yup, an article is up at Sweclockers now about it as well: http://www.sweclockers.com/nyhet/171...-klockfrekvens

    See guys, you can always trust Sweclockers!

  8. #88
    Hmm, it only has a single 8-pin CPU power connector. I would maybe have expected more than that, given the potential power draw.

  9. #89
    Quote Originally Posted by Butler Log View Post
    Hmm, it only has a single 8-pin CPU power connector. I would maybe have expected more than that, given the potential power draw.
    Not really. I feel like I'm repeating myself here, but a 220W TDP is less than you can expect from an overclocked FX-8350, and a single 4+4 pin should be enough for most overclocking.
    So a 220W TDP is really on the low side.
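
    As for why a single 4+4 pin (EPS12V) connector is plausible for a 220W part, here is a rough budget sketch in Python. The per-pin current rating is an assumption (common Mini-Fit Jr terminals are rated somewhere around 6 to 8A); the actual connector spec is the authority here:

        # Rough capacity of an 8-pin EPS12V CPU connector (four 12 V pins).
        v_rail = 12.0
        n_12v_pins = 4
        amps_per_pin = 7.0   # assumed mid-range terminal rating (6-8 A typical)

        capacity_watts = v_rail * n_12v_pins * amps_per_pin
        print(f"~{capacity_watts:.0f} W deliverable")               # ~336 W
        print(f"Headroom over 220 W: {capacity_watts - 220:.0f} W") # ~116 W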
     

  10. #90
    Majesticii (Brewmaster) | Join Date: Feb 2010 | Location: Netherlands | Posts: 1,414
    Quote Originally Posted by tetrisGOAT View Post
    So a 220W TDP is really on the low side.
    Compared to what? I saw your sarcasm earlier, but I wasn't sure what you were basing it on or why. A stock Intel i7-3770K, which has a TDP of 77W, will not be far behind (if behind at all) this 5GHz 220W processor. So the TDP might be low by AMD's standards, but in energy consumption versus performance it's really high.

    http://www.techspot.com/articles-inf...nch/CPU_01.png
    http://www.techspot.com/articles-inf...nch/CPU_02.png

    http://www.techspot.com/articles-inf...nch/CPU_02.png
    http://www.techspot.com/articles-inf...nch/CPU_03.png

    http://www.techspot.com/articles-info/532/bench/AMD.png
    http://www.techspot.com/articles-inf...ench/Intel.png

    http://www.techspot.com/articles-inf...nch/CPU_03.png
    http://www.techspot.com/articles-inf...nch/CPU_02.png
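
    The perf-per-watt comparison being argued here is one line of arithmetic. A Python sketch with the stock TDPs from the thread and placeholder framerates (substitute the real values from the graphs linked above):

        # The fps figures below are PLACEHOLDERS, not benchmark results.
        cpus = {
            "i7-3770K (77 W TDP)": {"tdp": 77, "fps": 100.0},
            "FX-9000 (220 W TDP)": {"tdp": 220, "fps": 95.0},
        }

        for name, c in cpus.items():
            print(f"{name}: {c['fps']:.0f} fps, {c['fps'] / c['tdp']:.2f} fps/W")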
    Last edited by Majesticii; 2013-06-04 at 06:00 PM.

  11. #91
    Quote Originally Posted by Majesticii View Post
    Compared to what? I saw your sarcasm earlier, but I wasn't sure what you were basing it on or why. A stock Intel i7-3770K, which has a TDP of 77W, will not be far behind (if behind at all) this 5GHz 220W processor. So the TDP might be low by AMD's standards, but in energy consumption versus performance it's really high.
    I'm not sure how you expect an overclocked CPU to retain its TDP value; an overclocked IB increases its effective TDP by a substantial margin. It's only 200W vs 77W TDP if both are at stock, and then it's pretty clear that while IB wins in performance per watt, AMD wins on absolute numbers.

    And those graphs you provided show that AMD is probably much better in games than I even thought, thanks. Especially considering that the FX CPUs aren't overclocked as far as the Ivy Bridge ones are.
     

  12. #92
    Majesticii (Brewmaster) | Join Date: Feb 2010 | Location: Netherlands | Posts: 1,414
    Quote Originally Posted by tetrisGOAT View Post
    I'm not sure how you expect an overclocked CPU to retain its TDP value; an overclocked IB increases its effective TDP by a substantial margin. It's only 200W vs 77W TDP if both are at stock, and then it's pretty clear that while IB wins in performance per watt, AMD wins on absolute numbers.

    And those graphs you provided show that AMD is probably much better in games than I even thought, thanks. Especially considering that the FX CPUs aren't overclocked as far as the Ivy Bridge ones are.
    I usually regard you as knowledgeable, but either we're not looking at the same graphs here, or you're kind of missing the point. In three of the four games the i7-3770K at 3.5GHz (stock) has a higher framerate than the FX-8350 at 4.5GHz; the extra 500MHz isn't going to change that.
    And it's not overclocked if it's released as a new processor. In theory it might be, but it should not be regarded as such.

    It says so in the term: OVERclocked. As in beyond release specification.

  13. #93
    Quote Originally Posted by Majesticii View Post
    I usually regard you as knowledgeable, but either we're not looking at the same graphs here, or you're kind of missing the point. In three of the four games the i7-3770K at 3.5GHz (stock) has a higher framerate than the FX-8350 at 4.5GHz; the extra 500MHz isn't going to change that.
    And it's not overclocked if it's released as a new processor. In theory it might be, but it should not be regarded as such.
    Not in Crysis 3 (0/1).
    In Metro: Last Light it does (1/2).
    Not present in Diablo III, but the FX-8150 (a worse CPU) equals it (1/3).
    Equal in Far Cry 3 (1/4).

    In other words, I strongly disagree, and I call the FX-8350 a very good gaming CPU for the money.

    The FX-9000 is stock, yes. Which I did explicitly say:
    "It's only 200W vs 77W TDP if both are at stock, and then it's pretty clear that while IB wins in performance per watt, AMD wins on absolute numbers." (although I mis-wrote it as 200W; it should be 220W)
    Last edited by BicycleMafioso; 2013-06-04 at 06:04 PM.
     

  15. #95
    Killora (Bloodsail Admiral) | Join Date: Oct 2010 | Location: BFE, Montana | Posts: 1,105
    Quote Originally Posted by tetrisGOAT View Post
    "It's only 200W vs 77W TDP if both are at stock, and then it's pretty clear that while IB wins in performance per watt, AMD wins on absolute numbers." (although I mis-wrote it as 200W; it should be 220W)
    Since when does the AMD chip ever win on absolute numbers? It's either matching or inferior, and in more cases than not, it's the latter.

  16. #96
    The 8350 (Piledriver) has much better IPC than the 8150 (Bulldozer). Saying otherwise is like saying Haswell is equal to Sandy Bridge, just with higher clock speeds.

    Either way, the 8350 @ 4.5GHz is better than the 3770K (its minimum FPS exceeds the 3770K's maximum FPS). It wins again in the Black Ops 2 bench.

    So I fail to see your point, no offense. The 8320/8350 is a better CPU for the money in gaming than an i7-3770K. It doesn't cost exactly half, but consider that motherboards are cheaper on the AMD side as well.
    My point remains: outside of WoW (where I don't think it's a problem either), AMD is a better recommendation than Intel for budget builds.
    The flak it gets is immensely exaggerated. Bulldozer wasn't great, no. But Piledriver isn't that bad. It can compete (though not win) against Intel really well in something it's not designed to do well in, i.e. high-IPC tasks. :P

    ---------- Post added 2013-06-04 at 08:23 PM ----------

    Quote Originally Posted by Killora View Post
    Since when does the AMD chip ever win on absolute numbers? It's either matching or inferior, and in more cases than not, it's the latter.
    So you would say that a 5.00 GHz FX-9000 is not outperforming a 3.50 GHz i7 3770K in anything? Really?
     

  17. #97
    Killora (Bloodsail Admiral) | Join Date: Oct 2010 | Location: BFE, Montana | Posts: 1,105
    Quote Originally Posted by tetrisGOAT View Post
    So you would say that a 5.00 GHz FX-9000 is not outperforming a 3.50 GHz i7 3770K in anything? Really?
    Given this is a gaming forum, I assume we're on the topic of gaming. And yes, I would say a 5GHz FX-9000, assuming the same architecture as Piledriver, would not outperform the 3770K in a lot of games, the games where it matters. In games like BF3? Perhaps. But given how GPU-intensive those games are, the chance that it would even be an issue is slim to none.


    P.S. The 3570K is the same price as the 8350 and has the same gaming performance as a 3770K, so that argument is rather invalid.
    Last edited by Killora; 2013-06-04 at 06:33 PM.

  18. #98
    Quote Originally Posted by tetrisGOAT View Post
    My point remains: outside of WoW (where I don't think it's a problem either), AMD is a better recommendation than Intel for budget builds.
    It's still a WoW fansite, and even an i3 outperforms the FX-8350 by a mile in WoW at lower cost, unless you significantly overclock the AMD.
    Never going to log into this garbage forum again as long as calling obvious troll obvious troll is the easiest way to get banned.
    Trolling should be.

  19. #99
    Quote Originally Posted by vesseblah View Post
    It's still a WoW fansite, and even an i3 outperforms the FX-8350 by a mile in WoW at lower cost, unless you significantly overclock the AMD.
    I see you throw that around a lot. I see increasingly fewer people requesting WoW rigs, albeit still many, yet I don't say anything against that.
    This forum and many of the other sub-forums, however, have grown beyond just WoW. As is proper; times evolve. Nor is MMOC inherently a WoW site: that is an effect of WoW being the only real long-term serious contender among MMOs. Just throwing it out there.
    And the i3 does outperform the 8350 in WoW. Not by as large a margin as you seem to suggest, but it does.
     

  20. #100
    Killora (Bloodsail Admiral) | Join Date: Oct 2010 | Location: BFE, Montana | Posts: 1,105
    Whether or not this is a WoW fansite, WoW isn't the only game where the 8350 (or any AMD CPU, for that matter) falls behind Intel. In pretty much every MMO, performance is noticeably worse on an AMD CPU than on an Intel CPU at the same price point. Borderlands 2, Skyrim, etc.: a fair number of popular games run worse on AMD CPUs. In most FPS games the difference is not overly noticeable, provided you don't have a 120/144Hz screen, and in BF3 the 8350 is actually equal to the 3570K. But honestly, I see no appeal in buying a CPU that only performs the same in some games and worse in the rest. Not to mention, I'd argue the 8350 is actually more expensive, as the cheapest motherboard you could safely put one on is... $120? You can get an $80 board for a 3570K that will get it to 4.5GHz fine.

    The notion that AMD is cheaper than Intel hasn't held true for a while; it's pretty much only true at the ultra-low end of the spectrum.

    I mean, don't get me wrong, I'm not saying the 8350 is a trash CPU, but I simply see no compelling reason to ever buy one for gaming, even if you only play games in which it performs the same as an equally priced Intel. You never know what games you'll want to play or what things will be like in the future. You said it yourself: it's not designed for gaming, and that should be indicative enough.
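
    The price comparison is likewise simple arithmetic; a Python sketch using the board prices quoted above, with placeholder CPU prices (assumed, not sourced):

        # CPU prices are assumed placeholders; board prices are from the post.
        amd_total = 200 + 120    # assumed FX-8350 price + "$120" safe AM3+ board
        intel_total = 220 + 80   # assumed 3570K price + "$80" overclocking board

        print(f"AMD platform:   ${amd_total}")    # $320
        print(f"Intel platform: ${intel_total}")  # $300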
    Last edited by Killora; 2013-06-04 at 06:54 PM.
