Okay, I did a really bad job on Twitter yesterday of explaining why the GTX480 is a 480W+ part and why a 250W TDP isn’t what it puts out in terms of heat. So, let’s try it again, with the math backing it up.
First of all: standalone GPUs are worthless. Quoting a GPU’s wattage by itself is like telling me how many calories are in the pepperoni but not the rest of the pizza. Sure, I could eat just the pepperoni, but then it’s not a pepperoni pizza, is it? So how are you going to play games on just a video card? You aren’t. Rating by the GPU alone is only useful when sizing a system, and even then the numbers usually end up heavily fudged.
So let’s take the latest part paper-launched by nVidia (it won’t be available to buy until April), the GTX480. TDP of 250W, for the GF100, not the card. Or maybe it’s the other way around? Your guess is as good as mine, but based on the data I’ve seen, a 250W TDP for the card is probably 25-50W on the conservative side once you account for heat losses. With a TJmax of 105C and a typical operating temperature of over 80C at the die, you’re talking about a massive efficiency loss from temperature. Heat reduces efficiency, especially in electronics, so when you’re running voltage regulators managing 250W anywhere close to the top of their rated operating temperature? You start incurring some pretty nasty losses.
HardOCP was too kind in their power tests, because they used a benchmark called FurMark. FurMark runs almost exclusively on the GPU with very low CPU loading. That keeps the test from being CPU-bound, but it also lets modern Intel CPUs drop into lower power states. This is NOT representative of what you’ll see while gaming at all. Games stress GPU and CPU, pushing both towards their maximum power draw and TDP. In fact, most modern games will get a Core i7 920 pretty near its 130W draw.
So, let’s call it 125W for the CPU, 35W for the motherboard, 50W for a pair of SATA disks, and 250W for the GTX480. 125+35+50+250 = 460W. This number is particularly amusing to me, as some years ago the specially built WTX power supply for dual Athlon MP boards with AGPPro150 slots produced exactly that number. It also makes the GTX480 a 400W+ part, because even if you switch to a Socket 1156 CPU at 95W, you’re still over 400W. AMD? Still over 400W. There is no way to build a usable system around a GTX480 that comes in at 400W or less, even running the power supply at 90% load. That means 80 Plus certified power supplies most likely won’t be in their efficiency sweet spot until you’re looking at 600W to 800W units, absolute best case (roughly 80% and 50% load, respectively).
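If you want to sanity-check that running estimate, here’s a minimal sketch in Python. The component figures are the estimates above, not measured values:

```python
# Rough running-draw estimate for a GTX480 system (all figures in watts DC,
# taken from the estimates above).
components = {
    "Core i7 920": 125,
    "motherboard": 35,
    "2x SATA disks": 50,
    "GTX480": 250,
}

running_draw = sum(components.values())
print(f"Estimated running draw: {running_draw}W")  # 460W
```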
But wait, doesn’t that mean you only need a 500W power supply? NOPE! Not even remotely close. That’s the “running estimate,” but for startup we absolutely have to rate by maximum draw plus 5W per device (the plus 5 is a rule of thumb). So that’s 135+55+55+255, or 500W ignoring fans. Add another 15W for fans and that’s 515W. Oh, and that’s just what the individual devices are drawing; it’s not the whole picture. It’s real in the sense that it’s the minimum startup wattage, but it doesn’t account for losses between the supply and the devices. Pad it by even 1% for those losses and you’re at roughly 520W of DC supply required just to start up.
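Here’s the same startup math as a quick sketch, using the rule of thumb above (maximum draw plus 5W per device, 15W for fans, 1% for distribution losses); none of these are measured numbers:

```python
# Startup estimate: maximum draw per device plus a 5W rule-of-thumb margin.
max_draw = {"CPU": 130, "motherboard": 50, "disks": 50, "GTX480": 250}
per_device_margin = 5        # rule-of-thumb pad per device
fans = 15                    # watts for case/CPU fans
distribution_loss = 0.01     # ~1% lost before power reaches the devices

startup = sum(w + per_device_margin for w in max_draw.values())  # 500W
startup += fans                                                  # 515W
startup *= 1 + distribution_loss                                 # ~520W
print(f"Minimum DC available at startup: {startup:.0f}W")
```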
Then we have to account for efficiency at that wattage to cover typical AC-to-DC losses and startup draw. That puts the actual need somewhere north of 600W at the wall. Otherwise, we’re going to pop the power supply any time we fire up Modern Warfare 2 or Battlefield. That presents… a bit of a problem. Not to mention the demonstration that the GTX480 idles at “only” 47W (260W total system draw at idle for SLI configurations!) and jumps over 50W just to open a webpage. A REAL winner here, folks. And by the way, those numbers are extremely conservative and don’t leave any overhead at all. That’s what you’re going to see from the wall while gaming: 600W and higher. Oh, and don’t install any additional hard drives, attach USB devices, etcetera. In fact, ignore those numbers and go with HardOCP’s recommendation of a minimum 700W supply for a single card.
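To see where “north of 600W” comes from, here’s the AC-side conversion; the efficiency figures are my assumptions for a decent power supply at this kind of load, not measurements:

```python
# Convert the required DC supply into draw at the wall, given PSU efficiency.
def wall_draw(dc_watts: float, efficiency: float) -> float:
    """AC power pulled from the outlet to deliver dc_watts on the DC side."""
    return dc_watts / efficiency

required_dc = 520  # startup estimate from above
for eff in (0.80, 0.85, 0.87):
    print(f"{eff:.0%} efficient PSU -> {wall_draw(required_dc, eff):.0f}W at the wall")
# ~650W, ~612W, and ~598W respectively
```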
Now, to be entirely fair, we need to establish a comparison. We’ll use the card the GTX480 is supposed to “kill,” the AMD/ATI Radeon HD5870. The HD5870 has a maximum board draw of 188W. HardOCP found that the HD5870 system drew 367W at the wall, which works out to an actual DC load of about 320W. 320-188 leaves only 132W for the remainder of the components. Even if we call the HD5870 a 200W card after losses and everything, that leaves a whopping 120W for a mid-range desktop board, a heavily overclocked i7 920, and a SATA hard drive. So take your pick here, folks: either the card really is pulling around 200W and everything else is somehow squeaking by on 120W, or everything else is drawing what you’d expect and the HD5870 is actually maxing out its DC draw at somewhere around 130W. Personally, I don’t have a hard time believing everything else comes to a combined 167W.
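Here’s that back-calculation as a sketch, assuming the same 87% PSU efficiency used throughout:

```python
# DC load from the wall reading, then split it between the card and the rest.
wall_watts, psu_efficiency = 367, 0.87    # HardOCP's HD5870 system, assumed 87% PSU
dc_load = wall_watts * psu_efficiency     # 319.3W -- call it 320W as in the text

board_power = 188                         # HD5870 maximum board power rating
rest_of_system = dc_load - board_power    # ~131W left for CPU, board, disk, fans
print(f"DC load: {dc_load:.0f}W, left for everything else: {rest_of_system:.0f}W")
```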
To be fairer still, we have to do the same startup math we did for the GTX480: 135+55+55+193 = 438W DC for startup with a Socket 1366 Core i7. But wait! What happens if we switch to a Socket 1156 Core i5 or i7 with a 95W TDP? That gives us 100+55+55+193 = 403W, and we’re carrying roughly a 20W margin of error on both the ATI and nVidia configurations. Add the 15W for fans and the ATI build still lands right around 430W even with that margin. In other words, if you don’t mind having little headroom and running the PSU pretty hard, an ATI HD5870 can make do with a good quality 500W unit, which will see a maximum draw from the wall of somewhere closer to 495W with everything at its absolute limit.
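If you want both CPU options side by side, here’s a sketch of that startup math with the 15W for fans folded in; the 85% wall-side efficiency is my assumption:

```python
# Startup estimate for an HD5870 build with either CPU option.
def startup_dc(cpu_max: float, card_max: float, fans: float = 15) -> float:
    # +5W rule-of-thumb per device: CPU, motherboard, disks, video card.
    devices = [cpu_max, 50, 50, card_max]
    return sum(w + 5 for w in devices) + fans

for name, cpu in [("Socket 1366 i7 (130W TDP)", 130), ("Socket 1156 i5/i7 (95W TDP)", 95)]:
    dc = startup_dc(cpu, 188)
    print(f"{name}: {dc:.0f}W DC, ~{dc / 0.85:.0f}W at the wall on an 85% PSU")
# 1366: 453W DC / ~533W wall; 1156: 418W DC / ~492W wall
```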
So! If we go with everything else at a combined 167W, let’s run the GTX480’s 480W at-the-wall number. At 87% efficiency, the real DC draw is around 418W. Subtract the 167W “everything else” category and you get 251W for the card, in free air at a temperature of 93C. The free air part is very important, and we’ll get to that in just a little bit here.
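Same back-calculation as before, just with the GTX480’s numbers:

```python
# GTX480: back out the card's DC draw from the 480W wall reading.
dc_load = 480 * 0.87         # ~418W actually delivered on the DC side
card_draw = dc_load - 167    # subtract the ~167W "everything else" estimate
print(f"GTX480 DC draw: {card_draw:.0f}W")  # ~251W, in free air at 93C
```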
Now here’s an exceptionally important point: HardOCP witnessed GTX480s exceeding 900W at the wall in 2-card SLI, with the CPUs barely factored into the mix, on an 87% efficient power supply. If we give them the full benefit of the doubt and say 250W for each GTX480, that leaves over 400W for the rest of the system. Let’s make our correction: 87% of the wall figure is actual DC, so 900W at the wall is 783W on the DC side. We’ve already established that every other component combined is roughly 167W of draw. We’ll be exceedingly generous and jack that up to 200W. Notice how the numbers still don’t add up, at all? Remember, it was OVER 900W at the wall, and on an 87% efficient power supply at that! Seriously. Let me spell it out for you.
783W − 200W = 583W+, and 583W / 2 = 291W+ per card in SLI.
That means in SLI at 92C with fans screaming, those cards are actually drawing nearly or over 300W of DC each, which translates to somewhere north of 650W at the wall for the pair. There are some HUGE power losses going on there from heat, no doubt, since we’re talking about cherry-picked cards from nVidia with a non-release BIOS. These are, in other words, not actually representative of what AIB partners will be putting out. AIB partners will likely use lower cost voltage regulation and support components to try to contain costs that are already non-competitive. If we presume the card really is a 250W combined part but only gets 90% efficiency from its supplied power, we land right around 278W. And as we’ve already covered, just the card is useless. Oh, and three-way SLI? Dream on. At 2-way SLI you’re already pushing 1000W at the wall. There’s exactly one 1500W PSU available on the market, it requires your outlet be wired for a 20A breaker, and it’s going to set you back $400+. Sorry folks, 1200W won’t cut it: 900W+ for two cards plus 250W+ for a third is 1150W+. Oh, and then there’s that little problem where your noise level is 64dBA for a single card and over 70dBA for two. Remember, dBA is logarithmic, so going from 64.1 to 70.2 is more than a doubling. These things are dangerously loud and can make you deaf.
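For anyone following along, here’s the whole SLI calculation in one place, with the 900W wall figure, the assumed 87% efficiency, and the generous 200W “everything else” allowance:

```python
# Two-card SLI: split the measured wall draw between the cards and the rest.
wall_watts, psu_efficiency = 900, 0.87  # "over 900W" at the wall, 87% PSU
everything_else = 200                   # generous allowance for CPU, board, disks

dc_total = wall_watts * psu_efficiency           # 783W on the DC side
per_card = (dc_total - everything_else) / 2      # ~292W per GTX480
cards_at_wall = (per_card * 2) / psu_efficiency  # what just the two cards cost at the outlet
print(f"Per card: {per_card:.0f}W DC, both cards: ~{cards_at_wall:.0f}W at the wall")
```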
Now let’s complicate matters properly: HardOCP did all their tests on a bench in free air. This is a huge deal, because free air means the card isn’t in an enclosed chassis. It has a continuous supply of cool air feeding it and completely unrestricted airflow from five directions. PCB and ambient heat also radiates to open air independent of fan movement. All of this combined lowers the operating temperature substantially compared to a card installed in a chassis. In other words, the 93C operating temperature is very much on the low side. This is why nVidia was requiring manufacturing partners to certify their chassis beforehand. When you put these cards into a chassis, they’re suddenly faced with restricted airflow, the loss of ambient cooling, and the addition of over 100W of ambient heat from the CPU, motherboard, hard drives, etcetera. Very, very few desktop chassis are 100% thermally efficient, meaning they reject their entire thermal load and hold the interior at intake air temperature. I have built and worked on some of the most efficient there are, and typical users are going to have a chassis that, with a 200W TDP video card inside, runs no less than 15C above exterior ambient (or is deafeningly loud).
Now we have a real problem, because that means we’re running at the ragged edge as is. If we call exterior ambient 74F, that’s about 23C. Add that 15C chassis rise and you get an interior ambient of roughly 38C, or about 101F. For free air testing at HardOCP, 74F ambient isn’t an unreasonable estimate and is actually probably high. So end users will be applying an ambient temperature 15C higher than the temperatures that let a GTX480 run at “only” 93C. With the loss of ambient thermal radiation, airflow restriction from components and the chassis, plus an additional 150W+ of thermal load dumped unevenly into the case… well, you can bet that a GTX480 will never be quiet, and it will be screaming as it tries to keep the die at 100C or below. This is why I do NOT like free air noise testing. Yes, it shows you just how loud the fan is, but only in free air. Typical users will have these parts in a chassis, which can and will have significant effects on temperature and cause the fans to spend more time at higher speeds. In fact, it will affect every fan in a modern chassis.
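The temperature arithmetic, for reference; the 15C rise is the chassis estimate above, not a measurement:

```python
# Free-air ambient vs. estimated in-chassis ambient.
def f_to_c(f: float) -> float: return (f - 32) * 5 / 9
def c_to_f(c: float) -> float: return c * 9 / 5 + 32

exterior_c = f_to_c(74)        # ~23.3C room ambient
interior_c = exterior_c + 15   # +15C rise inside a typical chassis
print(f"Exterior: {exterior_c:.0f}C, interior: {interior_c:.0f}C (~{c_to_f(interior_c):.0f}F)")
# Exterior: 23C, interior: 38C (~101F)
```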
I don’t particularly have a horse in this fight other than my standard policy of “if it doesn’t work, if it’s not the better part, then I don’t want it.” The GF100 fails both of those, miserably and with great gusto. The performance numbers aren’t compelling at the price point, even if ATI doesn’t cut prices on the 5800 family parts. The power draw, heat, and noise generated add up to something I couldn’t even consider putting in a desktop system. Nothing short of watercooling is going to get that noise and temperature under control. Even the Arctic Cooling HD5870 cooler that they rate to 250W of dissipation can’t do it (in part because it doesn’t exhaust outside the chassis, but blows onto the card instead).
Not to mention the fact that they’re putting a 250W part that’s totally dependent on game developers playing ball for its performance up against a 188W part that in most situations offers equal or better performance. To justify a 250W, $499 part over a 188W, $410 part, you need something like a 30% performance jump. But nVidia delivers maybe 5%, except in tests written specifically for the card; a 5% lead would only justify the power budget of a roughly 197W part. It’s only worse when you stack up the HD5850 at 151W against the GTX470 at 215W, never mind that the GTX470 is a $350 part versus the HD5850’s $280. Again, same thing: a 30% jump is needed to justify the price and power, and it’s just not there.
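A rough sketch of that justification math; the premiums come straight from the price and power figures above, and the 197W figure is just the HD5870’s 188W scaled by the ~5% performance delta:

```python
# How much faster does the GTX480 need to be to justify its price and power?
gtx480_price, hd5870_price = 499, 410
gtx480_power, hd5870_power = 250, 188

price_premium = gtx480_price / hd5870_price - 1  # ~22% more expensive
power_premium = gtx480_power / hd5870_power - 1  # ~33% more power
print(f"Price premium: {price_premium:.0%}, power premium: {power_premium:.0%}")

# Flip it around: a ~5% performance lead only "earns" a ~5% bigger power budget.
justified_power = hd5870_power * 1.05
print(f"Power budget a 5% lead would justify: {justified_power:.0f}W")  # ~197W
```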
So with all these numbers and all this math right there, why don’t the review sites point this out? Simple: they don’t want to piss off the people who feed them hardware. They have to leave this math as an exercise for the reader, because pointing out design failures like this in detail will lose them access to hardware. Especially with nVidia, who has deliberately cut off and retaliated against sites that refused to lie for their benefit.
So, there you have it. The GTX480 is a 400W+ card and the 250W draw is debunked. Where’s all the power going? Ask nVidia – they’re the ones who’ve delayed GF100 multiple times and been having issues with leaky transistors and having to jack up the voltages. I’m not an electrical engineer, but I can do basic math, and that’s all you need to see that the GTX480 definitely goes in the design failure column along with the NV30 series (AKA GeFarce FX AKA DustBuster.) This isn’t a card I could recommend, much less sell. And hopefully you’ve learned a lot more about desktop system design while I ripped it apart.