Electric companies after every nvidia gpu generation.
Donald Duck Gold.jpg
3090 TDP 350 W
4090 TDP 450 W
5090 TDP 575 W
Nvidia has been pretty shit for the last few generations, but this is clearly just an engineering prototype for testing; they obviously weren’t trying to make a 2400W 5090.
don’t put it past nvidia.
they gotta fulfill those unrealized promises about native 4k gaming somehow.
Good luck plugging that into an outlet without tripping a fuse under load.
2400W / 240V = 10A. That’s fine.
I’m not trying to be a one-upper, but you also need to include the power the CPU and the rest of the computer use. Not to mention the age of the outlet and how many times a plug has been inserted and removed from it; the contact resistance can be pretty bad depending on how old the outlet is.
Christ, not exactly a model of power efficiency is it?
Also, if it’s drawing that much power, how could it possibly dissipate all the heat? It must sound like an F-16.
I expect this card will be a hard pass from me…
Or maybe it just delivers 600W without burning the ever-loving hell out of the connectors.
Almost has to be. 2400W would put it completely outside the consumer market. Consumer PSUs don’t go that high. Home power outlets don’t go that high unless you have special electrical work done. I can hardly imagine what a cooling system for a nearly 3kW system would look like.
In Europe, this is no biggie
I just saw a reputable 2400W kettle on a random online store for 50€
Looks like there are 3000W options too
Just imagine the cost of running such a system on European energy prices. We’re at ~0.35€/kWh here in Germany currently, which means an hour of running this will cost you 0.84€. Add the energy use of the CPU, mainboard, and monitor, and you’re paying well over 1€ per hour of gaming.
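Quick back-of-the-envelope in Python, if anyone wants to check; the 600W for everything besides the GPU is just my guess:

```python
# Hourly gaming cost at an assumed 0.35 EUR/kWh German rate.
PRICE_EUR_PER_KWH = 0.35

gpu_w = 2400   # the rumoured GPU draw
rest_w = 600   # assumed CPU + mainboard + monitor draw, illustrative only

cost = (gpu_w + rest_w) / 1000 * PRICE_EUR_PER_KWH
print(f"{cost:.2f} EUR/hour")  # -> 1.05 EUR/hour
```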
Judges you from French 0.20€/kWh nuclear prices
If only you guys had listened to the science… You’d be gaming AND heating your place for cheap!
And regardless, unless the chip is radically different from what has been observed in currently available RTX 5090s, I don’t see how 2400W can be anything other than a transient spike
Let’s not start a discussion about nuclear energy here. France has enormous subsidies on electricity and Germany a lot of taxes. And both countries have issues in their energy systems, so yeah.
France removed its electricity subsidies a long time ago; they were temporary relief during COVID only.
In fact, there are pretty high taxes here too, just the base cost is lower.
And I started this debate to challenge the notion that all of Europe has Germany’s electrical management issues; they’re the main ones to have failed.
I think the typical limit is around 3600W, with 16A at 230V
Oh! I knew European outlets operated at higher voltage, but I didn’t know the standard circuits supported such high current. Jealous!
It’s the same current but double the voltage
And wiring is typically rated for current limits not voltage (within reason). Some 12 gauge wire doesn’t care if you’re pushing 12V, 120V, or 240V but is only rated for 20A.
The easiest way to think about it is that the conductor is rated for the current, and the insulator is rated for the voltage. Now, once you get into the nitty gritty, they’re more intertwined than that, but it’s close enough for a surface level explanation.
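A tiny sketch of that rule of thumb, assuming the 20A-rated 12-gauge wire from the comment above: at a fixed current rating, the deliverable power scales directly with voltage.

```python
# Same 20 A conductor, different system voltages: power capacity = V * I.
WIRE_RATED_AMPS = 20  # assumed 12 AWG branch-circuit rating from the comment above

for volts in (12, 120, 240):
    print(f"{volts:>3} V x {WIRE_RATED_AMPS} A = {volts * WIRE_RATED_AMPS} W")
# ->  12 V x 20 A =  240 W
# -> 120 V x 20 A = 2400 W
# -> 240 V x 20 A = 4800 W
```

Funnily enough, 120V x 20A is exactly the rumoured 2400W, i.e. an entire dedicated US 20A circuit for one GPU.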
I live in a 50 year old house. All the breakers are 16A, so 220V x 16A = 3.5kW
The electric sauna does three-phase @ 400V. My energy tracker usually peaks around 9.5kW when it’s heating.
Most are actually 230V, which gets you even more at the standard 16A: 3680W, to be precise.
Countries that use 110V have so many weird limitations that we don’t even know in Europe. For them, 230V is the “special” outlet for special purposes.
Actually, in the US the outlets are often wired with one leg, while wiring in both legs gets you back to 240V.
110 is probably better in terms of general safety (which is good because our houses are death traps), but it means when you do need power you need a special circuit.
We should make both more common, but the plugs are terrible (basically they turn the left prong 90 degrees).
when you do need power you need a special circuit.
We also have a standard socket and a high power socket.
Except our normal outlets provide 230V 16A 3.5kW (3kW sustained) and the typical high-power outlets provide 400V 30A 11kW or 400V 60A 21kW.
Which is why typical electric stoves here use 11kW and typical instant water heaters use 21kW.
Though probably the most noticeable advantage is in electric car charging.
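For what it’s worth, those feeds pencil out with the standard three-phase formula P = √3 × V × I. A minimal sketch; the 16A and 30A per-phase currents are my assumption about how the 11kW and 21kW figures come about, not the numbers quoted above:

```python
import math

# Three-phase power: P = sqrt(3) * V_line_to_line * I_per_phase.
def three_phase_watts(v_ll, amps):
    return math.sqrt(3) * v_ll * amps

print(round(three_phase_watts(400, 16)))  # -> 11085, the "11kW" stove feed
print(round(three_phase_watts(400, 30)))  # -> 20785, the "21kW" water heater feed
```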
110 is probably better in terms of general safety
Eh, not really. There is no significant difference in safety between 110VAC and 230VAC. Voltage is not the (most) dangerous part; it’s the amps that kill if you’re electrocuted.
Nominally EU voltage is 230V, and may be 240V. In fact, it can be as high as 230V +10% = 253V. Higher voltage means more power for a given current, so nominally it’s 16A x 230V = 3.68kW, but you could have say 16A x 250V = 4.0kW.
If your sauna is 400V then it sounds like you’ll be 230V (400V / sqrt(3) = 230). But the voltage can also be 230V -6% = 216V, so 220V is within scope.
But yeah, standard voltages in the EU are either 230V/400V or 240V/415V. They’ve been harmonised, but if you look at the numbers you’ll see the trick: 230V +10% is roughly the same as 240V +6%. So the range runs from 230V -6% to 240V +6%.
You’ve got a 3 phase connection though so you might find you’ve got different single phase breakers on different phases (eg lights on one phase, sockets on another), with slightly different voltages for each one.
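The numbers behind that harmonisation trick, as a quick check:

```python
# EU voltage harmonisation: the 230V +10%/-6% band and the 240V +/-6% band
# cover nearly the same range, so both nominal voltages coexist.
print(f"{230 * 1.10:.1f}")    # 253.0 -> upper bound of 230V +10%
print(f"{240 * 1.06:.1f}")    # 254.4 -> upper bound of 240V +6%
print(f"{230 * 0.94:.1f}")    # 216.2 -> lower bound of 230V -6%
print(f"{400 / 3**0.5:.1f}")  # 230.9 -> single-phase voltage off a 400V three-phase feed
```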
The installation in my home follows my country’s regulations as they were ~15 years ago. It’s divided into several circuits, the ‘general use’ outlets one is rated for 25A in total AND at any point, ie you could plug a 5750W appliance in any of those outlets. The lights circuit is the lowest rated at 15A, still letting you ‘plug’ up to 3450W.
3600W is the maximum a power socket is rated for and the fuse triggers at 3800W. So, cutting it pretty close.
I wouldn’t use that kind of power continuously. AFAIK the sockets are supposed to handle 16A for at least six hours, when they are new. When charging your car on Schuko sockets it’s good practice to limit it to 10A and check for the socket temperature after a while. Also, any connections in the cabling can have increased resistance with age and heat up with heavy continuous use. That shouldn’t matter that much when running a kettle or toaster for a few minutes, but charging a car or gaming for hours can become a problem.
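That’s because contact heating grows with the square of the current (P = I²R). A rough sketch; the resistance value is an illustrative guess, not a measurement:

```python
# Heating at a socket contact: P = I^2 * R.
CONTACT_RESISTANCE_OHMS = 0.05  # assumed worn-contact resistance, not a measured value

for amps in (10, 16):
    watts = amps**2 * CONTACT_RESISTANCE_OHMS
    print(f"{amps} A -> {watts:.1f} W dissipated at the contact")
# -> 10 A ->  5.0 W
# -> 16 A -> 12.8 W
```

Dropping from 16A to 10A cuts the heat dissipated at a worn contact by more than half, which is why that 10A charging limit is good practice.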
What about the rest of the computer though?
3840W per breaker. Minus 2400 leaves 1440W, for a CPU, the minor components, and monitors/other equipment. In theory it could work.
You would still need to run the computer off multiple plugs, as almost any 240V plug is 10A.
You’d likely need a dedicated breaker and plug, similar to a stove plug.
All UK plugs are 13A.
Here, plugs are 230V and 16A = 3680W. Not quite as much as I thought (most extension cords seem to be rated for a bit more, which makes sense), but definitely enough to run monitors off the same breaker.
I’m gonna oneup those kettles with >7500W showerheads
Standard US outlets can’t deliver 3000 watts.
That’s why I started my sentence with “In Europe”
But I live in America so naturally you’re referring to US outlets, right??
Yes, you are right, here is a map of Europe where you have chad power plugs in a-Murika (outside these towns only virgin plugs):
God forbid I supply information. I’m fucking done commenting with all you over sensitive weirdos who think everything is a fucking argument.
nVidia cares less and less about the consumer market every year. We basically only exist to buy the factory fourths so that the overall yield of any given wafer can be maximized.
2400W for a single component is still rather insane even by server-room standards. But 12 or even 18 of them, load-balanced? That starts to “make sense” for higher-end data centers or even on-prem server rooms at the more tech-oriented companies.
Yup, I presume this is their answer to the cables burning. Divide the wattage between more wires
The real answer to the burning cables is to divide the wattage between the six wires on a single connector, which most of the 50-series cards don’t do. That results in ~15 amps across a single scorching cable.
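To put rough numbers on it (a sketch assuming 600W at 12V over six current-carrying wires; the three-wire imbalance is just an illustration, not a measurement):

```python
# 600 W at 12 V = 50 A total. Balanced vs. badly imbalanced per-wire current.
TOTAL_WATTS, VOLTS, WIRES = 600, 12, 6

total_amps = TOTAL_WATTS / VOLTS  # 50 A total across the connector
print(f"balanced: {total_amps / WIRES:.1f} A per wire")     # -> 8.3 A, within spec

# If the card doesn't balance and most current funnels through ~3 wires:
print(f"through 3 wires: {total_amps / 3:.1f} A per wire")  # -> 16.7 A, enough to scorch a pin
```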
They could have just used normal 8 pin connectors in that case.
That’s almost 1W per dollar!
I, for one, would rather just see them use a couple of 2/0 AWG welding cables, bolted onto a 5mm copper plate on the board. If you need 200 amps, make it look like 200 amps.
PCI bus bar on top.
With 2.4kW you can just use it as a space heater, and a strong one at that!
If they put 2x 12-pin HighPower connectors on it, they wouldn’t be burning up, because each would be delivering just 300W. But they explicitly don’t allow board partners to do it themselves, because NVIDIA is a bunch of controlling assholes.
They could even have used it as part of the marketing: “So powerful it needs 2 connectors.”
If they need 2x 12-pin connectors they could go with the standard mobo ATX24: stable, sturdy… hell, at this (power) point they could just use a 220V cord.
Are there many consumer PSUs with two ATX24 connectors? A quick search only shows Phanteks making one.
Nvidia controls enough of the market that more PSU manufacturers would if Nvidia needed it.