What does the high power consumption of an RTX 5090 mean for me?

I would like to know what the high power consumption will cost me. I keep hearing that the RTX 5090 draws 650 watts.

How much would I have to pay for electricity in Baden-Württemberg with EnBW or Stuttgart Netz? According to a study, gamers play an average of 4 hours a day.

Based on that, I would make a purchase decision.

11 Answers
Comp4ny
2 months ago

You cannot base your purchase decision on the graphics card alone. What you cite is the maximum consumption, not the average consumption, and the average is significantly lower.

In addition, the other components have to be taken into account as well.

Either way, such a PC is an expensive hobby, and honestly you don’t need an RTX 5090. What for?

A graphics card like that also calls for a solid motherboard and a matching CPU. At the current price level, and once the card is actually out, these three components alone will easily run you around 1,500 – 2,000 €.

On top of that, the power supply should be rated at a minimum of 1,000 to 1,200 watts.

RedPanther
2 months ago
  • 4 h × 365 days = 1,460 h per year
  • 1,460 h × 650 W = 949,000 Wh = 949 kWh
  • 949 kWh × 0.35 €/kWh = 332.15 €

That’s how the calculation goes.

  • Whether you actually run your graphics card at full load for 4 h a day, only you know.
  • Whether you game 365 days a year, only you know.
  • Whether you pay 35 ct/kWh, only you know.

Accordingly, your cost calculation deviates from my numerical example.
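
A minimal sketch of this calculation in Python, using the same assumed figures (650 W, 4 h/day, 365 days, 0.35 €/kWh); swap in your own values:

```python
# Yearly electricity cost for a single component, following the example above.
# All input values are the assumptions from this answer, not measurements.

def yearly_cost(watts: float, hours_per_day: float, days_per_year: int, price_per_kwh: float) -> float:
    """Return the yearly electricity cost in euros."""
    kwh_per_year = watts / 1000 * hours_per_day * days_per_year
    return kwh_per_year * price_per_kwh

if __name__ == "__main__":
    cost = yearly_cost(watts=650, hours_per_day=4, days_per_year=365, price_per_kwh=0.35)
    print(f"{cost:.2f} € per year")  # 332.15 € per year
```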

“Based on this, I would make a purchase decision.”

That doesn’t make sense.

If you need a 5090, you’re already gaming now, for example on a 4090. And its power consumption isn’t that much lower, so with the 5090 you’d only incur slightly higher electricity costs than you do already.

In addition, the rest of the PC also consumes electricity; the whole system may draw almost twice as much as the graphics card alone. You’re not going to pair a 5090 with a 14100T in a bare-bones case with only two fans.

And last but not least: anyone who can afford a system that a 5090 fits into cares enough about this hobby that a few hundred euros of annual electricity costs are unlikely to put them off.

Surbasax
2 months ago

650 W at 4 hours/day and 0.30 € per kWh:

  • per day: 0.78 €
  • per week: 5.46 €
  • per year: 284.89 €

Or have it calculated for you ;)

https://computeronline.de/plugs/current costs.php

Maik2325
2 months ago

Aside from that, an ordinary consumer wouldn’t buy it anyway; even I know that the card alone is going to cost around 2,000 €.

In any case, you’d have to buy a new power supply, and by that I mean a 1,000-watt one.

Of course you won’t use the full wattage, but if you are rendering video, for example, you may well hit 500 watts.

You’re asking about the electricity costs NOW?
Then the question of whether you would buy the card is pretty much already answered.

No one who worries about electricity costs would buy a card like this.

Furo0815
2 months ago

If you want to use the PC for more than just gaming, the high idle draw of almost 50 watts is also an issue; that is almost twice the idle consumption of a 4090. I don’t know why it pulls so much at idle, but it was measured by Gamers Nexus, one of the few competent tech YouTubers I trust when it comes to measurements like these.

AlbertHnsl
2 months ago

It’s not only the graphics card that draws power, but also the rest of the PC. I’m assuming a total of 1,000 W here.

Here’s an electricity cost calculator where you can enter everything and get a result: https://www.stromkonsuminfo.de/stromkostencomputer.php

For 1,000 W of power drawn 7 days a week for 4 hours each day, at a price of 0.33 euros per kWh, that comes to about 480 euros per year.

However, this maximum of 1,000 watts is only drawn when the PC runs under full load, which is not constantly the case in practice.
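
To illustrate that point, here is a small sketch that weights the cost by a load profile instead of assuming full load all the time; the watt figures and hour shares below are purely illustrative assumptions, not measurements:

```python
# Weighted-average electricity cost for a hypothetical daily load profile.
# The (watts, hours) pairs are illustrative guesses, not measured values.

PRICE_PER_KWH = 0.33  # €/kWh, same price as in the example above

load_profile = [
    (1000, 1.0),  # full system load, e.g. video rendering
    (600, 3.0),   # typical gaming load, well below the maximum
    (80, 4.0),    # idle / desktop use
]

kwh_per_day = sum(watts / 1000 * hours for watts, hours in load_profile)
yearly_cost = kwh_per_day * 365 * PRICE_PER_KWH
print(f"{kwh_per_day:.2f} kWh per day, {yearly_cost:.2f} € per year")
```

With these assumed numbers the yearly cost lands well below the 480-euro full-load figure, which is the point of the remark above.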

Anson12
2 months ago

Because new graphics cards keep getting more powerful, their power consumption also increases. The RTX 4090 already draws over 450 watts, and you can assume the 5090 needs at least 500 watts. With a 650 W power supply you have no chance.

SirLucifer97
2 months ago

https://www.youtube.com/watch?v=JpPyEBuGSzs

In their test it drew ~520 W on average, so that’s ~2.1 kWh for the graphics card alone over the 4 hours. You can work out the rest yourself once you look up your current electricity prices.

Jensen1970
2 months ago

1,000 watts for one hour costs 40 cents (at 0.40 €/kWh).

https://computeronline.de/plugs/current costs.php

zocker0796
2 months ago

1,000 watts for 1 hour costs 40 cents. Just go on Bürgergeld.

Comp4ny
2 months ago
Reply to  zocker0796

Nonsense.

Electricity costs have to be paid out of the standard benefit rate (Regelsatz) and are not part of the accommodation costs (KdU).