Rumor mill: Recent reports have sparked concern that Nvidia's upcoming series of graphics cards might consume significantly more power than the RTX 4000 series. A breakdown of MSI's latest power supplies has fueled speculation that Nvidia's new flagship might handle the extra load with two power connectors.

Nvidia's current flagship GPU, the GeForce RTX 4090, is already infamous for its issues with power supplies. A new product lineup from MSI suggests that its successor, likely arriving next year, could invite further PSU controversy.

TweakTown recently saw MSI's new high-wattage MEG Ai1600T PCIe 5.0 and MPG A1250 power supplies at the company's Shenzhen factory. Surprisingly, they include two 16-pin 12V-2x6 power connectors, and MSI says it added the additional plug to accommodate unnamed next-generation GPUs.

Nvidia, AMD, and Intel are expected to introduce new generations of graphics cards between late 2024 and early 2025, but recent rumors have singled out Nvidia's upcoming lineup for increased power consumption. Its upcoming flagship – likely called the RTX 5090 – could draw up to 600W, a dramatic increase over the 4090's 450W. The GPU's smaller brethren might also consume more power than their predecessors.

If the reports prove accurate, splitting the higher wattage across two cables might prove safer, potentially helping Nvidia avoid the PSU issues plaguing the 4090. Since late 2022, RTX 4090 owners have complained that the GPU fries power connectors.
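If Nvidia does go that route, some quick back-of-envelope math shows why. The sketch below is only an illustration: the 12 V rail, the six current-carrying pin pairs per connector, and the roughly 9.5 A per-contact figure are assumptions on our part, not numbers confirmed by Nvidia or MSI.

```python
# Back-of-envelope: per-pin current on a 12V-2x6 connector at a given board power.
# Assumed values (not confirmed by Nvidia or MSI): a 12 V rail, 6 current-carrying
# pin pairs per connector, and a ~9.5 A per-contact rating.

RAIL_VOLTAGE_V = 12.0
POWER_PINS_PER_CONNECTOR = 6
ASSUMED_PIN_RATING_A = 9.5  # assumption -- check the actual connector spec

def per_pin_current(board_power_w: float, connectors: int) -> float:
    """Current per power pin when the load is split evenly across connectors."""
    total_current_a = board_power_w / RAIL_VOLTAGE_V
    return total_current_a / (connectors * POWER_PINS_PER_CONNECTOR)

for connectors in (1, 2):
    amps = per_pin_current(600, connectors)
    print(f"{connectors} connector(s): {amps:.1f} A per pin "
          f"(assumed per-pin rating: {ASSUMED_PIN_RATING_A} A)")
```

Under those assumptions, 600W through one connector works out to roughly 8.3 A per pin, while splitting it across two drops that to about 4.2 A, leaving far more headroom at every contact.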

Nvidia concluded its investigation by attributing the issue to user error and acknowledged only a few dozen cases. However, as of April, repair shop Northridge Fix claimed to be receiving hundreds of melted 4090s per month, suggesting a more fundamental design problem.

If Nvidia's solution is to distribute the load between two cables, then anyone purchasing an RTX 5090 might also need to add the cost of a high-performance PSU to the GPU's price tag, which itself could exceed the 4090's $1,600 MSRP. Whether other RTX 5000 cards will require an upgrade from existing ATX 3.0 power supplies remains unclear.

However, MSI's new offerings boast impressive stats for customers shopping for a high-end PC build. The 1,600W MEG Ai1600T is the company's first power supply to earn an 80 Plus Titanium rating. Moreover, users could theoretically use it to power two lower-end GPUs.

According to the latest information, Nvidia is expected to introduce RTX 5000, codenamed Blackwell, at CES 2025. The RTX 5080 could launch before the 5090.

 
Well, similar to this generation, the xx90 series seems to make no sense other than if you're an influencer getting paid to shill.
 
If so, it means they are squeezing out the very last drop for this performance.
So, maybe AMD just isn't trying hard enough?
 
Sounds like a lack of planning or understanding on the engineers' part .... unless of course the management decided to interfere and not listen to their own experts .... I'm betting on the 2nd alternative .....
 
At this point it would be a better solution for the card to just have its own external power brick, like 3dfx did with the Voodoo 5 cards in the 2000s.
 
I wouldn't want this card even if they gave it to me for free.. or well... I'd take it just to sell it :D
 
I'm soooo surprised.......said absolutely friggin NOBODY !

This sounds like a conspiracy between nGreediya and the psu mfgr's to get people to spend major $$$ just for the privilege of having the "latest & greatest" :(

Reminds me of that saying: "absolute power corrupts absolutely"

Not that Jacket Man gives a crap about you or whether or not you can pay your electric bill (which will go up exponentially if you run a 5xxx card & its requisite 1.5-2kw PSU) AND put food on the table, just as long as you allow him to buy even more new jackets !

But anyways, maybe adding the 2nd connector IS the safer alternative for these hungry AF cards !
 
I love all the young pups here who freak out over GPU power use "waaah muh watts noooo you cant", then nvidia puts two connectors to split the load and everyone "waaah nooo they cant its too much the PC is over aaaaah".

None of you were around in the SLI days, when you had dual-PSU cases, 2kW power supplies, and supplemental power supplies that fit in a 5.25" drive bay to feed another 450 watts to your GPUs. Look up the power use of a water-cooled 590, then double that for a quad SLI setup, with another 570/580 for PhysX and an OCed i7-980. You'll have a conniption.
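For a rough sense of scale, here's a quick tally using the published stock board TDPs from that era; these are nominal figures, so an overclocked, water-cooled rig like the one described would have pulled noticeably more.

```python
# Rough tally of the quad-SLI-era rig described above, using published stock TDPs.
# Nominal board powers only -- overclocking and watercooling push real draw higher.
stock_tdp_w = {
    "2x GTX 590 (each a dual-GPU card) for quad SLI": 2 * 365,
    "GTX 580 dedicated to PhysX (a GTX 570 would be ~219 W)": 244,
    "Core i7-980 at stock TDP, before overclocking": 130,
}

for part, watts in stock_tdp_w.items():
    print(f"{part}: {watts} W")
print(f"GPU + CPU subtotal: {sum(stock_tdp_w.values())} W")
```

That's over 1.1 kW for the GPUs and CPU alone, before drives, fans, or PSU losses.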

Not that Jacket Man gives a crap about you or whether or not you can pay your electric bill (which will go up exponentially if you run a 5xxx card & its requisite 1.5-2kw PSU) AND put food on the table, just as long as you allow him to buy even more new jackets !
If you're worried about power draw, you can't afford a gaming GPU. Period. People who buy Ferraris and whine about gas prices, or people who buy McMansions then whine about property taxes, get laughed at for a reason.
 
This sounds like a conspiracy between nGreediya and the psu mfgr's to get people to spend major $$$ just for the privilege of having the "latest & greatest" :(
Well, the other option is to NOT buy one. Nobody is being held at gunpoint here.
 
No, the 5090 won't require 2x 16-pin. Maybe some beefed-up custom cards will have 2x 16-pin but won't utilize it all, or even close.

All these rumours exist because some new 1600W PSUs have 2x 16-pin for the GPU. That does not mean the 5090 will >require< 2x 16-pin.

The 5090 will use a lot of power though, around 550 watts at stock, but it will be an absolute beast for years to come. People that need this kind of power don't care if it uses 400, 500, or 600 watts.

The 5080 will deliver 4090 performance for less: 1,200 dollars, 350 watts. That's all it needs to do, really. AMD is not competing, so this is what we get.

If Nvidia were feeling the slightest heat from AMD, they would have gone with TSMC N3 instead of re-using 4N, aka 5nm - but AMD has left high-end GPUs anyway. RDNA4 looks like a joke already.

The good thing about this is that my 4090 stays a top GPU for 2 more years. The 1,500 dollars I paid at release was a great price considering I will be looking at around 5 years of top-tier performance when I upgrade to a 6090/6080 in 2026/2027.
 
Hmm let me think....spending $150-$200 on PSU vs. having $1500 GPU fried... gee, tough choice...
 
Leatherman disapproves of your poverty. If you can't afford a 90 class card then you don't deserve to game.
While we all know you're joking, the truly sad thing is there's some truth to what you say. Jensen has completely lost touch with reality and needs a smackdown.
 
Well, the other option is to NOT buy one. Nobody is being held at gunpoint here.
Not to mention that most people either can't afford it, or don't want to.
 
As if one wasn't enough of a problem....
 
No chance the 5090 needs 2x 16-pin!! Nvidia is not that stupid.

A single 16-pin can deliver 600 watts with up to 1800-watt spikes; this is plenty even for factory-overclocked 5090s. The Founders Edition is rumoured to be 550 watts!!
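For what it's worth, that spike figure lines up with the commonly cited ATX 3.0 allowance for short power excursions of roughly 3x the sustained connector power. A trivial sanity check, with the 3x multiplier being an assumption here rather than a quote from the spec:

```python
# Sanity check on the figures above. The ~3x short-duration excursion multiplier
# is an assumption based on commonly cited ATX 3.0 guidance, not quoted from the spec.
SUSTAINED_W = 600
EXCURSION_MULTIPLIER = 3

print(f"Sustained connector power: {SUSTAINED_W} W")
print(f"Allowed momentary spike:   {SUSTAINED_W * EXCURSION_MULTIPLIER} W")  # 1800 W
```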
 
As if one wasn't enough of a problem....
Radeon 8000 will use 16-pin as well. This is the future for GPUs!

My 4070 Ti SUPER is running flawlessly with 16-pin! No way I am going back to 2-3x 8-pin!!
 
Are you sure?
Yeah, pretty sure!

After all, the 4000 series is a lot more efficient than AMD's 7000 series!
My 4070 Ti SUPER averages 250 watts and is running cool and quiet! Still yet to see the GPU hit more than 60C, and noise is low.
 
That was a rhetorical question that implied Nvidia isn't that smart.
I guess they are, considering they dominate the GPU market with ease!
 
I guess they are, considering they dominate the GPU market with ease!
That doesn't mean that all the decisions they make are very, if at all, smart.
 
"Hey Ma, I'm turning off the AC cuz I'm turning on my computer."
<click> <whrrrrrrr> <Vrooooom>
Wait! I need the AC because the PC gets the room too hot. <dilemma>
 
Hmm let me think....spending $150-$200 on PSU vs. having $1500 GPU fried... gee, tough choice...
You won't get a PSU with 2x 16-pin for $200.
 
Well played, TechSpot, the ragebait has worked perfectly.
 
