
NVIDIA 4000 Series

Associate
Joined
18 Apr 2010
Posts
389
I’m looking forward to seeing if I can push max frame rate on the new Alienware QD-OLED monitor with this! It'll be a good year for hardware upgrades
 
Soldato
Joined
31 Oct 2002
Posts
8,659
Looking forward to the 4000 series. I'm hoping for a 4080 with 16GB VRAM, 280W TDP, that's 30% faster than a 3090. That would suit me well.

Hope we don't see TDP increases on the whole range again, we need efficiency increases, though this is extremely difficult to achieve.
 
Permabanned
Joined
31 Aug 2013
Posts
3,364
Location
Scotland
https://www.guru3d.com/news-story/g...e-up-to-750wit-would-arrive-in-september.html

There are fresh reports concerning the GeForce RTX 4090, centred on its TDP and release date. On one hand, the card is said to deliver 2.5 times the performance of its predecessor; on the other, it may also draw twice as much power.

The GeForce RTX 4080 Ti and RTX 4090 will reportedly demand much more power, 750W to 800W in the latter's case. Greymon55, a leaker who anticipated the introduction of a large number of current AMD and Nvidia products, revealed the first details. According to him, the RTX 4000 series will be available in September, with TGP (Total Graphics Power) ratings of 450W, 650W, and 850W, although he emphasised that these are not final specs and may change. The information was backed up in the comments section by another well-known leaker, "kopite7kimi", whose claims routinely turn out to be accurate. Although it is a rumour, he claims to have heard that the RTX 4080 will consume 450 watts, the RTX 4080 Ti 600 watts, and the RTX 4090 800 watts, suggesting these figures have already circulated among numerous sources.

Following these revelations, it seems far more logical to adopt the new PCI-E 5.0 GPU power connector, which supplies up to 600W per connector. An RTX 4090 could be powered with just two of these, whereas with standard 8-pin PCI-E connectors rated at 150W each we would need five or even six, which is clearly not practical.

This may not be an issue in the United States, where a kilowatt-hour of electricity costs an average of 11 US cents. In Europe, however, many people will be doing the maths: there, around 35 euro cents per kilowatt-hour is typical.

According to another report, we'll find out in September, as that's when the GeForce RTX 4090 is expected to appear, most likely alongside a few other Lovelace products. Time will tell...

Although I recently purchased a 1000W PSU to cover these short spikes, I'm looking for at least double the 3080's RT performance, running under 300W, for £750 or less. Otherwise I'll be sticking with my 3080 until we can achieve that.
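The connector maths in that quoted article is easy to sanity-check yourself. A minimal sketch, using the rumoured wattages from the article above (not official specs):

```python
# Back-of-envelope check of the connector claims quoted above. The wattages
# come from the rumours in the article, not from any official spec sheet.
def connectors_needed(card_watts, watts_per_connector):
    """Ceiling division: you can't use a fraction of a connector."""
    return -(-card_watts // watts_per_connector)

# Classic 8-pin PCI-E connector, rated at 150W each:
print(connectors_needed(800, 150))  # 6 connectors for a rumoured 800W card

# New PCI-E 5.0 connector, up to 600W each:
print(connectors_needed(800, 600))  # 2 connectors
```

Which matches the article: five or six 8-pins versus just two of the new connectors.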
 
Associate
Joined
22 Nov 2018
Posts
2,408
Is it a deliberate increase in power to make them not worth mining on?

I think the power increase is so they can be clocked higher to beat the competition.

If AMD weren't competing at the high end then Nvidia would lower the voltage and the clock speed. The heatsink would be less chunky. Power wouldn't be an issue.

The solution is to fix the Nvidia 4000 series yourself with an undervolt in MSI Afterburner.
 
Soldato
Joined
22 Oct 2008
Posts
10,920
Location
Belfast
Upgrade when you need to. If games are running poorly at the moment then why wait? Why be miserable because your framerate is so bad?

However, if games are running smoothly then wait until they don't run smoothly. Don't upgrade for the sake of it. If your current card is still running smoothly when Nvidia 4000 launches then there's no point in upgrading.

Upgrade when YOU want to. Not when Nvidia releases something and they want you to upgrade.

10000000% this
 
Associate
Joined
2 Dec 2020
Posts
251
Location
Boarding a Rocket Ship
Upgrade when you need to. If games are running poorly at the moment then why wait? Why be miserable because your framerate is so bad?

However, if games are running smoothly then wait until they don't run smoothly. Don't upgrade for the sake of it. If your current card is still running smoothly when Nvidia 4000 launches then there's no point in upgrading.

Upgrade when YOU want to. Not when Nvidia releases something and they want you to upgrade.
This 100%. I'm running a 1080 Ti on custom water, everything on custom water.
I'm at 1440p, no need to upgrade yet whatsoever. However, when I do upgrade I will jump to the 3090. Don't know when that will be, could be 2-3 years yet
 
Associate
Joined
27 Sep 2008
Posts
926
This 100%. I'm running a 1080 Ti on custom water, everything on custom water.
I'm at 1440p, no need to upgrade yet whatsoever

I'm running 1440p with a 1080Ti on "standard air", everything on standard air. :D

See no reason to upgrade whatsoever.
 
Soldato
Joined
18 Feb 2015
Posts
5,934
Rumour is a September release date; I've not found anything concrete yet beyond a screengrab of an offer

I guess I should wait for the 4000 series? I have a 1080 Ti, and I can get a 3080 Ti... but why buy now when the 4000 series is coming?
No one really knows for sure, all we have are whispers from leakers. Even at the end of the year I wouldn't expect a big launch or much availability, I think it's really 2023 when we get a real chance at buying a new card.

Looking forward to the 4000 series. I'm hoping for a 4080 with 16GB VRAM, 280W TDP, that's 30% faster than a 3090. That would suit me well.

Hope we don't see TDP increases on the whole range again, we need efficiency increases, though this is extremely difficult to achieve.

You've read my mind. That much performance would pretty much cap 60 fps Cyberpunk 2077 at 4K DLSS Performance w/ all RT running. 280W is probably too little though, I expect 350W, and they'll probably skimp on VRAM and keep a 320-bit solution w/ GDDR6X. Hopefully they'll offer a 20 GB version too, as that would be available to them now, unlike at Ampere's launch (2 GB VRAM modules). Optimistically that would be $999, but I think that might be too low. Not sure they'd leave such a big gap between a 4080 & 4090, because I expect a 4090 to easily start at $1999.
 
Associate
Joined
4 Feb 2009
Posts
1,249
450+ watts. I'm struggling to imagine buying something like that. 800W just feels insane. That would put the entire PC at about 1kW, one of the most power-intensive objects in the house. Kettle: 2.5kW. Hoover: 2kW.

Am I the only one looking at this thinking "That's insane. You are joking, right??"?
 
Soldato
Joined
19 Jan 2010
Posts
4,014
I have a 3090 and will no doubt move it out of the way for a 4090 just for the shills. I wanted 60fps at 4k and now I've got that I want 120fps at 4k. I want I want I want
 
Associate
Joined
25 Apr 2017
Posts
848
I really hope they launch the 4080 Ti in September as well. I hate waiting a year for the Ti, and the 4090 would likely be $3000.
 
Soldato
Joined
22 Oct 2008
Posts
10,920
Location
Belfast
With the cost of electricity getting out of control, running costs will be a HUGE consideration this time around. Whereas before, let's be honest here, people didn't care about running costs; they just wanted the fastest GPU they could get.

Next gen, with the cost of leccy going mental, you'll need to consider the running costs of a 500, 600, 700W card and whether it'll add a significant amount to your monthly leccy bills, AFTER you buy the card.

Buying the card in the first place will be brutal on your bank account, then there's the potential £200+ a year in extra leccy costs on top. That'll be in people's thinking for the first time that I can ever remember.
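The running-cost sums are simple enough to do yourself. A rough sketch, where the extra draw, the hours per day, and the unit price are all made-up illustration figures, not real tariff numbers:

```python
# Rough annual running-cost estimate for a power-hungry card. All inputs are
# assumptions for illustration: extra draw vs your current card, gaming hours
# per day, and unit price in pence per kWh.
def annual_cost_gbp(extra_watts, hours_per_day, pence_per_kwh):
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * pence_per_kwh / 100  # convert pence to pounds

# e.g. a card drawing 500W more than your current one, 4 hours a day, 35p/kWh:
print(f"£{annual_cost_gbp(500, 4, 35):.2f} per year")  # roughly £255 a year
```

So the £200+ a year figure is very plausible at current prices for a heavy gamer moving to one of these rumoured TDPs.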
 
Soldato
Joined
27 Jun 2006
Posts
11,264
Location
Not here
I have a 3090 and will no doubt move it out of the way for a 4090 just for the shills. I wanted 60fps at 4k and now I've got that I want 120fps at 4k. I want I want I want

Same here. Then again, I got my 3090 on release and haven't used it much. Probably not a good idea for me to rush out and buy the 4090 like before.
 