NVIDIA 4000 Series

Associate
Joined
21 Apr 2007
Posts
2,076
AMD and Nvidia are just trading stealth PR jibes via social media atm, which is getting picked up on in the tech channels; it's all just opinion and speculation atm imho.

The only thing we can be reasonably certain of is that AMD has some tech advantage that is causing concern to Nvidia, going by the way they are behaving and the news coming out. As to how it all plays out in reality, we simply don't know. What we've seen in recent years is more monopolistic and cartel behaviour than anything else, which concerns me, because all it gives consumers is choice, not competition.
 
Associate
Joined
22 Mar 2015
Posts
241
Location
hamshiretown heathamletonhurst
Hear hear, as is tradition. "Rumors" equate to carefully planned pre-marketing drops according to a playbook, and have done for years. There's no shadowy engineer dropping (well-timed) gems for giggles. It's hilarious how people keep missing the patterns every cycle, many of whom have followed this market for years.

And whether it's intentional or not, a two-company market will settle into a monopolistic equilibrium by nature. We need more players to spur competition.
 
Soldato
Joined
21 Jul 2005
Posts
16,750
Location
N.Ireland
People thought that would happen this year with Intel... still waiting for the dGPUs to make that difference to the prices!!

[Image: Waiting-Skeleton.jpg]
 
Soldato
Joined
22 May 2010
Posts
8,300
Going by the leaks so far, it seems the performance difference between a 4080 and 4090 is going to be much bigger this time round. If it's true, it looks like the 4080 is going to be more like a 3070, with slower GDDR6 memory instead of GDDR6X and a different chip altogether compared to the 4090. This will leave room for a 4080 Ti, which will be a cut-down 4090, which is what a 3080/3080 Ti is right now to a 3090, if that makes sense?

Wonder how that will be priced then if the 4080 is gonna be configured like that.

Only going by what I've read and watched so far.
 
Associate
Joined
20 Aug 2019
Posts
2,181
Location
SW Florida
I have taken naming schemes with a grain of salt since Turing.

I want to know how it performs and what it costs: how much performance, and at what price point?

After living with a 400W 3080 Ti for a while, I am also interested in how much power a given card needs in order to provide a given amount of performance.
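That perf-per-watt question is easy to put rough numbers on. A minimal sketch, where the card names, frame rates and board powers are made-up placeholders rather than real specs or leaked figures:

```python
# Rough perf-per-watt comparison between two cards.
# All numbers here are hypothetical placeholders, not real specs or leaks.
cards = {
    "Card A": {"avg_fps": 100, "board_power_w": 400},
    "Card B": {"avg_fps": 150, "board_power_w": 450},
}

for name, c in cards.items():
    # Average frames per second delivered per watt of board power.
    fps_per_watt = c["avg_fps"] / c["board_power_w"]
    print(f"{name}: {fps_per_watt:.3f} fps/W")
```

On these placeholder numbers the faster card still wins on efficiency, but not by anywhere near as much as its raw fps lead suggests.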
 
Soldato
Joined
21 Jul 2005
Posts
16,750
Location
N.Ireland
Unless there is killer performance like the 2x touted by the rumour mill, it's not good if the power requirements are 1.5x and you need a new PSU. The price of electricity is not great either, so a good chunk of people won't bother unless it's an attractive, worthwhile option.
 
Caporegime
Joined
8 Sep 2005
Posts
26,884
Location
Utopia
Unless there is killer performance like the 2x touted by the rumour mill, it's not good if the power requirements are 1.5x and you need a new PSU. The price of electricity is not great either, so a good chunk of people won't bother unless it's an attractive, worthwhile option.
If there is one thing history has repeatedly shown, it is that each time this situation crops up with newer, more power-hungry cards, the enthusiasts who can afford high-end cards WILL bother, as they chase performance and don't give a crap about new PSU or electricity costs. Saying people won't buy them unless there is a 2x performance increase is also silly as that is just not realistic or expected; people will buy the cards for a +50% performance increase which is still very respectable and a huge boost in minimum frames at higher resolutions.
 
Associate
Joined
13 Jan 2018
Posts
1,057
Going by the leaks so far, it seems the performance difference between a 4080 and 4090 is going to be much bigger this time round. If it's true, it looks like the 4080 is going to be more like a 3070
This is what I thought. The lineup is shaping up more like Turing, with the 4090 the only card having a decent performance bump over the previous gen. Considering the 3080 was non-existent this time round, having the 4080 dumbed down this gen makes sense.
 
Soldato
Joined
7 Dec 2010
Posts
6,171
Location
Leeds
This is what I thought. The lineup is shaping up more like Turing, with the 4090 the only card having a decent performance bump over the previous gen. Considering the 3080 was non-existent this time round, having the 4080 dumbed down this gen makes sense.

Also, the new rumours state the 4070 is about 20% faster than a 3070; again, it's the latest rumour from the usual suspects, so take it with a pinch of salt.

The 4080 is going to be on AD103 this time, not a 102 chip like the 3090, 3090 Ti and 3080 Ti (the Turing 2080 was on TU104). So basically they are bringing out a new desktop 103 chip for the 4080. We do have a GA103 this generation, but so far it's only used for the laptop 3080 Ti chip.
 
Soldato
Joined
21 Jul 2005
Posts
16,750
Location
N.Ireland
Saying people won't buy them unless there is a 2x performance increase is also silly as that is just not realistic or expected; people will buy the cards for a +50% performance increase which is still very respectable and a huge boost in minimum frames at higher resolutions.

Have you not read the threads on here where they are mentioning 2x (obviously committed to the Ada gen anyway)? It will be laughable if you're only gaining 50% performance when it's likely to be a % more in price and +50% more power consumption.
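Whether "+50% performance for more money and more power" stacks up can be sanity-checked with back-of-the-envelope maths. A sketch with placeholder percentages (the price figure especially is a made-up assumption, not a leak):

```python
# Does the performance uplift outpace the price and power increases?
# All percentages are hypothetical placeholders.
perf_gain = 0.50    # +50% performance vs the old card
price_gain = 0.30   # +30% price (placeholder assumption)
power_gain = 0.50   # +50% power draw

# Ratio > 1 means more performance per pound than the old card.
value_ratio = (1 + perf_gain) / (1 + price_gain)
# Ratio > 1 means more performance per watt than the old card.
efficiency_ratio = (1 + perf_gain) / (1 + power_gain)

print(f"performance per pound vs old card: {value_ratio:.2f}x")
print(f"performance per watt vs old card:  {efficiency_ratio:.2f}x")
```

On those placeholder numbers you'd get a modest value gain but zero efficiency gain, which is exactly the scenario being complained about above.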

 
TNA
Soldato
Joined
13 Mar 2008
Posts
21,664
Location
London
Also, the new rumours state the 4070 is about 20% faster than a 3070; again, it's the latest rumour from the usual suspects, so take it with a pinch of salt.

The 4080 is going to be on AD103 this time, not a 102 chip like the 3090, 3090 Ti and 3080 Ti (the Turing 2080 was on TU104). So basically they are bringing out a new desktop 103 chip for the 4080. We do have a GA103 this generation, but so far it's only used for the laptop 3080 Ti chip.

So the 4080 will be essentially the 4070 disguised as a 4080 so they can charge more?

Like someone else said above, that is why I just look at price for performance rather than names. It's also why I don't go searching for rumours anymore and just see them here. Most, if not all, are full of ****. We will see what's what when the cards are out.
 
Caporegime
Joined
4 Jun 2009
Posts
26,904
So the 4080 will be essentially the 4070 disguised as a 4080 so they can charge more?

Like someone else said above, that is why I just look at price for performance rather than names. It's also why I don't go searching for rumours anymore and just see them here. Most, if not all, are full of ****. We will see what's what when the cards are out.
Exactly, couldn't care less what number is on the GPU; just give me something that beats my 3080 in RT by a good amount for <£700, is all I ask.
 
Caporegime
Joined
8 Sep 2005
Posts
26,884
Location
Utopia
Have you not read the threads on here where they are mentioning 2x (obviously committed to the Ada gen anyway)? It will be laughable if you're only gaining 50% performance when it's likely to be a % more in price and +50% more power consumption.

2x raw performance over a 3090 is unrealistic. If it is 2x performance, it will likely be in specific areas, e.g. ray tracing or similar.
 
Soldato
Joined
21 Jul 2005
Posts
16,750
Location
N.Ireland
Also, the new rumours state the 4070 is about 20% faster than a 3070; again, it's the latest rumour from the usual suspects, so take it with a pinch of salt.

Let's use these hypotheticals...

If the 4070 Ti is on par with a 3090 yet uses more watts, the only improvement they can justify will be ray tracing. Doesn't sound like much of a leap then, does it?
 
Soldato
Joined
22 May 2010
Posts
8,300
This is what I thought. The lineup is shaping up more like Turing, with the 4090 the only card having a decent performance bump over the previous gen. Considering the 3080 was non-existent this time round, having the 4080 dumbed down this gen makes sense.
Begs the question: what are they going to charge for the 4080, the 4080 Ti (inevitably coming to fill the big gap) and the 4090? If the rumours and leaks are true, then charging £650+ for the 4080 will be an absolute rip-off this round lol.
 
Associate
Joined
1 Oct 2020
Posts
593
Everything is getting more expensive; I'm not entirely sure how GPUs are going to avoid it. Also, the dumbing down of the 4080 makes sense: since the 3080 has such a reputation, 4080s will end up selling (price dependent) regardless of performance, purely on reputation. There are also the people who missed out last time who are determined not to this time.
 
Soldato
Joined
22 May 2010
Posts
8,300
Everything is getting more expensive; I'm not entirely sure how GPUs are going to avoid it. Also, the dumbing down of the 4080 makes sense: since the 3080 has such a reputation, 4080s will end up selling (price dependent) regardless of performance, purely on reputation. There are also the people who missed out last time who are determined not to this time.
It will probably be more of an attraction to those who couldn't get hold of a decent 3000-series card, much like Turing when it came out, while people with 1080 Tis were less inclined to jump since the 1080 Ti was such a big step up compared to the previous gen.

Think I'd be more interested to see what AMD brings out this round, as for the 2nd time it will be competing, and competing well, with Nvidia, so hopefully that will level out prices a bit, especially if AMD decides to undercut them (which they normally do).
 
Caporegime
Joined
18 Oct 2002
Posts
26,480
Location
y0 Mommas a**
Begs the question: what are they going to charge for the 4080, the 4080 Ti (inevitably coming to fill the big gap) and the 4090? If the rumours and leaks are true, then charging £650+ for the 4080 will be an absolute rip-off this round lol.
Don't forget that they can add value to a 4000-series card by simply restricting DLSS 3.0 (or a similar feature) to that series, which, let's face it, being Nvidia, they probably will do.
 