PC Gaming/building thread


I'm convinced the MSI 5080 box photo that's going around is a fake (even allowing for image upscaling, the text looks AI-generated), but when I see anything from the known Kopite and Chiphell forum accounts I pay a bit more attention.

Since RTX 50 is on the same node as 40 I don't think we can expect any massive generational uplifts, so it makes sense that they're going to pump more power into them instead. I'm still going to wait until the 5090 is released to make my decision. The previous official Galax 5090 leak that was taken down had "neural rendering" as a bullet point, but that's missing from the (IMO fake) MSI 5080 box. If that leak is accurate, then together with 32GB of VRAM it's all the more reason to think that while the 5090 might be the best gaming card, it's really aimed at prosumer and AI use cases. For once price is absolutely no consideration, but if the 5080 can perform considerably better than a 4090 (I'd love 30%, but on the same node I'm expecting around 10-20%) I'd rather grab that than a prosumer AI card that wants 575W of juice. That's despite the 5080 not looking like it packs the specs an 80-class card should have.

All will be revealed in a few days.

 


I've bought all my parts now, aside from the GPU. The 9800X3D and the motherboard are waiting on stock but orders should ship by 13th Jan anyway. I'll probably just be going for the RTX 5080... I mean it's going to be a massive upgrade on my old PC anyway, and I'm not planning to upgrade from my 1440p 165Hz monitor any time soon at least.

I could afford the 5090, I just dunno if I wanna spend that much $$$... especially without a 4K monitor. I ordered a 1000W ATX 3.1 PSU which most likely could handle the 5090, but ehh, what's it gonna be, like $3.5k?
 
If pricing is to be believed then expect the 5090 to be $4-4.5k. I'm the same in that money isn't an issue, and I even bought a 1200W ATX 3.1 knowing that since they're not upgrading the node they'll just feed it more power than the 4000 series, so it will have a higher power requirement (and to future proof any future RTX 6000 purchase), but the more I look at it the more I don't want to spend it. The 5080 may launch at the same price as the 4080, so expect low to mid $2ks. Sure, the 4090 is better than the 4080, but besides Alan Wake 2, where a 4090 would have been nice, there hasn't been anything I wasn't able to run at the resolution and settings I want. The 5000 series is going to come with better RT performance, so that will pick up a lot of the improvement (along with the rumoured DLSS 4), and I don't care about the 16GB VRAM as that's what's on the 4080 and I haven't had any issues with it at 3440x1440. Those complaining about 16GB of VRAM mostly play at 4K with no upscaling, and that's the only reason I can think of that the 5090 would be more beneficial for gaming.
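
To put rough numbers on the power side of that, here's a back-of-envelope sketch. Every wattage in it is an assumption: the 575W figure is still just a rumour, and the CPU/system numbers are typical estimates for a 9800X3D build, not measurements.

```python
# Rough PSU headroom check. All wattages are assumptions: the GPU figure
# is the rumoured 575 W board power for the 5090, the rest are typical
# estimates for a 9800X3D gaming build, not measured numbers.
GPU_W = 575             # rumoured RTX 5090 board power
CPU_W = 120             # assumed 9800X3D draw under a gaming load
REST_W = 75             # assumed board, RAM, SSDs, fans
TRANSIENT_FACTOR = 1.3  # assumed allowance for short spikes above steady draw

steady = GPU_W + CPU_W + REST_W
spike = steady * TRANSIENT_FACTOR

for psu_w in (1000, 1200):
    print(f"{psu_w} W PSU: steady ~{steady} W, spikes ~{spike:.0f} W, "
          f"headroom over spikes ~{psu_w - spike:.0f} W")
```

On those (very loose) numbers a 1000W unit is right on the line once transients are counted, while the 1200W keeps clear margin; ATX 3.x supplies are designed to tolerate short GPU power excursions, so this is on the conservative side.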

I'm constantly flip flopping, but at this stage I'm leaning back towards the 5080. If in 12 months the 5080 Ti is released on GB202 with 24GB of GDDR7 as a cut-down 5090, as opposed to the 5080 which is a full GB203-400, and for some reason VRAM suddenly becomes an issue, then I'll consider it then.
 
AMD pulling an NVIDIA and fans/owners not happy


It's hilarious watching morons get pissed over this, as though AMD was some weird hardware messiah that actually gives a shit about their opinions and has betrayed them, instead of just being another multi-billion-dollar company looking to make money.

At least they can also be mad at Nvidia, as it seems 16GB is confirmed for the 5080.
 
What sort of impact will 16GB of VRAM in the 5080 have in the real world?

Assuming the 5090 will be out of reach for 99% of gamers, will developers, outside of a handful like CD Projekt who like pushing things to extremes (in which case a 5090 won't be enough to crank Witcher 4 settings to their maximum anyway...), really target 32GB?
 
It'll only be speculation until real-world numbers come out (around Jan 21st apparently). I imagine it'll depend on the res you're playing at: 4K gaming may be tough down the track, 1440p should be OK. There could be some trickery happening with their "AI" stuff and other systems that could help cover the amount of VRAM used, along with the speed boost going to GDDR7.
 
16GB will be fine for now. If you dig into the complaints you'll find people who want to play at 4K without upscaling and frame gen. 10GB on the 3080 was genuinely low, but that has only become an issue with some AAA games of the last 12 months, so I'd expect 16GB to suffice for 1440p for longer. We won't know how much the faster memory will compensate just yet either. IMO the real complaints come from how much this 80 is likely to be cut down from the 90 compared to previous gens. Many don't want to spend what the 5090 is going to cost, so they're complaining that the 5080 isn't going to be what they want, and the complaints snowball from there. The 5080 is still going to be poor value for performance, but at the end of the day it's still going to be the second-best graphics card you can buy. A more "future proof" 5080 Super or Ti will certainly arrive in the next 12 months with more memory if that's going to be an issue for anyone.
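
To put a rough number on why the 4K-no-upscaling crowd hits the wall first, here's a back-of-envelope sketch. The bytes-per-pixel figure is an assumption covering resolution-dependent buffers only; textures, geometry and BVH data don't scale with resolution, so this is far from the whole VRAM picture.

```python
# Back-of-envelope: how much the resolution-dependent buffers (G-buffer,
# depth, post-processing targets) grow from 1440p to 4K.
# BYTES_PER_PIXEL is an assumed lump sum, not taken from any real engine.
RESOLUTIONS = {
    "2560x1440": (2560, 1440),
    "3440x1440": (3440, 1440),
    "3840x2160 (4K)": (3840, 2160),
}
BYTES_PER_PIXEL = 64  # assumed total across all per-pixel render targets

for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    mib = pixels * BYTES_PER_PIXEL / 1024**2
    print(f"{name}: {pixels / 1e6:.1f} MP -> ~{mib:.0f} MiB of render targets")
```

4K is about 2.25x the pixels of 1440p, so anything allocated per pixel grows by the same factor, while the 16GB pool obviously doesn't.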
 
I really like the cooler designs that Radeons have been getting the last few generations. Some of those 9070xt designs look really nice. The closest you can get with any NVIDIA is a Zotac. Assuming NVIDIA AIBs stick to their previous designs I'll probably be looking at a Zotac Super Trinity or MSI Gaming Trio.

What's funny though is AMD waiting for NVIDIA to announce their pricing before doing so themselves.
 


5080 less than I thought, 5090 less than I thought but more than I hoped.

$1999 USD is about $3200 AUD; add the extra an AIB will cost, the extra charges we pay for tech here, plus GST, and it's probably around $3700 up to $4000 for something like an ASUS. Less than I thought, but I think the 5080 becomes even more tasty at that $999 US. The 5070 allegedly matching 4090 performance is probably conditional on DLSS, but it supports my hopes that the 5080 will be about +20% over the 4090. Hopefully +10% in raster.
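
For anyone wanting to sanity-check that maths, here's a quick sketch of it. The exchange rate, AIB premium and local retail margin are all assumptions; only the 10% GST is a given.

```python
# Rough AUD retail estimate from the announced USD MSRPs. The exchange
# rate and the AIB/retail markup are assumptions; GST in Australia is 10%.
USD_MSRP = {"RTX 5090": 1999, "RTX 5080": 999}
AUD_PER_USD = 1.60   # assumed exchange rate
AIB_MARKUP = 1.10    # assumed premium for a partner card (e.g. an ASUS model)
GST = 1.10           # Australian GST

for card, usd in USD_MSRP.items():
    aud = usd * AUD_PER_USD * AIB_MARKUP * GST
    print(f"{card}: ${usd} USD MSRP -> roughly ${aud:,.0f} AUD at retail")
```

On those guesses that lands the 5090 in the high $3000s and a premium 5080 just under $2000, so the $3700-$4000 estimate above looks about right if the markup assumption holds.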
 
Yeah, clearly their numbers are based on using its AI features to their absolute limit. Definitely interested in some real benchmarks on the 5070 Ti and 5080 in a couple of weeks.
 
So weird that Jensen didn't mention memory or show a single slide with a performance comparison. Not even a date. Maybe he didn't feel it was necessary because everything has already been leaked, but still, not a single mention.

Keen to learn more, because the 5080 at that price is just about a lock for me, and pending reviews it looks like I'll be stalking Scorptec at whatever time sales go live.
 
I guess he couldn't think of ways to say "AI" enough times while talking about them.

Nvidia just stated a Jan 30 release date.

 
