AMD RDNA4 in LLVM project

The next-generation gaming graphics architecture, known as GFX12 or RDNA4, is being prepared by AMD. It appears that AMD has now decided which GPUs from the RDNA4 series will be the first to deploy. GFX1200 and GFX1201 are two unnamed graphics processors that have now been spotted in Linux […]
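For anyone who wants to check their own toolchain: here's a minimal sketch, assuming an `llc` binary built with the AMDGPU backend is on your PATH, that asks LLVM which amdgcn processors it knows about. The gfx1200/gfx1201 names should only show up in trees that have merged the GFX12 patches.

```python
# Minimal sketch: query a local LLVM build for the AMDGPU processors it
# recognizes. Assumes `llc` (with the AMDGPU backend) is on PATH; the
# exact -mcpu=help output format can vary between LLVM versions.
import subprocess

def known_amdgpu_cpus() -> set[str]:
    """Return the processor names llc lists for the amdgcn-amd-amdhsa triple."""
    result = subprocess.run(
        ["llc", "-mtriple=amdgcn-amd-amdhsa", "-mcpu=help"],
        input="",  # llc reads a module from stdin; an empty one is fine
        capture_output=True, text=True,
    )
    # The CPU table may land on stdout or stderr depending on version.
    text = result.stdout + result.stderr
    # Table rows look roughly like "  gfx1100  - Select the gfx1100 processor."
    return {line.split()[0] for line in text.splitlines()
            if line.strip().startswith("gfx")}

cpus = known_amdgpu_cpus()
for gpu in ("gfx1100", "gfx1200", "gfx1201"):
    print(gpu, "present" if gpu in cpus else "absent")
```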
So the leakers were correct.
Means the top-end Blackwell offerings will cost an arm and a leg. F for the consumers. That's if Nvidia even puts out a 102 die as the 5090, given the run on AI and just how insane the margins are there. If they're limited by fab capacity, they might just pull another 4070 and sell us a 103 die as the 5090, forcing it down our throats.
Where did you read that in the article?
It's not confirmed verbatim, but it does match the leaks: instead of a halo Navi 41/N4C die for a top-end 8900 XTX (not that they couldn't still give another chip that name), they will launch two different monolithic dies, sooner and closer together, time-wise.
Since this is such a stark change in release cadence compared to N21 and N31, it points to the leaked release strategy being correct.
How? Just because there’s not a gfx1200 target?
There were the same leaks before RDNA 2, claiming AMD wouldn't do better than a 2080 Ti, and we still got a 6900 XT… So I'm waiting for the official release to get an idea.
I’m fine with it. Consumers reap what they sow, basically. AMD is likely gonna drop high end GPUs in general, if not dedicated GPUs completely.
I doubt this. They might just drop TSMC for these and go with Samsung. TSMC is way too expensive for big dies, given how reluctant people are to pay Nvidia-like prices for similar performance from AMD. At the end of the day, AMD doesn't have a significant edge from a cost perspective. Chiplet benefits are cool, but AMD needs an interposer, so the cost advantage might not be as impressive, and the 6nm dies might not be as cheap given that they're still manufactured at TSMC (rough die-cost math below).
So TL;DR: I think they might shift their focus away from the higher-end market, but I doubt they'll abandon it entirely. They might just take a break from it.
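To put rough numbers on the "big monolithic dies on TSMC are expensive" point, here's a back-of-envelope sketch using the standard dies-per-wafer approximation and a simple Poisson yield model. The wafer prices, die sizes, and defect density are illustrative assumptions, not real figures (those are under NDA):

```python
# Back-of-envelope die-cost math. All inputs are assumptions for
# illustration; the formulas are the usual first-order approximations.
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Classic approximation: gross wafer area minus edge loss."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def yield_rate(die_area_mm2: float, defects_per_cm2: float) -> float:
    """Simple Poisson yield model: bigger dies catch more defects."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

def cost_per_good_die(die_area_mm2: float, wafer_price_usd: float,
                      defects_per_cm2: float = 0.07) -> float:
    good = dies_per_wafer(die_area_mm2) * yield_rate(die_area_mm2, defects_per_cm2)
    return wafer_price_usd / good

# Hypothetical wafer prices: ~$17k for an N5-class wafer, ~$9k for N6.
print(f"520 mm^2 monolithic on N5-class: ~${cost_per_good_die(520, 17_000):.0f}/good die")
print(f"300 mm^2 GCD on N5-class:        ~${cost_per_good_die(300, 17_000):.0f}/good die")
print(f" 37 mm^2 MCD on N6:              ~${cost_per_good_die(37, 9_000):.0f}/good die")
```

Even with made-up prices, the big die comes out several times more expensive per good unit than the chiplet pieces, but packaging and the extra 6nm silicon claw some of that advantage back, which is exactly the argument above.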
Which is hilarious, because the 7900 XT(X) is the highest-selling high-end GPU they've ever made, afaik.
That's not backed by AMD's financials or market share.
To be fair, "highest-selling high-end GPU ever" for AMD is still not a lot compared to NV or their own midrange stuff.
Sorry, I misread. I thought you said highest-selling GPU, which is what I've also read elsewhere.
Seems to me the 7800 XT is their best performer, but I'm not sure.
From the last few gens? Not even close to the best ever. Certainly not more than the Radeon 9000 series, HD 4000 series, HD 5000 series, HD 7000 series, and R9 200 series.
The 7900 XTX, one year later, is at 0.19% on Steam.
A year after the 6900 XT released, the 6900 series was at 1.19%.
Given that their last high end before the 6900 was the 390X, which doesn't show up in the Steam hardware survey, while the 7970 was ahead… yeah, not even close.
Their last "high end" before the RX 6900 XT was the Radeon VII, and before that the Vega 64. Yeah, they could only compete with the RTX 2080 and GTX 1080 respectively, but so does the RX 7900 XTX, which can only compete with the RTX 4080…
PS: Also, the high-end AMD GPU before the Vega 64 was the R9 Fury X (the R9 390X was a 290X refresh that launched in the same period). It was quite competitive with the GTX 980 Ti, but its 4 GB of HBM and the necessity of water cooling limited its sales…
The Fury X was an instant no-buy for high-end 4K gamers, due to the measly 4GB of VRAM.
Just as the RTX 4080 should be a no-buy for high-end 4K gamers, due to the measly 16GB of VRAM. In a year’s time, AAA RT-enabled games will suck up >16GB at 4K.
Sure, the 4080 might not age very well, but it's very likely its RT performance will become insufficient before VRAM becomes an issue; even Alan Wake II limits its use of path tracing at max settings and still uses a decent amount of raster.
forgot about vega’s existence tbh
Sadly they weren't that impactful. The Vega 56 was the exception (it competed very well with the GTX 1070, and Nvidia launched the GTX 1070 Ti because of it), but they consumed too much power at stock because of overvolting, and they launched way too late…
Yep, they were honestly decent cards, just the wrong time to launch. Same issue Nvidia had with the 400 series, minus the whole "overpriced to fuck, trying to scam customers" kinda deal they had with the benchmarking requirements for reviewers.
Loved my Vega 56; it performed well, and a little undervolting fixed the power issue big time… didn't feel like I was missing out for "not buying Nvidia".
It really was a good GPU. Sadly, the first impression Vega gave wasn't good: stacked on top of being a year late, it was overvolted and could barely reach the GTX 1080 at launch…
I would've gotten one if they weren't so uncommon in my country. Even Navi was a lot easier to find on the used market, so I ended up getting an RX 5700 that's serving me very well!
Looks sadly at the Radeon VII by my feet.
Again, with the Steam numbers: they're not accurate, as the data is gathered from a pool of people who opt in to the survey. That pool could be 500 or 5,000 people; we wouldn't know (quick numbers on what that does to a 0.19% figure below).
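To make "we wouldn't know" concrete, here's a quick sketch of the 95% confidence half-width for a share as small as the 7900 XTX's 0.19% at a few hypothetical sample sizes (Steam doesn't publish its survey's n, so these are assumptions):

```python
# How trustworthy is a 0.19% share at different (hypothetical) sample sizes?
# 95% half-width via the normal approximation; note the approximation itself
# is shaky when n*p is tiny, which is rather the point.
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95% CI half-width for a sample proportion (normal approximation)."""
    return z * math.sqrt(p * (1 - p) / n)

share = 0.0019  # 0.19%
for n in (500, 5_000, 100_000, 1_000_000):
    print(f"n = {n:>9,}: 0.19% +/- {margin_of_error(share, n) * 100:.3f} pp")
```

At n = 500 the error bar (roughly +/- 0.38 pp) swamps the estimate entirely; it takes sample sizes in the hundreds of thousands before a 0.19% figure means much.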
Steam is also heavily biased towards Nvidia users. I'd like to see stats that discount China, which is flooded with Nvidia GPUs, especially in its internet cafes. The other issue is that Steam seems to count the same cafe PC twice if two survey-opted-in gamers log onto that same PC.
Navi 21 wasn't high end; GA102 was just weak. Also, the 6900 XT was not at 1.19% in late 2021, that's 🧢
A 520 mm² die on 7nm is pretty high end.
That sounds plausible, but only because the total addressable market for GPUs is so much bigger now.
The real measure is the ratio of 7900 XTX sales to RTX 4080 sales, and to the 4090 as well.
I’m pretty sure the 4090 is outselling the 7900 XTX by something like 20:1…
I mean, we can debate "high-end". By RDNA 5, we should have 4K @ 120 FPS as a baseline for all dedicated GPUs. Where do you go after that in consumer GPUs?
While there will always be a small, enthusiast market for super-high-end GPUs, I'm not sure the mainstream will be interested in pushing 240 FPS. Maybe Nvidia sees the writing on the wall, which is why they're pivoting away from consumer-focused GPUs.
And if AMD continues to serve us solid $300-600 dGPUs until then, I think that's still a win. I don't think the market for >$1000 dGPUs is that large anyway.
Yes, but this is also why Nvidia pushes ray tracing.
I mean, all of that assumes requirements won't keep increasing. Ray tracing just artificially increases the performance requirements once you start getting to the top of what's possible. The same will be done once RT starts getting maxed out.
Remember the claims from MLID/RGT/etc. when AMD releases a halo desktop RDNA4 GPU… that is, unless they delete those particular videos.