It's been four years since AMD launched RDNA, the successor to the venerable GCN graphics architecture. We take a look through the tech and numbers to see...
Per MLID’s recent podcast, AMD is actually ‘proud’ of their GPU division. They’re punching well above their weight, considering their R&D budget is (probably) not even a quarter of Nvidia’s.
It’s pretty impressive when you think about it.
Their ‘secondary’ GPU division is the reason AMD has the lion’s share of the console market, the sole exception being the Switch. Though that might change in the future, now that Intel is finally serious about GPUs.
I still don’t understand why they ditched Larrabee way back when.
In any case, I personally don’t think Su is interested in graphics or ML, or at least she doesn’t sound too enthusiastic to me whenever she discusses Radeon. Her sole focus, apparently, is consumer CPUs and data centers.
Plus, I think Raja’s departure was a huge blow to Radeon, which ‘might’ explain why RDNA3 is so lackluster, with absolutely mediocre RT performance.
Their R&D budget is probably not a quarter of Nvidia’s, but their profits probably aren’t even a tenth, and spending a quarter of the budget to earn a tenth of the profit is hardly a good ratio. Clown take from MLID (as always).
Compared to AMD, Intel blew their budget on XeSS features no one asked for (their customers could just use FSR), launched their top card first instead of their budget cards, saddling future Intel cards with negative branding, and burned bridges with AIBs by not launching on time. That’s on top of the AXG group losing $3.5 billion in value.
Intel should have launched their Arc 3 and mobile GPUs in their first year, and only after optimizing their drivers started making performance claims against the RTX 3070, especially when the A750’s transistor density already exceeds the 6950 XT’s.
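For what it’s worth, that density claim roughly checks out. Here’s a quick back-of-the-envelope sketch, assuming the commonly cited public die specs for ACM-G10 (the A750/A770 die) and Navi 21 (the 6950 XT die); the figures below are those published numbers, not anything from this thread:

```python
# Back-of-the-envelope transistor-density comparison.
# Specs are the commonly cited public figures for each die (assumed):
#   (transistor count, die area in mm^2)
dies = {
    "ACM-G10 (Arc A750/A770)": (21.7e9, 406.0),
    "Navi 21 (RX 6950 XT)": (26.8e9, 520.0),
}

for name, (transistors, area_mm2) in dies.items():
    density_mtr = transistors / area_mm2 / 1e6  # million transistors per mm^2
    print(f"{name}: ~{density_mtr:.1f} MTr/mm^2")

# Prints roughly:
#   ACM-G10 (Arc A750/A770): ~53.4 MTr/mm^2
#   Navi 21 (RX 6950 XT): ~51.5 MTr/mm^2
```

So the two dies are nearly even on density, with ACM-G10 slightly ahead, despite the A750 competing well below the 6950 XT’s performance tier.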
Why in the world would Intel’s plan for upscaling be “nothing, let AMD take the lead”?
The cost of entering this market was well known. “We lost money spending years developing a GPU generation” was a surprise to no one.
Arc’s primary goal is to deliver great iGPU performance in the mobile market, which has roughly twice the TAM of desktop dGPUs.
Raja oversaw the launch of an entirely new product into an existing market segment, and they did it during the toughest time out there: a pandemic with a poor economic outlook. It wasn’t that the market didn’t take to the product; it’s just that people are skittish about opening their wallets right now.
Intel was essentially a year too late and, at the same time, a year too early with their launch. Had they timed it earlier, it would have been a home run during the pandemic-fueled PC purchasing boom.
Arc launched with ray tracing, XeSS upscaling, and good performance for the money. Their driver team is still working hard and hitting home runs with each release. All they lack is a high-end, 400 W-class GPU and frame generation technology.
Which, to be absolutely fair, nobody is expected to get right on their first product launch. Nvidia is on something like their twentieth-plus generation of this thing called a discrete graphics unit.
Yeah, Raja didn’t do so hot for Intel, so I doubt the Radeon team suffered much from the loss…
Correct me if I’m wrong, but I believe Raja is a hardware guy. The issues Arc is facing are mostly software/API related. The hardware is sound.
Besides, Arc has far superior RT performance to Radeon’s, which is a pretty good indicator of what’s under the hood.
And, on top of the pandemic, a war that caused the relocation of the team responsible for the Arc drivers (based in Russia) due to sanctions.
Can you elaborate on this? This is the first time I’m hearing about the driver team being relocated.
https://www.reddit.com/r/hardware/comments/x49nvq/intels_gpu_driver_development_was_disrupted_by/