Can they? Of course they can.
Will they? Unlikely.
Unless there's a point where development stalls across the board, I'd expect Nvidia to remain a step ahead.
It would be crazy to take the foot off the gas on the CPU/SoC side while Intel is imploding; AMD inherits the x86 market by default. NVIDIA is on top of its game in a product segment where AMD has neglected its investments for a long time. Yeah, do whatever the consoles want, ship the MVP of that as a dGPU, and stick it in your iGPUs.
Intel is imploding at the same rate as AMD is developing their consumer GPU raytracing capabilities. Very slowly, possibly not at all.
They have issues, their CPUs are outmatched in some areas by AMD and others. But Intel is not dead, they’re probably not even dying.
Nvidia is nearly 10x the size of AMD, and more focused on GPUs. That’s a lot of R&D, and if they keep outselling AMD 10:1 in GPUs that’s a big amortized development cost advantage.
That’s a big hill to climb, and (unlike Intel) they don’t seem to be sitting on their hands.
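The amortization point is just division: roughly the same per-generation R&D bill spread over 10x the units. A toy sketch (every figure below is hypothetical, not taken from either company's actual financials):

```python
# Hypothetical figures to illustrate amortized R&D cost per GPU sold.
# Neither budget nor unit count is from real financial statements.
rd_budget = 2_000_000_000   # assume both spend ~$2B on a GPU generation

nvidia_units = 30_000_000   # hypothetical: 10x AMD's sales volume
amd_units = 3_000_000

nvidia_rd_per_unit = rd_budget / nvidia_units   # ~$67 of R&D per GPU
amd_rd_per_unit = rd_budget / amd_units         # ~$667 of R&D per GPU

print(f"Nvidia R&D per unit: ${nvidia_rd_per_unit:.0f}")
print(f"AMD R&D per unit:    ${amd_rd_per_unit:.0f}")
```

Same engineering spend, an order of magnitude less of it baked into each card's cost.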
Intel used to be 20 times the size of AMD, and fully focused on CPUs.
I mean, companies have become lazy in the past. Maybe Nvidia will just get used to selling AI chips for $20k to companies and become lazy on the consumer-oriented GPU side, etc.
And remember: all of AMD’s R&D is split between CPU, semi-custom, GPU, FPGA, memory controllers, etc. Nvidia technically has a decent chunk of non-GPU-specific work (Tegra/SoC stuff like for the Switch, or high-speed networking interfaces), but their GPU teams alone still outnumber all of AMD’s R&D by almost 10x.
Intel had a similar scale advantage, but for some reason they stagnated for (depending on how you count) 5-10 years, which allowed AMD to dig themselves out of their hole. Nvidia at least hasn’t been that incompetent, though for all their momentum they have stumbled quite a bit recently.
Both companies have nearly the same number of employees. Do you have a source for Nvidia’s R&D teams being so much bigger company-wide?
A meaningless statistic when AMD makes CPUs and GPUs while Nvidia almost exclusively makes GPUs.
Nvidia and AMD have nearly the same number of employees. Using market cap as the comparison in this context seems rather silly.
Looking at the total number of employees without knowing exactly how they’re split across different divisions in the company seems rather silly.
AMD might have 10% of their total employees working on the GPU side of things; Nvidia might have 30%.
Numbers lie a lot if applied willy-nilly without any context or frame of reference.
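A quick back-of-the-envelope makes the split argument concrete. The 10%/30% allocations come from the comment above and are hypothetical, as is the shared headcount figure:

```python
# Equal total headcount can hide a large gap in GPU staffing.
# The split percentages are the hypotheticals from the comment above;
# the shared headcount is an assumed round number, not a real figure.
total_employees = 26_000

amd_gpu_share = 0.10      # hypothetical: 10% of AMD on GPUs
nvidia_gpu_share = 0.30   # hypothetical: 30% of Nvidia on GPUs

amd_gpu_staff = total_employees * amd_gpu_share       # ~2,600
nvidia_gpu_staff = total_employees * nvidia_gpu_share # ~7,800

print(f"GPU staffing ratio: {nvidia_gpu_staff / amd_gpu_staff:.1f}x")
```

Identical company sizes, yet a 3x difference in people actually working on GPUs, which is the "frame of reference" point.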
That’s where the focus comes in - AMD have CPUs, FPGAs, networking, consoles etc. all split between “About the same number of engineers”. Nvidia have GPUs, and a bit of Tegra SoCs on the side?
And having more money means you can pay other people to do stuff, be it contractors (so not in the “employee count”) or other companies, like getting first dibs at TSMC, though it seems Apple is the one paying for #1 there right now. Nvidia outspends AMD in total R&D even then. If they wanted to (and didn’t get slapped down by antitrust laws pretty quickly), they could probably sell GPUs at a loss and just starve out AMD. Hell, people here would likely celebrate that, since they’d get cheaper GPUs from those loss leaders and miss the long-term ramifications.
And the moat is bigger than just internal engineering - if you’re a gamedev and you can choose a technique that works better on NVidia, or one that works better on AMD, you’ll choose the 90% of your market every time.
When Microsoft are asking around for things to do in the next generation directX, who do you think they’ll listen to more?
Worth noting that Nvidia’s latest AI push is very dependent on super-high-end in-house networking.
Everything would make a lot more sense to gamers if they understood data center is the priority.
What presence does Radeon Instinct have in the traditional data centre then?
Instinct MI300(X) is THE hot shit to get for everyone who can’t get H100/GH100 because they’re in short supply.
AMD is a larger player than people realize, especially with AI frameworks supporting ROCm now. Nvidia got a head start, but Intel, AMD and a myriad of Arm/RiscV custom-chip-companies are not asleep.