Bad take. I am an AMD fan and arguably Intel has improved extraordinarily on this front. Gone are the days of the Extreme Edition and even their consumer flagships are finally priced very competitively against AMD. I would replace the Intel Inside head with NGREEDIA, quite honestly!
imo people are exaggerating. let's take the 4070 ti as an example: an $800 gpu comparable to a 3090 ti. is that overpriced to you? an $800 gpu that fast? compare it to the gtx 1080 ti, the best nvidia gpu of its generation. the 1080 ti launched at $700 around 6 years ago; the 4070 ti launched at $800 in 2023. for $100 more after 6 years you get the performance of a 2023 enthusiast gpu, and you still complain? over double the performance, limited by physical factors like not being able to shrink transistors forever without increasing cost, the silicon shortage, and thermals, is not enough? the 12gb of vram is still fine for really heavy workloads like video editing if you're working at a reasonable resolution, and for games, devs should stop increasing texture sizes for an unnoticeable bump in graphics quality.
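The $700-vs-$800 comparison above ignores inflation, which is worth checking before calling it a $100 hike. A quick sketch, assuming roughly 23% cumulative US CPI inflation between the 1080 Ti's March 2017 launch and early 2023 (the exact figure varies by which CPI series you use):

```python
# Rough inflation adjustment for the GPU price comparison above.
# The ~23% cumulative US CPI figure for 2017 -> 2023 is an approximation.
launch_1080ti = 699   # GTX 1080 Ti MSRP, March 2017
launch_4070ti = 799   # RTX 4070 Ti MSRP, January 2023
cumulative_inflation = 0.23  # approximate US CPI, 2017 -> 2023

adjusted = launch_1080ti * (1 + cumulative_inflation)
print(f"1080 Ti MSRP in 2023 dollars: ~${adjusted:.0f}")   # ~$860
print(f"Nominal MSRP difference: ${launch_4070ti - launch_1080ti}")  # $100
```

In inflation-adjusted terms the 4070 Ti's MSRP actually comes in slightly below the 1080 Ti's, so the "$100 more" framing understates the point being made.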
also i don't think anything over a 4060 ti/4070 should be bought purely for gaming. like, bro, you're spending over $1000 on a toy. if you have something faster, at least use it to make money or learn a new skill (3d software, ai training, video editing, etc). any gpu over $600 should mainly target creative people, for whom the price tag isn't a problem since they can easily earn it back using the very gpu they invested in.
$800 is a lot if you have a family to feed.
or if you live outside the US.
And to what? Have some shinier and more reflective visuals in games a few hours per week?
I live outside of the US, and for me an RTX 4070 Ti is nearly the same price as an RX 7900 XTX. So yeah, pc prices are ridiculous compared to the US.
And considering AMD has better performance at that price, there's really no reason to buy Nvidia unless you want a 4090 or care about ray tracing.
To be honest though, AMD isn't really trying either. They just undercut Nvidia by 5-10% and call it a day. Gone are the days when they would undercut by 30-50% and force Nvidia to up their game. Both companies used to target a flagship price point of $699, and now we routinely have GPUs in the $1000-$2000 range.
Where I come from, that's called overpriced. 'Not trying' is handling AMD with kid gloves while dumping the blame entirely on Nvidia.
More like physically accurate lighting. But ignorance is bliss I guess.
Nope, there are a lot of newer games nearing that much usage. And don't ignore the modding communities for games like Skyrim and Cities: Skylines, those games can easily exceed 12GB of VRAM without even trying too hard.
You’re living in 2015 if you think it’s only 3D modeling and video editing workloads that would actually utilize that much VRAM.
$6k “HEDT” threadrippers and sapphire rapids have entered the chat. Gone are the days of $1k or even $2k Extreme Edition, indeed.
xeon is a figment of your imagination