People say it’s due to “blowback,” but Nvidia gives maybe half a damn about that. The real reason is to avoid eating into their AI/ML workstation cards.
Giving the weak 4060 Ti large VRAM is perfect for that. Students and AI/ML-curious people can get it to tinker (and buy into the Nvidia ecosystem later), but they can’t make serious money from the card.
Cards like the 4070/Ti, which have enough juice to act as budget AI/ML machines, aren’t viable due to VRAM. Yes, this gimps them in gaming. BUT! Not to the point of outright losing to the competition, since 12 GB is still in the “still works” zone.
People who want to make serious money have to fork out a grand or more for a card with both horsepower AND VRAM, like a 4080/4090 or an H100.
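To put rough numbers on why VRAM is the gatekeeper here, a back-of-envelope sketch. The model size, precision, and 20% overhead factor below are illustrative assumptions, not benchmarks:

```python
# Rough, back-of-envelope VRAM estimate for running a model locally.
# Assumes weights dominate memory; activations/KV cache are folded into
# a flat ~20% overhead factor, which is a simplification.
def vram_needed_gb(params_billion: float, bytes_per_param: float,
                   overhead: float = 1.2) -> float:
    """Approximate VRAM (GB) to hold a model's weights, plus overhead."""
    return params_billion * bytes_per_param * overhead

# A hypothetical 7B-parameter model at fp16 (2 bytes/param) vs 8-bit:
print(round(vram_needed_gb(7, 2), 1))  # ~16.8 GB: blows past a 12 GB card
print(round(vram_needed_gb(7, 1), 1))  # ~8.4 GB: fits on 12 GB
```

Which is roughly the segmentation at work: 12 GB forces you into quantization or small models, while 24 GB cards keep the door open.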
Can’t you also get a used 3090?
Anyone who is serious about this for a living isn’t going to take a huge chance to save a few hundred bucks. It’s not worth the headaches of dealing with potential scams, no manufacturer’s warranty, and the questionable condition of the card.
Plenty of serious people are buying used 3090s, in no small part because they can be NVLinked together, and because their pricing is determined more by the mainstream 1440p gaming market than by the AI market.
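A minimal sketch of why a pair of 24 GB 3090s is attractive: a model too big for one card can have its layers split across the two cards’ combined VRAM. The greedy placement strategy and the flat 5 GB layer sizes below are made-up for illustration, not how any particular framework actually shards a model:

```python
# Naive sketch: greedily assign model layers to a list of cards,
# filling each card before moving to the next.
def assign_layers(layer_gb: list, cards_gb: list) -> list:
    """Return, for each layer, the index of the card it lands on."""
    free = list(cards_gb)        # remaining VRAM per card
    placement = []
    card = 0
    for size in layer_gb:
        # advance to the next card once the current one is full
        while card < len(free) and free[card] < size:
            card += 1
        if card == len(free):
            raise MemoryError("model does not fit across the cards")
        free[card] -= size
        placement.append(card)
    return placement

layers = [5.0] * 8                      # eight 5 GB layers = 40 GB total
print(assign_layers(layers, [24, 24]))  # [0, 0, 0, 0, 1, 1, 1, 1]
```

A 40 GB model that no single consumer card can hold fits comfortably across two 3090s, which is the whole appeal of the used dual-card route.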
It’s not a few hundred bucks; the savings can easily break past a thousand on a couple of cards. And where old gaming cards aren’t being bought, old enterprise and workstation cards are. A “serious” company is going to use H100s/A100s for the most part, which are in a different stratosphere of pricing.
People working in IT have been buying used gear for absolute eons. People on tight budgets, like students, can’t always just snap their fingers and go “I don’t care about a few hundred bucks, I am SERIOUS.” A serious setup can easily run a hundred grand. Single/dual 3090 setups are very competitive at that price point.
Also, not everybody is serious; plenty of people just want an AI rig with pretty good price-to-performance for drawing pictures with Stable Diffusion and don’t intend to cure cancer. 3090s are pretty good on the used market, except miners have generally been destroying the VRAM on these cards, because crypto mining was memory-bandwidth-bound and the VRAM on these cards runs hot.
It IS a few hundred bucks, because the competitor for a used 3090 isn’t an H100/A100… it’s a new 4090. And I think you know this, which makes your whole comment really weird and oddly confrontational for no reason. A $1000 refurbished 3090 vs. a $1700 4090 isn’t an easy no-brainer decision.
But… I’m going to assume you just misunderstood me, rather than assume you’re arguing for the sake of arguing.
A used 3090 can be about $700; however, that’s typically from hardwareswap or some other marketplace (Facebook, Craigslist, eBay), which usually won’t come with any warranty unless the previous owner still has manufacturer’s warranty remaining. And buying a used 3090 from “trust me bro” Reddit user “PM_UR_CAT_TITSz” probably isn’t going to fly in an actual professional setting; I think your boss would be pretty upset that they spent $700 on a paperweight because “PM_UR_CAT_TITSz” sold you a cracked PCB.
Actual refurbished 3090s are probably going to go for a bit more than $700; a cursory search puts them closer to $900–1000. A 4090 (which will also be MUCH faster) is about $1700.
Typically, people who go for an A100/H100 are getting it for the official support for studio/data-center drivers, and/or they need MUCH more memory.
And when your DIY “hack” to get data-center drivers on your cheap 3090s ends up screwing you over, or goes against some ToS/licensing agreement with potential legal implications, I doubt your boss is going to be too happy about it. This isn’t like buying a cheap used Xeon processor and Supermicro motherboard for a NAS server. And if your application doesn’t require those special drivers… again, a 4090 is also pretty compelling.
Uhm… what does this even have to do with my comment? I made it pretty clear I was talking about setups that are more “mission critical,” with bigger financial implications, where your business depends on it. Yeah, if you want to run Stable Diffusion to make your favorite anime waifu in a hyper-realistic art style, nothing wrong with that; you can use any hardware you want for that, provided it has enough VRAM and can run the software.
Nvidia gives zero shits about “gamer blowback.” As long as Nvidia keeps up the momentum and has AMD always chasing them and playing catch-up on features, they don’t have to worry about it. Nvidia GameWorks/HairWorks, VRR, AI upscaling, ray tracing, frame generation, etc. AMD is on a perpetual treadmill.
They know they have a near monopoly on PC GPU sales (discrete, laptop, prebuilts, etc.).
Gaming is still a decent revenue stream for them, but the blowback doesn’t matter much, because most PCs use Nvidia hardware anyway.
This is 100% a way to artificially segment their products and get professionals to spend huge amounts of money in order to get adequate amounts of VRAM. In reality, VRAM isn’t a huge extra cost.