• trees_away@alien.topB
    10 months ago

    I don’t think you know what you’re talking about. A 4090 is just as fast at training as an A100, and faster at inference in some cases. They just don’t have as much VRAM (24 GB for a 4090 vs 40 GB/80 GB for an A100).
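    A rough back-of-the-envelope sketch of why VRAM capacity, not speed, is the practical dividing line. The model size and byte-per-parameter figures below are rule-of-thumb assumptions (fp16 weights ≈ 2 bytes/param; full fine-tuning with Adam ≈ 16 bytes/param for weights, gradients, and optimizer states), not benchmarks:

    ```python
    def vram_needed_gb(params_billions: float, bytes_per_param: float) -> float:
        """Rough VRAM footprint in GB (1 GB = 1e9 bytes)."""
        return params_billions * 1e9 * bytes_per_param / 1e9

    # Hypothetical 7B-parameter model:
    inference_fp16 = vram_needed_gb(7, 2)   # ~14 GB: fits on a 24 GB 4090
    training_adam = vram_needed_gb(7, 16)   # ~112 GB: needs multiple 80 GB A100s

    print(f"inference fp16:   {inference_fp16:.0f} GB")
    print(f"training w/ Adam: {training_adam:.0f} GB")
    ```

    So a 4090 handles inference on a model this size fine, but full training blows past its 24 GB long before raw compute matters.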