• 0 Posts
  • 1 Comment
Joined 11 months ago
Cake day: October 28th, 2023

  • Best cheap option to run smaller AI models on?

    Like the GGUF’d Mistral 7B versions that are lighter on memory, for example. I need fast inference, and I don’t really feel like depending on OpenAI or paying them a bunch of money. I’ve fucked up and spent like $200 on API charges before, so I’m definitely trying to avoid that.
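
    The kind of workload I mean is basically this: a minimal llama-cpp-python sketch, where the filename is just a placeholder for whichever Mistral 7B GGUF quant gets downloaded.

    ```python
    # Minimal local-inference sketch with llama-cpp-python (pip install llama-cpp-python).
    # The model path below is a placeholder for a locally downloaded GGUF quant.
    from llama_cpp import Llama

    llm = Llama(
        model_path="./mistral-7b-instruct.Q4_K_M.gguf",  # hypothetical local file
        n_ctx=2048,       # context window size
        n_gpu_layers=-1,  # offload all layers to the GPU if one is usable; 0 = CPU-only
    )

    out = llm("Q: Name one cheap GPU for local inference. A:", max_tokens=64, stop=["\n"])
    print(out["choices"][0]["text"])
    ```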

    I have a 980 Ti and it’s just too damn old. It works with some stuff, but it’s super hit-or-miss with any of the newer libraries.
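
    (Context on the “too old” part: the 980 Ti is a Maxwell card reporting compute capability 5.2, which newer prebuilt library wheels don’t always target. A quick way to see what a given PyTorch build reports, using standard torch.cuda calls:)

    ```python
    # Check what the local GPU reports to PyTorch; these are standard torch.cuda calls.
    import torch

    if torch.cuda.is_available():
        name = torch.cuda.get_device_name(0)
        major, minor = torch.cuda.get_device_capability(0)
        print(f"{name}: compute capability {major}.{minor}")  # a 980 Ti should report 5.2
    else:
        print("This PyTorch build doesn't see a usable CUDA device.")
    ```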