AI General Thread

Started by Legend, Dec 05, 2022, 04:35 AM


Legend

Always funny how they try to save face. "We're both right!"

Quote from: the-Pi-guy on Jan 24, 2026, 09:56 PMhttps://ollama.com/huihui_ai/mistral-small-abliterated

This one is a 14 GB model. 

My system only has 10 GB VRAM, so it uses some system RAM. 


Oh, way too much. I need something lightweight. Don't know what I am doing.

the-Pi-guy

Quote from: Legend on Today at 12:48 AMAlways funny how they try to save face. "We're both right!"


Oh, way too much. I need something lightweight. Don't know what I am doing.
How much VRAM you got? 

And how much RAM? 

Legend

Quote from: the-Pi-guy on Today at 12:55 AMHow much VRAM you got?

And how much RAM?
I'm exploring using an LLM locally inside a video game, so not much  :P


Probably not worth the hassle.
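For what it's worth, a minimal sketch of what talking to a local Ollama model from a game could look like, using its HTTP API on the default port. The model name (`qwen2.5:0.5b`) is just an example of a lightweight model and an assumption here, not something mentioned above; you'd pick whatever small model actually fits in VRAM.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint


def build_generate_request(model: str, prompt: str, max_tokens: int = 64) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False asks for one complete response, which is simpler to
    handle from a game loop than streamed chunks.
    """
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,
        "options": {
            "num_predict": max_tokens,  # cap reply length to keep latency down
        },
    }


def ask(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama server and return the reply text."""
    body = json.dumps(build_generate_request(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Example (requires a running Ollama server and a pulled model):
# print(ask("qwen2.5:0.5b", "Greet the player in one short sentence."))
```

Capping `num_predict` matters more than usual in a game, since a long generation stalls the frame unless you run it on a background thread.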