

I didn’t know that. I thought it was just one ROCm binary to install, then run Ollama and that’s it. Thanks for the explanation
Do you have any recommendations for running the Mistral Small model? I’m very interested in it alongside CodeLlama, oobabooga and others
Wait how does that work? How is 24GB enough for a 38B model?
The 7900XTX was $1000 when it launched, I wouldn’t mind it used either.
I don’t mind multiple GPUs, but my motherboard doesn’t have 2+ electrically connected x16 slots. I could build a new home server (I’ve been thinking about it), but consumer platforms simply don’t have the PCIe lanes for two actual x16 slots. I’d have to go back to Broadwell Xeons for that, and those are really power hungry. Oh well, I don’t think it matters considering how power hungry GPUs are now.
I am OK with either Nvidia or AMD, especially if Ollama supports it. That said, I’ve heard that AMD takes some manual effort whilst Nvidia is easier. It depends on how difficult ROCm is
Thank you. Are 14B models the biggest you can run comfortably?
Do you have 2 PCIe x16 slots on your motherboard (speaking in terms of electrical connections)?
Seedboxes go from €2 to €100+ a month depending on how much you will torrent, how much space you need on the box, and other factors. My personal choices are Gigarapid and Ultra, but there are others
True. I use a cheap computer monitor for myself, and since I live like a hermit it isn’t really a problem for me. Aren’t projectors more expensive than cheap 1080p TVs?
I will buy a Telly TV as long as I don’t have to give my ID and can desolder the WiFi chip.
I know what day it is but commenting on a hypothetical precedent
Well, if I could get a Twitter API to abuse for free, I’d compete with him to create automated shitposts. Not going to pay for it though
What ratio are you at with your Linux ISOs *wink.
Ooh XE Iaso’s blog. I watched a couple of her videos on YouTube. What a talented person
Just lol at blocking Cloudflare. Do they seriously think this will bring down the number of pirates?
People have had bad experiences, and the founder of Njalla hasn’t helped. There are other domain registrars with a similar strategy to Njalla if you prefer them instead. One can’t deny that not having your name on a domain allows one to host… err, sensitive stuff
Njalla is what you use if you don’t want your name plastered all over the domain for various reasons
Isn’t this only for people running NGINX?
I’m loving your commentary on the subject.
When I mentioned Mistral, I meant their attention to open source. Their new model (Mistral Small) can be run on consumer hardware with results similar to ChatGPT if trained on good data. AI isn’t that useful beyond me asking it to write one-liners, but I haven’t had the experience you have.
In general how much VRAM do I need for 14B and 24B models?
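For what it’s worth, a common back-of-the-envelope estimate is parameter count × bytes per parameter, plus some overhead for the KV cache and activations. This is just a rough sketch of that rule of thumb (the ~1.2× overhead factor is an assumption; real usage depends on context length and runtime):

```python
# Rough rule-of-thumb VRAM estimate for holding a model's weights.
# Assumption: VRAM ≈ params × (bits / 8) × ~1.2 overhead
# (KV cache, activations); actual usage varies by context length and runtime.

def estimate_vram_gb(params_billions: float, bits_per_param: float,
                     overhead: float = 1.2) -> float:
    bytes_total = params_billions * 1e9 * (bits_per_param / 8)
    return round(bytes_total * overhead / 1e9, 1)

for size in (14, 24):
    for bits, label in ((16, "fp16"), (8, "q8"), (4, "q4")):
        print(f"{size}B @ {label}: ~{estimate_vram_gb(size, bits)} GB")
```

By this estimate a 14B model at 4-bit quantization fits in roughly 8–9 GB, while 24B at 4-bit needs around 14–15 GB, which is why quantized models run on 16–24 GB cards.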