AI stuff 

I have set up a local LLM with a web UI on my laptop. It was pretty easy (credit to Arch/AUR for that). It barely runs, but my laptop has no fan or discrete GPU, so it's kind of amazing that it works at all.

(For the few things I use this for, I don't want to be there when OpenAI starts rent-seeking, and I'd rather not keep inflating their numbers in the meantime.)
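
For a sense of what "local" means in practice, here's a minimal sketch of querying a locally hosted model over HTTP. It assumes an Ollama-style server on its default port, and the model name is just an example, not necessarily what I'm actually running:

import requests

# Ask the local server (Ollama-style API on its default port) for a
# single, non-streamed completion. "llama3.2" is a placeholder model name.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3.2",
        "prompt": "Summarize this toot in one sentence.",
        "stream": False,  # one JSON blob instead of streamed chunks
    },
    timeout=300,  # a fanless laptop takes its time
)
resp.raise_for_status()
print(resp.json()["response"])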
