Meta AI (image: image.nostr.build)
n7gifmdn@lemmy.ca to Memes@lemmy.ml, English · 6 months ago · 23 comments
m-p{3}@lemmy.ca · 6 months ago
The quantized model you can run locally works decently, and they can't read any of it, which is nice.
I use this one specifically: https://huggingface.co/lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF/blob/main/Meta-Llama-3-8B-Instruct-Q4_K_M.gguf
If you're looking for relatively user-friendly software to run it, you can look at GPT4All (open source) or LM Studio.
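For anyone who prefers scripting over the desktop apps, here is a minimal sketch using the GPT4All Python bindings to load that same GGUF file locally. The thread only mentions the GPT4All and LM Studio desktop apps, so the bindings, the model directory, and the prompt below are my own assumptions.

```python
# Rough sketch: load the Q4_K_M GGUF linked above with the GPT4All Python
# bindings (pip install gpt4all). Paths and prompt are placeholders.
from gpt4all import GPT4All

model = GPT4All(
    model_name="Meta-Llama-3-8B-Instruct-Q4_K_M.gguf",  # the file downloaded from the link above
    model_path="/path/to/your/models",                   # hypothetical directory holding the .gguf
    allow_download=False,                                 # use only the local file, fetch nothing
)

with model.chat_session():
    # Everything runs on your machine; no prompt or output leaves it.
    reply = model.generate("Summarize what a quantized model is.", max_tokens=128)
    print(reply)
```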
passepartout@feddit.de · 6 months ago
If you're ready to tinker a bit, I can recommend Ollama for the backend and Open WebUI for the frontend; they can both run on the same machine. The advantage is that you can use your GPU for compute, which is a lot faster.
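To illustrate the split: Ollama exposes a local HTTP API on port 11434, and Open WebUI is essentially a friendlier frontend over that same API. Here is a rough sketch of calling it directly from Python, assuming you've already run something like `ollama pull llama3`; the model tag, prompt, and timeout are placeholders.

```python
# Minimal sketch: query a local Ollama server over its HTTP API.
# Ollama offloads layers to a supported GPU automatically when it detects one.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",          # any model tag you've pulled locally
        "prompt": "Explain GGUF quantization in two sentences.",
        "stream": False,            # return one JSON object instead of a token stream
    },
    timeout=300,
)
resp.raise_for_status()
print(resp.json()["response"])
```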
nialv7@lemmy.world · 6 months ago
Pretty sure LM Studio is not open source.
m-p{3}@lemmy.ca · 6 months ago
You're right. I thought it was, but I checked their GitHub and LM Studio itself isn't.