wuphysics87@lemmy.ml to Privacy@lemmy.ml · 3 months ago

Can you trust locally run LLMs?
I’ve been playing around with Ollama. Given that you download the model yourself, can you trust that it isn’t sending telemetry?
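One way to move from trusting to verifying is to cut off network access entirely and watch for attempted connections. A minimal sketch, assuming the official `ollama/ollama` Docker image and a Linux host with `tcpdump` installed (the model name is just an example):

```shell
# Pull a model while networking is still allowed:
docker run -d --name ollama -v ollama_models:/root/.ollama ollama/ollama
docker exec ollama ollama pull llama3
docker rm -f ollama

# Restart with networking removed; the server keeps working from the
# cached model, and nothing inside the container can phone home:
docker run -d --name ollama --network=none \
  -v ollama_models:/root/.ollama ollama/ollama
docker exec -it ollama ollama run llama3

# Or, for the native binary, sniff for unexpected outbound packets
# while you chat with the model:
sudo tcpdump -n -i any 'tcp and not host 127.0.0.1'
```

With `--network=none` the API isn’t reachable from the host either, so interaction goes through `docker exec`; if the model still answers prompts, it clearly doesn’t need (or get) network access.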
stink@lemmygrad.ml · 3 months ago: It’s nice, but sadly it’s hard to load unsupported models. I really wish you could easily sideload, but it’s nice unless you have a niche use case.
stink@lemmygrad.ml · 3 months ago: I’m at my parents’ for the week, but IIRC I had some sort of incompatible extension with a model I downloaded off Hugging Face.