Hi all, I’d like to hear some suggestions on self-hosting LLMs on a remote server and accessing them via a client app or a convenient website. I’d love to hear about your setups, or about products that left a good impression on you.

I’ve hosted Ollama before, but I don’t think it’s intended for remote use out of the box. Then again, I’m not really an expert, and maybe there are add-ons or other ways to set it up.
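
To clarify what I mean by remote access, here’s a rough sketch of the kind of client call I have in mind, assuming Ollama is exposed on its default port 11434 (e.g. by setting OLLAMA_HOST=0.0.0.0 or putting it behind a reverse proxy). The hostname and model name are just placeholders:

```python
# Minimal client-side sketch: query a remotely hosted Ollama instance
# over its HTTP API. "my-server.example.com" and "llama3" are placeholders.
import requests

OLLAMA_URL = "http://my-server.example.com:11434/api/generate"

response = requests.post(
    OLLAMA_URL,
    json={
        "model": "llama3",
        "prompt": "Summarize the benefits of self-hosting LLMs.",
        "stream": False,  # return the full reply as a single JSON object
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["response"])
```

As far as I know, Ollama doesn’t provide any authentication on its own, so I’d want something like this behind auth or a VPN, which is part of why I’m asking about better setups.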

Thanks in advance!

    • ddh@lemmy.sdf.org · 11 days ago

      Running an LLM can certainly be an on-demand service. Apart from training, which I don’t think we’re discussing here, GPU compute is only used while responding to prompts; the card sits idle between requests.