Hi all, I’d like some suggestions on self-hosting LLMs on a remote server and accessing them via a client app or a convenient website. I’d love to hear about your own setups, or about products that left a good impression on you.
I’ve hosted Ollama before, but I don’t think it’s intended for remote use. Then again, I’m not really an expert, and maybe there are add-ons or other ways to make it work.
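From what I can tell, Ollama does expose an HTTP API on port 11434, so in principle a remote client could talk to it directly if the server binds to all interfaces (the OLLAMA_HOST env var controls this). A rough, untested sketch of what I mean, with my-server as a placeholder hostname:

```python
import requests

# Placeholder hostname: replace with your server's address.
# Assumes Ollama was started with OLLAMA_HOST=0.0.0.0 so it
# listens on all interfaces, not just localhost.
OLLAMA_URL = "http://my-server:11434"

# One-shot completion via Ollama's /api/generate endpoint.
resp = requests.post(
    f"{OLLAMA_URL}/api/generate",
    json={
        "model": "llama3.2",      # any model pulled on the server
        "prompt": "Why is the sky blue?",
        "stream": False,          # single JSON object instead of a stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```

Though I guess you’d want to put that behind a reverse proxy or a VPN rather than exposing the port to the whole internet.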
Thanks in advance!
No, but I have a free instance on Oracle Cloud, and that’s where I’ll run it. If it’s too slow or no good I’ll stop using it, but there’s no harm in trying.
I’d be interested to see how it goes. I’ve deployed Ollama plus Open WebUI on a few hosts and small models like Llama3.2 run adequately (at least as fast as I can read) on even an old i5-8500T with no GPU. Oracle Cloud free tier might work OK.
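If you want a quick way to check whether a given box is “adequate” before wiring up Open WebUI, something like this works against the Ollama API (a rough sketch assuming the default port 11434, with my-server as a placeholder host):

```python
import time
import requests

# Placeholder: the host where Ollama is running.
HOST = "http://my-server:11434"

# /api/tags lists the models already pulled on that server.
tags = requests.get(f"{HOST}/api/tags", timeout=10)
tags.raise_for_status()
models = [m["name"] for m in tags.json().get("models", [])]
print("Models available:", models or "none pulled yet")

# Time a short completion to gauge whether generation speed is usable.
start = time.time()
resp = requests.post(
    f"{HOST}/api/generate",
    json={"model": "llama3.2", "prompt": "Say hi in one sentence.", "stream": False},
    timeout=300,
)
resp.raise_for_status()
print(f"Reply in {time.time() - start:.1f}s:", resp.json()["response"])
```

If the reply lands in a few seconds, the box is probably fine for chat-speed use.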
Then I’ll let you know when I deploy it. Haven’t done it yet; might do it today, maybe later.