TheBigBrother@lemmy.world to Selfhosted@lemmy.world · English · edited 1 year ago
What's the bang-for-the-buck, go-to setup for AI image generation and LLM models?
kata1yst@sh.itjust.works · English · edited 1 year ago
KoboldCpp or LocalAI will probably be the easiest out-of-the-box options that offer both image generation and LLMs.
I personally use vLLM and HuggingChat, mostly because of vLLM's efficiency and speed.
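
For anyone wondering what "use vLLM" looks like in practice: vLLM can expose an OpenAI-compatible API, so once the server is running you can talk to it with the standard openai Python client. A minimal sketch, assuming the server was started locally on the default port 8000 (e.g. with something like `vllm serve <model>`); the model name below is just a placeholder for whatever you actually loaded:

```python
# Minimal sketch: query a locally running vLLM OpenAI-compatible server.
# Assumes the server is listening on the default port 8000.
# The model name is a placeholder; it must match what the server loaded.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # local vLLM endpoint instead of api.openai.com
    api_key="not-needed",                 # vLLM ignores the key unless you configured one
)

response = client.chat.completions.create(
    model="mistralai/Mistral-7B-Instruct-v0.2",  # placeholder model name
    messages=[{"role": "user", "content": "Give me a one-line summary of vLLM."}],
    max_tokens=64,
)
print(response.choices[0].message.content)
```

Because the API shape is OpenAI-compatible, the same snippet works against other local backends (KoboldCpp, LocalAI, etc.) by pointing `base_url` at their endpoint instead.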