@rasbt
@TragicHero628 @ollama @OpenWebUI Ollama is my go-to for running any of the existing LLMs listed on the ollama website, but if you want to add custom LLMs, I think you have to convert them yourself, and they also have to follow a specific format (https://t.co/LbXWqrsoPI). So I'd say it's a less general solution and more for the post-prototyping phase.
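For context, a minimal sketch of that custom-model step, assuming the weights have already been converted to the GGUF format Ollama expects (the file path and model name below are placeholders):

```shell
# Write a Modelfile pointing at the locally converted weights.
# "my-custom-model.gguf" is a placeholder path, not a real file.
cat > Modelfile <<'EOF'
FROM ./my-custom-model.gguf
EOF

# Register the model with Ollama, then run it:
ollama create my-custom-model -f Modelfile
ollama run my-custom-model
```

The conversion to GGUF itself is the part Ollama leaves to you, which is what makes it less of a general solution for arbitrary custom models.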