This desktop app for hosting and running LLMs locally is rough in a few spots, but still useful right out of the box.
If you're just getting started with running local LLMs, you've likely been eyeing, or have already opted for, LM Studio or Ollama. These GUI-based tools are the defaults for a reason. They make ...