Vellium v0.3.5: Major Writing Mode Overhaul and Native KoboldCpp Support
Vellium, a popular UI for local text generation, has released v0.3.5 with significant enhancements aimed at making local providers more accessible. The update adds native KoboldCpp support, eliminating the need for workarounds and bridging tools. The writing mode also received a comprehensive overhaul, including structured book bible functionality, direct DOCX import, and cached book summaries for better context management.
These improvements address practical pain points for users running local models. Native KoboldCpp integration simplifies setup and reduces troubleshooting friction, while enhanced writing features make Vellium competitive with cloud-based creative writing tools. The addition of OpenAI TTS integration also bridges the gap between open-source inference and commercial services where needed, giving users flexibility to mix and match tools.
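Before native support, connecting a UI to KoboldCpp typically meant calling its HTTP API directly or routing through a bridge tool. A minimal sketch of such a manual call is below, using KoboldCpp's KoboldAI-compatible `/api/v1/generate` endpoint; the host, port, and sampler values are illustrative assumptions, not Vellium's actual integration code.

```python
import json
import urllib.request

# Illustrative defaults: KoboldCpp commonly serves on localhost:5001,
# but the exact host/port depend on how the server was launched.
KOBOLDCPP_URL = "http://localhost:5001/api/v1/generate"

def build_generate_payload(prompt: str, max_length: int = 200) -> dict:
    """Assemble the JSON body for KoboldCpp's /api/v1/generate endpoint."""
    return {
        "prompt": prompt,
        "max_length": max_length,  # tokens to generate
        "temperature": 0.7,        # illustrative sampler settings
        "top_p": 0.9,
    }

def generate(prompt: str) -> str:
    """POST the prompt to a running KoboldCpp server and return the text."""
    data = json.dumps(build_generate_payload(prompt)).encode("utf-8")
    req = urllib.request.Request(
        KOBOLDCPP_URL,
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
    return body["results"][0]["text"]

if __name__ == "__main__":
    # Requires a KoboldCpp instance running locally.
    print(generate("Once upon a time"))
```

Native integration removes exactly this kind of glue code from the user's setup: Vellium can discover and talk to the server directly instead of the user wiring up endpoints by hand.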
For practitioners building local-first workflows—whether for creative writing, technical documentation, or code generation—these updates lower the barrier to adopting fully on-device pipelines. The focus on practical features like DOCX import and book context management suggests the developers understand real user needs beyond basic chat interfaces. This is exactly the kind of incremental refinement that makes local LLM deployment viable for non-technical users.
Source: r/LocalLLaMA · Relevance: 7/10