LLM · Free · Open Source

TEXTGEN

Desktop app for local LLMs with chat, vision, and tool-calling

AGPL-3.0

ABOUT

Running local LLMs requires complex setup of backends, model loaders, and API servers. TextGen provides a unified desktop application with multiple backends (llama.cpp, ExLlamaV3, Transformers), chat modes, vision support, tool-calling, and an OpenAI-compatible API, all without sending data to external servers.
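Because the server exposes an OpenAI-compatible API, existing client code can simply point at the local endpoint. A minimal sketch using only the standard library, assuming a default endpoint of http://127.0.0.1:5000/v1 (the actual host and port depend on how you launch the server):

```python
import json
import urllib.request

# Assumed local endpoint; adjust to match your server's API settings.
API_URL = "http://127.0.0.1:5000/v1/chat/completions"

def build_chat_request(prompt, model="local-model", temperature=0.7):
    """Build an OpenAI-style chat completion request for the local server."""
    payload = {
        "model": model,  # many local servers ignore or alias this field
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Summarize llama.cpp in one sentence.")
# Sending it requires a running server: urllib.request.urlopen(req)
```

Nothing leaves the machine: the request targets localhost, so the same client code that works against a cloud API stays fully private here.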

INSTALL
curl -L -o textgen.zip https://github.com/oobabooga/textgen/releases/latest/download/textgen.zip
unzip textgen.zip

INTEGRATION GUIDE

1. Run local LLMs for private conversations without cloud dependency
2. Deploy an OpenAI-compatible API endpoint for local model serving
3. Use vision models for image understanding in multimodal chat
4. Fine-tune LoRAs on custom datasets with the built-in training tab
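For the vision use case, OpenAI-style multimodal messages embed images as base64 data URLs. A hedged sketch of building such a message (the field names follow the OpenAI chat format; whether a given local server accepts them depends on the vision model loaded):

```python
import base64

def image_message(text, image_bytes, mime="image/png"):
    """Build an OpenAI-style multimodal user message with an inline image."""
    b64 = base64.b64encode(image_bytes).decode("ascii")
    return {
        "role": "user",
        "content": [
            {"type": "text", "text": text},
            {
                "type": "image_url",
                "image_url": {"url": f"data:{mime};base64,{b64}"},
            },
        ],
    }

# Placeholder bytes for illustration; real use would read an image file.
msg = image_message("What is in this picture?", b"\x89PNG")
```

The resulting dict drops straight into the `messages` list of a chat completion request.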

TAGS

python · llm · local · desktop · api · vision · tool-calling