LM Studio 0.4.0
Summary
LM Studio 0.4.0 introduces headless deployment: the new llmster daemon runs the core of LM Studio as a standalone service on Linux, macOS, Windows, and cloud environments. The release adds parallel requests with continuous batching for high-throughput inference, a new stateful REST API endpoint (/v1/chat) for chatting with local models, and a refreshed UI with chat export, Split View, and in-app docs. It also introduces permission keys for restricted access and broadens deployment options across cloud, CI, and server environments.
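As a rough sketch of how a client might talk to the new stateful /v1/chat endpoint, the snippet below builds (but does not send) a JSON POST request. The host and port (localhost:1234), the model name, and the payload field names are assumptions for illustration only; consult the LM Studio API docs for the actual schema.

```python
import json
import urllib.request

# Hypothetical chat payload for the stateful /v1/chat endpoint.
# The "model" value is a placeholder for any locally loaded model.
payload = {
    "model": "qwen2.5-7b-instruct",
    "messages": [
        {"role": "user", "content": "Hello!"},
    ],
}

def build_chat_request(url: str = "http://localhost:1234/v1/chat") -> urllib.request.Request:
    """Prepare a JSON POST request; sending it requires a running daemon."""
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request()
print(req.get_method(), req.full_url)
```

With llmster running locally, passing the prepared request to `urllib.request.urlopen` would return the model's reply; a permission key, where required, would presumably be supplied as an Authorization header.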