Linux Desktop Support
OpenYak now runs natively on Linux with automatic Wayland/X11 detection and GBM buffer compatibility. The CI pipeline produces both .deb and .rpm packages, bringing full desktop support to the Linux ecosystem.
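A common way to implement Wayland/X11 detection is to inspect the session's environment variables, preferring Wayland when a compositor socket is advertised. This is a minimal sketch of that approach; the function name and exact precedence order are assumptions, not OpenYak's actual code.

```python
import os

def detect_display_server() -> str:
    """Pick a display server backend, preferring Wayland when available.

    Hypothetical sketch: checks the standard environment variables a
    Linux session exposes (WAYLAND_DISPLAY, XDG_SESSION_TYPE, DISPLAY).
    """
    if os.environ.get("WAYLAND_DISPLAY"):
        return "wayland"
    if os.environ.get("XDG_SESSION_TYPE", "").lower() == "wayland":
        return "wayland"
    if os.environ.get("DISPLAY"):
        return "x11"
    return "headless"
```

A real implementation would typically fall back to X11 via XWayland if Wayland initialization fails at runtime, rather than trusting the environment alone.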
Custom OpenAI-Compatible Endpoints
Add and manage your own OpenAI-compatible LLM endpoints directly from the settings panel. Full CRUD API with SSRF validation, prefix-based model filtering, and i18n support for English and Chinese.
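SSRF validation for user-supplied endpoints usually means resolving the hostname and rejecting addresses that point at internal infrastructure. The sketch below illustrates that pattern; the function name and the exact set of rejected ranges are assumptions, not the actual validation code.

```python
import ipaddress
import socket
from urllib.parse import urlparse

def validate_endpoint_url(url: str) -> bool:
    """Reject endpoint URLs that could reach internal hosts (SSRF).

    Hypothetical sketch: requires an http(s) scheme, resolves the
    hostname, and rejects private, loopback, link-local, reserved,
    and multicast address ranges.
    """
    parsed = urlparse(url)
    if parsed.scheme not in ("http", "https") or not parsed.hostname:
        return False
    try:
        infos = socket.getaddrinfo(parsed.hostname, None)
    except socket.gaierror:
        return False  # unresolvable hostname
    for info in infos:
        addr = ipaddress.ip_address(info[4][0])
        if (addr.is_private or addr.is_loopback or addr.is_link_local
                or addr.is_reserved or addr.is_multicast):
            return False
    return True
```

Note that resolve-then-connect validation can still be raced by DNS rebinding; hardened implementations pin the resolved address for the actual request.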
Performance & Cost Optimization
Prompt caching splits system prompts into cached and dynamic parts for Anthropic models, reducing cost on repeated turns. Zero-LLM-cost context collapse drops the oldest messages before a full compaction is needed. Streaming tool concurrency executes read-only tools in parallel while the LLM is still streaming. Microcompact context compression replaces old tool outputs with lightweight stubs to stay under a 100K token budget.
Stability & Fixes
SSL certificate verification now uses the certifi CA bundle, resolving CERTIFICATE_VERIFY_FAILED errors on macOS. Web search guardrails cap the number of native searches per step to control token usage. Resilient retry with reactive compaction handles context overflow and 529 overload errors with exponential backoff.
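The retry behaviour described above follows a standard exponential-backoff-with-jitter pattern. This sketch shows the shape of it; the error class, function name, and default delays are assumptions for illustration, not OpenYak's actual values.

```python
import random
import time

class OverloadedError(Exception):
    """Stand-in for a 529 'overloaded' response from the provider."""

def retry_with_backoff(call, max_attempts=5, base_delay=1.0, max_delay=30.0):
    """Retry a flaky API call with capped exponential backoff and jitter.

    Hypothetical sketch: doubles the delay each attempt, caps it at
    max_delay, and re-raises once attempts are exhausted.
    """
    for attempt in range(max_attempts):
        try:
            return call()
        except OverloadedError:
            if attempt == max_attempts - 1:
                raise  # out of attempts, surface the error
            delay = min(base_delay * 2 ** attempt, max_delay)
            time.sleep(delay * random.uniform(0.5, 1.0))  # jitter
```

In the reactive-compaction case, the except branch would also trigger context compression before retrying, so a context-overflow error shrinks the prompt instead of just waiting.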