
Changelog

Every improvement, automatically tracked from our commit history.

Page 57 of 117
February 21, 2026
patch Desktop Shell

Fix user messages excluded from conversation history

Details

User message ViewModels defaulted to ChatMessageState.Loading (enum value 0) because State was never explicitly set. The history filter excluded all Loading messages, which silently dropped every user turn. The filter now excludes only assistant messages that are still loading.
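The shape of this bug carries over to any language where an unset enum field falls back to its zero value. A minimal Python sketch of the filter logic (names like `ChatMessageViewModel` and `is_user` are illustrative, not the actual ViewModel API):

```python
from dataclasses import dataclass
from enum import IntEnum

class ChatMessageState(IntEnum):
    Loading = 0   # the implicit default: enum value zero
    Complete = 1

@dataclass
class ChatMessageViewModel:
    text: str
    is_user: bool
    # User messages never set State, so they silently stay Loading.
    state: ChatMessageState = ChatMessageState.Loading

def build_history(messages):
    # Old filter: drop *every* Loading message, which drops all user turns.
    # New filter: drop only assistant messages that are still loading.
    return [m for m in messages
            if not (not m.is_user and m.state == ChatMessageState.Loading)]

history = build_history([
    ChatMessageViewModel("hi", is_user=True),   # state defaults to Loading
    ChatMessageViewModel("hello!", is_user=False,
                         state=ChatMessageState.Complete),
    ChatMessageViewModel("", is_user=False),    # assistant still streaming
])
# The user turn survives; only the in-flight assistant message is dropped.
```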

patch Desktop Shell

Fix AI API keys lost on restart by auto-unlocking ai-vault

Details

The ai-vault was not in StandardVaultIds, so it was never auto-unlocked during app startup. Keys were persisted to disk but remained locked in memory. Adding ai-vault to StandardVaultIds ensures it is unlocked alongside the master password, just like the connections vault.
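A rough sketch of the startup unlock loop, assuming StandardVaultIds is simply a list the app walks once after the master password is entered (the `VaultStore` class and `unlock` call are hypothetical, not the real Desktop Shell API):

```python
# Hypothetical sketch; only the vault ids come from the changelog entry.
STANDARD_VAULT_IDS = ["connections", "ai-vault"]  # "ai-vault" was missing before

class VaultStore:
    def __init__(self):
        self.unlocked = set()

    def unlock(self, vault_id, master_key):
        # In the real app this would decrypt the vault's key material
        # with the master password; here we just record the unlock.
        self.unlocked.add(vault_id)

def unlock_standard_vaults(store, master_key):
    # Runs once at startup, right after the master password is accepted.
    for vault_id in STANDARD_VAULT_IDS:
        store.unlock(vault_id, master_key)

store = VaultStore()
unlock_standard_vaults(store, master_key="hunter2")
# ai-vault is now unlocked at startup, so persisted AI API keys are readable.
```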

patch Desktop Shell

Fix quick action overlay showing empty on subsequent invocations

Desktop 1.51.0 → 1.51.1 | 4adc9816
Details

The ContentPresenter inside the overlay Border was receiving new form content while detached from the visual tree (Border.IsVisible=false). On re-entry to the visual tree, Avalonia's ContentPresenter does not re-process content that was assigned while detached, so the form's visual children never attached, rendering an empty modal.

Two changes fix this:

1. Always reset overlay state (the IsQuickActionOverlayOpen guard was removed) so stale content is cleared even when the overlay appears closed.

2. Set IsQuickActionOverlayOpen=true BEFORE assigning QuickActionOverlayContent in the Post callback, ensuring the ContentPresenter is live in the visual tree when it receives the new ContentControl. This guarantees template application and visual child attachment on every invocation.
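The essence of the fix is an ordering constraint: attach first, assign second. A toy Python model of a presenter that, like the described ContentPresenter behavior, ignores content assigned while detached (all names here are illustrative, not Avalonia's API):

```python
# Toy model of the order-of-operations fix.
class ToyContentPresenter:
    def __init__(self):
        self.attached = False
        self.realized = None  # visual children actually built

    def set_content(self, content):
        # Content assigned while detached is never re-processed on re-attach,
        # mirroring the behavior described in the changelog entry.
        if self.attached:
            self.realized = content

def show_overlay(presenter, form):
    presenter.realized = None   # 1. always reset stale overlay state
    presenter.attached = True   # 2. make the presenter live FIRST
    presenter.set_content(form) # 3. only then assign the new content

presenter = ToyContentPresenter()
show_overlay(presenter, form="quick-action-form")
# The form is realized on every invocation, not just the first.
```

Swapping steps 2 and 3 reproduces the bug: the assignment lands while `attached` is false and the overlay renders empty.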

patch Desktop Shell

Move AI commands to top of command palette, add Clear Chat

Desktop 1.50.7 → 1.50.8 | 611be662
Details

When AI is enabled, "Chat with Duncan", "Clear Duncan Chat", and "Close Duncan" now appear at the very top of the command palette results. Previously they were buried after navigation and sync commands.

"Clear Chat" calls ClearAll on the AI tray, clearing all messages, intent suggestions, and content suggestion history in one action.

AI commands are conditionally included: they do not appear when AI is disabled, keeping the palette clean.

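The conditional top-placement reduces to a prepend guarded by the AI flag. A minimal sketch, where only the command names come from the changelog and the rest is illustrative:

```python
# Illustrative sketch of conditional top-placement in a command palette.
def build_palette(base_commands, ai_enabled):
    ai_commands = ["Chat with Duncan", "Clear Duncan Chat", "Close Duncan"]
    # Prepend AI commands only when AI is enabled, so they rank first
    # in results without cluttering the palette when AI is off.
    return (ai_commands if ai_enabled else []) + base_commands

palette = build_palette(["Go to Settings", "Sync Now"], ai_enabled=True)
# AI commands come first; navigation and sync commands follow.
```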

patch Desktop Shell

Harden local LLM output: minimal prompt, aggressive sanitizer, sentence cap

Desktop 1.50.6 → 1.50.7 | 2991dc5a
Details

Local models (Phi-3, Mistral) ignore multi-rule system prompts and parrot examples verbatim. Three-pronged fix:

1. System prompt collapsed to a single line with minimal directives: no numbered rules, no examples. Small models handle one dense instruction paragraph better than structured rule lists.

2. Sanitizer expanded to strip markdown bold/italic, markdown headers, bullet prefixes, "Note:" disclaimer lines, and all known chat template tokens (including Llama 3.x eot_id/start_header_id variants).

3. Hard sentence-count truncation per tier: Short caps at 2 sentences, Medium at 5, Long at 30. Applied after all other sanitization so the model physically cannot produce a wall of text for a greeting.

Token budgets also tightened: Short 100 → 60, Medium 300 → 250.
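The sanitize-then-truncate pipeline can be sketched in a few lines of Python. The token list and tier caps come from the entry above; the regexes and function names are my own illustrative approximations, not the shipped sanitizer:

```python
import re

# Sketch of the sanitize-then-cap pipeline (regexes are illustrative).
TEMPLATE_TOKENS = ["<|eot_id|>", "<|start_header_id|>", "<|end_header_id|>"]
SENTENCE_CAPS = {"short": 2, "medium": 5, "long": 30}

def sanitize(text):
    for tok in TEMPLATE_TOKENS:          # strip known chat template tokens
        text = text.replace(tok, "")
    text = re.sub(r"(\*\*|\*|__|_)", "", text)           # bold/italic markers
    text = re.sub(r"^#+\s*", "", text, flags=re.M)       # markdown headers
    text = re.sub(r"^\s*[-*]\s+", "", text, flags=re.M)  # bullet prefixes
    text = re.sub(r"^Note:.*$", "", text, flags=re.M)    # disclaimer lines
    return text.strip()

def cap_sentences(text, tier):
    # Applied last: hard-truncate to the tier's sentence budget so even a
    # runaway completion cannot exceed the cap.
    sentences = re.split(r"(?<=[.!?])\s+", text)
    return " ".join(sentences[:SENTENCE_CAPS[tier]])

raw = "**Hello!** Nice to meet you. I can help with many things.\nNote: I am an AI."
result = cap_sentences(sanitize(raw), "short")
# -> "Hello! Nice to meet you."
```

Running truncation after sanitization matters: stripping markup first keeps the sentence splitter from counting a bold marker or bullet line as sentence content.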

