Fix user messages excluded from conversation history
Details
User message ViewModels defaulted to ChatMessageState.Loading (enum 0)
since State was never explicitly set. The history filter excluded all
Loading messages, which silently dropped every user turn. The filter now
excludes only assistant messages that are still loading.
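A minimal sketch of the corrected filter. ChatMessageState and the Loading state come from the description above; the view-model shape, ChatRole, and BuildHistory are illustrative assumptions:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

enum ChatMessageState { Loading, Streaming, Complete }
enum ChatRole { User, Assistant }

// State defaults to Loading (enum value 0) when never explicitly set —
// exactly the situation user messages were in.
record ChatMessageViewModel(ChatRole Role, string Content,
    ChatMessageState State = ChatMessageState.Loading);

static class HistoryBuilder
{
    // Before the fix the predicate was `m.State != ChatMessageState.Loading`,
    // which dropped every user turn. Now only assistant messages that are
    // still loading are excluded.
    public static IEnumerable<ChatMessageViewModel> BuildHistory(
        IEnumerable<ChatMessageViewModel> messages) =>
        messages.Where(m => !(m.Role == ChatRole.Assistant &&
                              m.State == ChatMessageState.Loading));
}
```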
Fix AI API keys lost on restart by auto-unlocking ai-vault
Details
The ai-vault was not in StandardVaultIds, so it was never auto-unlocked
during app startup. Keys were persisted to disk but remained locked in
memory. Adding ai-vault to StandardVaultIds ensures it is unlocked with the
master password during startup, just like the connections vault.
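A hedged sketch of the change: StandardVaultIds is named in the source, but its shape, the vault id strings, and the startup unlock loop are assumptions:

```csharp
using System;
using System.Collections.Generic;

static class StandardVaultIds
{
    // "ai-vault" was missing from this list, so it was skipped by the
    // startup unlock pass: its keys were persisted to disk but stayed
    // locked in memory after every restart.
    public static readonly IReadOnlyList<string> All =
        new[] { "connections", "ai-vault" };
}

static class VaultStartup
{
    // On startup, every standard vault is unlocked with the master password.
    public static int UnlockStandardVaults(Func<string, bool> unlockWithMasterPassword)
    {
        var unlocked = 0;
        foreach (var vaultId in StandardVaultIds.All)
            if (unlockWithMasterPassword(vaultId))
                unlocked++;
        return unlocked;
    }
}
```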
Fix quick action overlay showing empty on subsequent invocations
Details
The ContentPresenter inside the overlay Border was receiving new form
content while detached from the visual tree (Border.IsVisible=false).
On re-entry to the visual tree, Avalonia's ContentPresenter does not
re-process content that was assigned while detached, so the form's visual
children never attached and the modal rendered empty.
Two changes fix this:
1. Always reset overlay state (removed the IsQuickActionOverlayOpen
guard) so stale content is cleared even when the overlay appears
closed.
2. Set IsQuickActionOverlayOpen=true BEFORE assigning
QuickActionOverlayContent in the Post callback, ensuring the
ContentPresenter is live in the visual tree when it receives the
new ContentControl. This guarantees template application and visual
child attachment on every invocation.
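The two steps above can be sketched as follows. Dispatcher.UIThread.Post is Avalonia's real API; the view-model member names mirror the properties in the description, and everything else (method name, the reset shape) is an assumption:

```csharp
using Avalonia.Threading;

public partial class MainViewModel
{
    public bool IsQuickActionOverlayOpen { get; set; }
    public object? QuickActionOverlayContent { get; set; }

    public void OpenQuickAction(object formContent)
    {
        // 1. Always reset — no IsQuickActionOverlayOpen guard — so stale
        //    content is cleared even when the overlay appears closed.
        IsQuickActionOverlayOpen = false;
        QuickActionOverlayContent = null;

        Dispatcher.UIThread.Post(() =>
        {
            // 2. Open FIRST so the ContentPresenter is attached to the
            //    visual tree, THEN assign the content. Assigning in this
            //    order makes Avalonia template and attach the new visual
            //    children on every invocation.
            IsQuickActionOverlayOpen = true;
            QuickActionOverlayContent = formContent;
        });
    }
}
```

The ordering is the whole fix: content assigned while the presenter is detached is never re-processed on reattach, so the presenter must already be live when the assignment happens.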
Add cloud-aware chat with conversation history and persistent memory
Details
Cloud providers (OpenAI, Anthropic, Gemini) now build full message arrays from
ConversationHistory when present, enabling multi-turn conversations. Anthropic
uses automatic prompt caching (cache_control ephemeral) for multi-turn chats,
reducing repeat input cost to 10% on cache hits. AiPersona gains a richer
cloud system prompt with memory context injection and higher token budgets.
AiMemoryService persists learned user facts to JSON at DataPaths.BaseDir,
capped at 50 entries. AiMemoryExtractor runs a fire-and-forget secondary AI
call after cloud responses to extract memorable facts. The chat VM branches:
local models get the existing minimal prompt path; cloud models get history,
memory, and no aggressive sanitization. Desktop version bumped to 1.51.0.
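A minimal sketch of the persistence behavior described for AiMemoryService (JSON on disk, capped at 50 entries). The file name, member names, and eviction policy (drop oldest) are assumptions; only the service name, the JSON-at-DataPaths.BaseDir location, and the 50-entry cap come from the source:

```csharp
using System.Collections.Generic;
using System.IO;
using System.Text.Json;

public class AiMemoryService
{
    const int MaxEntries = 50;          // cap from the changelog
    readonly string _path;
    readonly List<string> _facts = new();

    public AiMemoryService(string baseDir) =>
        _path = Path.Combine(baseDir, "ai-memory.json");  // file name assumed

    public IReadOnlyList<string> Facts => _facts;

    // Called with facts extracted by the fire-and-forget secondary AI call.
    public void Remember(string fact)
    {
        _facts.Add(fact);
        if (_facts.Count > MaxEntries)  // evict oldest entries past the cap
            _facts.RemoveRange(0, _facts.Count - MaxEntries);
        File.WriteAllText(_path, JsonSerializer.Serialize(_facts));
    }
}
```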
Add AiChatMessage record and ConversationHistory to AiRequest
Details
Introduces AiChatMessage (Role + Content) and an optional ConversationHistory
property on AiRequest for multi-turn chat support. Cloud providers can iterate
over the history to build full message arrays while local providers ignore it.
This is backward-compatible: ConversationHistory defaults to null. SDK version
bumped to 1.51.0.
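The new SDK surface can be sketched as below. AiChatMessage's Role + Content shape and the optional ConversationHistory on AiRequest are from the source; the Role type (plain string) and AiRequest's other members are assumptions, with unrelated properties elided:

```csharp
using System.Collections.Generic;

// Role + Content pair for one turn of a multi-turn conversation.
public sealed record AiChatMessage(string Role, string Content);

public class AiRequest
{
    public string Prompt { get; init; } = "";

    // Null for existing single-turn callers (backward-compatible default).
    // Cloud providers iterate this to build full message arrays; local
    // providers simply ignore it.
    public IReadOnlyList<AiChatMessage>? ConversationHistory { get; init; }
}
```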