Changelog

Every improvement, automatically tracked from our commit history.

Subscribe via Atom feed
← Prev Page 3 of 22 Next →
February 23, 2026
minor Desktop Shell SDK

Chat-based intent execution and AI tray tab highlighting

SDK 1.63.0 → 1.64.0 | Desktop 1.62.2 → 1.63.0 | 859384e3
Details

Duncan can now execute real actions (create tasks, events, notes, etc.) when users ask in chat. The chat system prompt includes a dynamic intent catalog built from all active IIntentProvider plugins. The AI responds with [ACTION] blocks containing intent_id and slots, which are parsed and executed via the new ExecuteDirectAsync method on IIntentEngine — bypassing the suggestion system for immediate execution. Confirmation or error messages appear inline in the chat.
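The changelog names the intent_id/slots payload but not the exact wire format. A minimal sketch of how [ACTION] block parsing might work, assuming the model emits `[ACTION]…[/ACTION]` with a JSON body (the delimiter pair and helper name are illustrative guesses, written in Python rather than the product's C#):

```python
import json
import re

# Hypothetical wire format: the changelog confirms intent_id and slots,
# but the [ACTION]...[/ACTION] framing with a JSON body is an assumption.
ACTION_RE = re.compile(r"\[ACTION\](.*?)\[/ACTION\]", re.DOTALL)

def parse_actions(reply: str) -> list[dict]:
    """Extract intent_id/slots payloads from an AI chat reply."""
    actions = []
    for raw in ACTION_RE.findall(reply):
        try:
            payload = json.loads(raw)
        except json.JSONDecodeError:
            continue  # malformed block: report an inline error instead
        if "intent_id" in payload:
            actions.append({
                "intent_id": payload["intent_id"],
                "slots": payload.get("slots", {}),
            })
    return actions

reply = (
    "Sure, I'll add that.\n"
    '[ACTION]{"intent_id": "tasks.create", "slots": {"title": "Pay rent"}}[/ACTION]'
)
print(parse_actions(reply))
```

Each parsed payload would then be handed to something like ExecuteDirectAsync for immediate execution, with the surrounding prose shown to the user as the chat reply.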

AI tray tabs (Chat/Intents/History) now show active state using the view-toggle pattern with Classes.active bindings driven by IsChatTab/IsIntentsTab/IsHistoryTab computed properties. Panel visibility is now VM-driven via IsVisible bindings, removing the imperative ApplyTabVisibility code-behind.

Cloud token budgets bumped: Medium 800→1200, Long 2000→2500, MaxSentences(Long) 30→40 to accommodate intent-aware responses with JSON action blocks.

SDK 1.63.0→1.64.0 (new IIntentEngine.ExecuteDirectAsync interface method). Desktop 1.62.2→1.63.0.

minor Desktop Shell SDK

AI panel UX improvements: persistent panel, resize handle, plugin context injection

SDK 1.62.0 → 1.63.0 | Desktop 1.61.1 → 1.62.0 | 6255e6e8
Details

Three changes to the AI assistant tray:

1. Panel no longer closes on click-away. Users must explicitly close via the close button or Escape key, matching the expectation that the panel stays open while working.

2. Added a drag-to-resize handle (same pattern as InfoPanel) allowing users to widen or narrow the tray between 320-700px. Default remains 400px.

3. Active plugin context is now injected into the AI system prompt. When the user navigates to a plugin, the AI receives the plugin's name, description, tags, and category. Plugins can optionally set DetailedDescription on PluginMetadata for richer context (feature list, capabilities, quick actions). This lets the AI answer questions like "what is this plugin?" and "what can I do here?"
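A sketch of what the context injection in item 3 might look like. Field names mirror the metadata the changelog lists (name, description, tags, category, optional DetailedDescription); the function name and prompt wording are hypothetical, and Python stands in for the product's C#:

```python
# Illustrative: build a system-prompt fragment from active-plugin metadata.
def build_plugin_context(meta: dict) -> str:
    lines = [
        "The user is currently viewing this plugin:",
        f"Name: {meta['name']}",
        f"Description: {meta['description']}",
        f"Category: {meta['category']}",
        f"Tags: {', '.join(meta['tags'])}",
    ]
    # Optional richer context (feature list, capabilities, quick actions)
    if meta.get("detailed_description"):
        lines.append(f"Details: {meta['detailed_description']}")
    return "\n".join(lines)

ctx = build_plugin_context({
    "name": "Ledger",
    "description": "Personal finance tracking",
    "category": "Finance",
    "tags": ["money", "budget"],
})
print(ctx)
```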

SDK 1.63.0: Added PluginMetadata.DetailedDescription property. Desktop 1.62.0.

minor Desktop Shell

Add tiered AI intent prompts and upgrade local model catalog

Desktop 1.60.7 → 1.61.0 | 98b4504c
Details

Replace old local model catalog (phi-3-mini, llama-1b, mistral-7b) with a better-tiered selection: llama-3.2-3b (lightweight), qwen-2.5-7b (balanced), qwen-2.5-14b (high quality), and qwen-2.5-32b (best local quality for 32GB+ systems). Add parameter count metadata to model info for tier detection.

Implement 3-tier intent classification prompt system: Small tier (current terse few-shot for <5B local models, 384 tokens), Medium tier (richer instructions with slot descriptions for 7B+ local models, 768 tokens), and Large tier (full chain-of-thought with intent descriptions, edge case guidance, and confidence scoring for cloud models, 1024 tokens).

IntentEngine now dynamically selects prompt tier and MaxTokens based on the active AI provider — cloud providers get Large tier, local 7B+ get Medium, and small local models get Small. Medium/Large tiers skip intent filtering to give capable models the full intent set.

Add ChatML prompt template for Qwen models with proper anti-prompts. Update PlatformDetector RAM thresholds to recommend new models. Old downloaded models (phi-3, mistral, llama-1b) still work via backward-compatible prompt template matching.
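For reference, ChatML is the standard prompt format for Qwen-family models: each turn is wrapped in `<|im_start|>`/`<|im_end|>` markers, and those markers double as the anti-prompts (stop sequences). A minimal single-turn sketch; the helper itself is illustrative, not the shipped template code:

```python
# ChatML uses <|im_start|>role ... <|im_end|> framing for each turn; the
# prompt ends with an open assistant turn for the model to complete.
def format_chatml(system: str, user: str) -> str:
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

# Generation stops when the model emits an anti-prompt token:
ANTI_PROMPTS = ["<|im_end|>", "<|im_start|>"]

prompt = format_chatml("You are Duncan.", "What can I do here?")
print(prompt)
```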

minor UI Components

Add shared chart controls to UI.Adaptive via LiveCharts2 rc6.1

SDK 1.61.0 → 1.62.0 | fc133d68
Details

Add AdaptiveCartesianChart and AdaptivePieChart wrapper controls to PrivStack.UI.Adaptive. These code-first Border-based controls encapsulate LiveCharts2 controls internally, allowing plugins to use charting via StyledProperty bindings without directly referencing LiveCharts in their XAML — avoiding the SourceGenChart TypeLoadException that occurs when LiveCharts controls are XAML-compiled in dynamically-loaded plugin assemblies.

Any plugin that references UI.Adaptive now gets charting for free.

Version bump: 1.61.0 → 1.62.0

February 22, 2026
minor Desktop Shell SDK

Add conversation history support to local LLM provider

Desktop 1.59.0 → 1.60.0 | af811ab0
Details

Local models were running every message as a standalone prompt with zero conversational context — follow-up questions like "but my income is 4700" had no reference to the prior exchange. This made multi-turn chat useless.

FormatPrompt now accepts ConversationHistory and builds proper multi-turn chat templates for all three model families (Llama 3.x, Mistral, Phi-3), including correct turn delimiters and system prompt positioning. History is capped at 6 turns to fit within local context windows.
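A sketch of the capped multi-turn formatting, using Llama 3.x as the example family. The `<|start_header_id|>`/`<|end_header_id|>`/`<|eot_id|>` markers are Llama 3's documented special tokens; the 6-turn cap is from the changelog, while the helper name and the omission of `<|begin_of_text|>` (usually added by the tokenizer) are simplifications:

```python
# Illustrative multi-turn Llama 3.x prompt builder with a 6-turn history cap.
MAX_TURNS = 6

def format_llama3(system: str, history: list[tuple[str, str]], user: str) -> str:
    turns = history[-MAX_TURNS:]  # keep only the most recent turns
    parts = [f"<|start_header_id|>system<|end_header_id|>\n\n{system}<|eot_id|>"]
    for u, a in turns:
        parts.append(f"<|start_header_id|>user<|end_header_id|>\n\n{u}<|eot_id|>")
        parts.append(f"<|start_header_id|>assistant<|end_header_id|>\n\n{a}<|eot_id|>")
    parts.append(f"<|start_header_id|>user<|end_header_id|>\n\n{user}<|eot_id|>")
    parts.append("<|start_header_id|>assistant<|end_header_id|>\n\n")  # open turn
    return "".join(parts)

history = [(f"question {i}", f"answer {i}") for i in range(8)]
prompt = format_llama3("You are Duncan.", history, "but my income is 4700")
```

With the cap applied, a follow-up like "but my income is 4700" arrives with the six most recent exchanges in context instead of standing alone.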

The chat view model now passes BuildConversationHistory() to local model requests, giving Duncan the same conversational continuity as cloud models.

Bump version to 1.60.0.
