## v1.1.0
Release date: 2026-03-17
### Overview
This release promotes the local Ministral-powered research assistant to a documented and hardened SDK feature.
### Highlights
- Added the local AI assistant workflow for research-only simulation explanation and reporting.
- Added `iints ai local-check` so users can verify Ollama connectivity and local model readiness before inference.
- Hardened local model detection with friendly alias resolution for `ministral`-style model names.
- Increased local generation resilience with configurable timeouts and automatic prompt payload clipping for large JSON artifacts.
- Expanded the manuals and technical docs so the AI layer now explains:
- how MDMP gating works
- how the Ollama/Ministral backend is selected
- how to debug missing-model and endpoint issues
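As a rough illustration of the readiness check described above, the sketch below queries Ollama's standard `/api/tags` endpoint and resolves friendly aliases before matching. The alias table and tag names here are assumptions for illustration; the mapping `iints ai local-check` actually uses may differ.

```python
import json
import urllib.request

# Hypothetical alias table: maps friendly names to full Ollama model tags.
MODEL_ALIASES = {
    "ministral": "ministral-8b:latest",
    "ministral-8b": "ministral-8b:latest",
}

def resolve_model_alias(name: str) -> str:
    """Normalize a user-supplied model name to a full Ollama tag."""
    return MODEL_ALIASES.get(name.strip().lower(), name)

def local_models(endpoint: str = "http://localhost:11434") -> list[str]:
    """Return the model tags the local Ollama daemon reports via /api/tags."""
    with urllib.request.urlopen(f"{endpoint}/api/tags", timeout=5) as resp:
        payload = json.load(resp)
    return [m["name"] for m in payload.get("models", [])]

def local_check(model: str, endpoint: str = "http://localhost:11434") -> bool:
    """True when Ollama is reachable and the resolved model is pulled."""
    try:
        return resolve_model_alias(model) in local_models(endpoint)
    except OSError:
        return False  # daemon not running or endpoint unreachable
```

Running the check against a stopped daemon simply returns `False` rather than raising, which is the behavior a pre-flight command wants.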
### Included Commands
```shell
iints ai local-check --model ministral
iints ai explain results/step.json --mdmp-cert results/report.signed.mdmp
iints ai trends results/glucose_payload.json --mdmp-cert results/report.signed.mdmp
iints ai anomalies results/simulation_run.json --mdmp-cert results/report.signed.mdmp
iints ai report results/simulation_run.json --mdmp-cert results/report.signed.mdmp --output results/ai_report.md
```
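The highlights mention automatic prompt payload clipping for large JSON artifacts. A minimal sketch of that idea, assuming a fixed character budget (the SDK's real limit and clipping strategy are not documented here), might look like:

```python
import json

MAX_PROMPT_CHARS = 8_000  # assumed budget; the SDK's actual limit may differ

def clip_payload(artifact: dict, limit: int = MAX_PROMPT_CHARS) -> str:
    """Serialize a JSON artifact, truncating it to fit the prompt budget."""
    text = json.dumps(artifact, indent=2)
    if len(text) <= limit:
        return text
    # Keep the head of the document and mark the cut explicitly so the
    # model knows the artifact it sees is incomplete.
    return text[:limit] + "\n... [payload clipped for prompt budget]"
```

Marking the truncation point explicitly matters: an unmarked cut can leave the model reasoning over what looks like a complete but silently corrupted artifact.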
### Stability Notes
- Research use only.
- Not a medical device.
- No clinical dosing advice.
- AI output remains blocked unless MDMP verification succeeds and the minimum grade requirement is met.
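The gating rule above can be sketched as a simple predicate. The grade scale and minimum used here are assumptions for illustration; the actual MDMP verification logic and grade thresholds live in the SDK.

```python
# Assumed grading scale, lowest to highest; the real MDMP scale may differ.
GRADE_ORDER = ["F", "D", "C", "B", "A"]

def mdmp_gate(verified: bool, grade: str, minimum: str = "B") -> bool:
    """Allow AI output only for a verified certificate whose grade
    meets or exceeds the minimum requirement."""
    if not verified:
        return False  # verification failure blocks output unconditionally
    return GRADE_ORDER.index(grade) >= GRADE_ORDER.index(minimum)
```

Note that verification failure short-circuits before any grade comparison, matching the rule that AI output stays blocked unless verification succeeds.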