# v1.1.1
Release date: 2026-03-18
## Overview
This patch release updates the local AI default to the current open-weight Ministral 3 model line and adds clearer hardware guidance for users running different local setups.
## Highlights
- Switched the default local AI model target to the open-weight Ministral 3 family via Ollama
- Kept older local Ministral 8B tags as backward-compatible fallbacks
- Added Ollama runtime compatibility checks for the open Ministral 3 path
- Added `iints ai models` so users can compare local model options before downloading
- Added RAM and VRAM guidance in the docs and manuals for:
  - `ministral-3:3b`
  - `ministral-3:8b`
  - `ministral-3:14b`
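As a rough companion to that sizing guidance, the sketch below estimates memory needs from parameter count. The constants are illustrative assumptions (roughly 4-bit quantized weights plus fixed runtime overhead), not the figures from the docs or manuals:

```python
# Back-of-the-envelope memory estimate for the three Ministral 3 tags above.
# ASSUMPTIONS (not from the release notes): ~0.6 GB per billion parameters
# for 4-bit quantized weights, plus ~1.5 GB of runtime/KV-cache overhead.
GB_PER_B_PARAMS_Q4 = 0.6
RUNTIME_OVERHEAD_GB = 1.5

def estimated_memory_gb(billions_of_params: float) -> float:
    """Very rough RAM/VRAM estimate for a Q4-quantized local model."""
    return round(billions_of_params * GB_PER_B_PARAMS_Q4 + RUNTIME_OVERHEAD_GB, 1)

for tag, size_b in [("ministral-3:3b", 3), ("ministral-3:8b", 8), ("ministral-3:14b", 14)]:
    print(f"{tag}: ~{estimated_memory_gb(size_b)} GB")
```

Always defer to the guidance in the docs and manuals; actual usage varies with quantization, context length, and runtime.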
## Included Commands
```
iints ai models
ollama pull ministral-3:8b
iints ai local-check --model ministral-3:8b
iints ai explain results/step.json --mdmp-cert results/report.signed.mdmp
```
## Stability Notes
- Research use only
- Not a medical device
- No clinical dosing advice
- AI output remains blocked unless MDMP verification succeeds and the minimum grade requirement is met
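The gating rule in the last note can be sketched as follows. Every name here (`VerificationResult`, `allow_ai_output`, the letter-grade scale, the minimum grade) is hypothetical and only illustrates the "verification succeeds AND minimum grade met" logic; it is not the actual iints API:

```python
from dataclasses import dataclass

# ASSUMPTIONS: a letter-grade scale and a "B" minimum are illustrative;
# the real MDMP grading scheme is defined by the iints project.
MDMP_MIN_GRADE = "B"
GRADE_ORDER = {"A": 4, "B": 3, "C": 2, "D": 1, "F": 0}

@dataclass
class VerificationResult:
    signature_valid: bool  # did MDMP verification succeed?
    grade: str             # grade assigned to the verified report

def allow_ai_output(result: VerificationResult) -> bool:
    """Block AI output unless verification succeeded AND the grade meets the minimum."""
    if not result.signature_valid:
        return False
    return GRADE_ORDER.get(result.grade, -1) >= GRADE_ORDER[MDMP_MIN_GRADE]

print(allow_ai_output(VerificationResult(True, "A")))   # True
print(allow_ai_output(VerificationResult(False, "A")))  # False: verification failed
print(allow_ai_output(VerificationResult(True, "C")))   # False: below minimum grade
```

Both conditions must hold independently; a passing grade never overrides a failed verification.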