Installation And Paths
This page is the simplest answer to two common questions:
- "How do I install IINTS correctly?"
- "From which folder am I supposed to run this command?"
Before this page: Quickstart if you simply want the fastest first run.
After this page: Getting Started for the first complete workflow, or Troubleshooting if a command failed.
The Short Rule
- `iints ...` commands can run from any working folder once the SDK is installed.
- `iints patient ...` commands can also run from any working folder once the SDK is installed.
- `pip install -e ".[...]"` must be run from the SDK repository root, where `pyproject.toml` lives.
- `./scripts/run_live_stage_demo.sh` and `./scripts/run_booth_demo.sh` belong to the SDK repository and resolve the repo root automatically.
- After `iints quickstart`, switch into the generated project folder before running project commands.
Option 1: Install The Released SDK
This is the best path for most users.
You can run these commands from any folder:
python3 -m venv .venv
source .venv/bin/activate
python -m pip install -U pip
python -m pip install -U "iints-sdk-python35[full,mdmp]"
Then verify:
iints doctor --smoke-run
python -c "import iints; print(iints.__version__)"
Option 1b: Install The Edge Runtime Profile
For Raspberry Pi or UNO Q style deployments, use the lighter edge profile:
python3 -m venv .venv
source .venv/bin/activate
python -m pip install -U pip
python -m pip install -U "iints-sdk-python35[edge,mdmp]"
Use this when you mainly need:
- `iints patient ...`
- `iints edge ...`
- local FastAPI dashboard
- SQLite runtime state
- CLI control
- UNO Q serial bridge support
- optional local AI review
Security defaults in this profile:
- the digital-patient dashboard API stays on loopback by default
- remote API binding now requires both `--allow-remote-api` and a bearer token source
- public dataset downloads without a published SHA-256 now require `--no-verify`, so the CLI does not pretend to verify an unknown checksum
Important for UNO Q users:
- if `iints edge ...` says `No such command 'edge'`, your installed CLI is older than the current docs
- in that case, use the source install method below instead of the released package path
- the dedicated UNO Q guide follows the source-install path on purpose so the commands match exactly
Typical SBC bootstrap:
iints edge setup --output-dir iints_edge_demo --board raspberry_pi
cd iints_edge_demo
iints edge up --project-dir .
iints edge status --project-dir .
Option 2: Install From Source
Use this only if you are developing the SDK itself.
Exception:
- for Arduino UNO Q, this is currently the recommended path because it guarantees the `iints edge ...` commands match the docs
First go to the repository root. That is the folder containing:
- `pyproject.toml`
- `src/`
- `scripts/`
- `examples/`
Then run:
cd /path/to/IINTS-SDK
python3 -m venv .venv
source .venv/bin/activate
python -m pip install -U pip
python -m pip install -U -e ".[full,mdmp]"
If you want the lighter edge runtime from source instead:
python -m pip install -U -e ".[edge,mdmp]"
If you see:
ERROR: ... does not appear to be a Python project
you are almost certainly not inside the repository root.
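If you want to catch the wrong-folder mistake before pip reports it, here is a minimal POSIX-shell sketch of the same check (the path argument is a placeholder; point it at your own checkout):

```shell
# Sketch: pre-flight check before an editable install.
# A project root should contain pyproject.toml.
check_repo_root() {
    if [ -f "$1/pyproject.toml" ]; then
        echo "ok: $1 looks like a project root"
    else
        echo "missing pyproject.toml in $1 -- cd into the SDK repo first"
    fi
}

check_repo_root "/path/to/IINTS-SDK"
```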
Optional: Add Local Ollama AI
If you want the local research AI features, the stack has three parts:
- the SDK itself
- a local Ollama server
- a local open Mistral-family model such as `ministral-3:8b` or `ministral-3:3b`
The SDK talks to Ollama over HTTP. By default it expects:
http://127.0.0.1:11434
Small Setup Sequence
- Install Ollama
On macOS/Linux, the quickest path is:
curl -fsSL https://ollama.com/install.sh | sh
ollama -v
On Windows, install Ollama from the official download page first, then open a new terminal.
- Start Ollama
ollama serve
If Ollama is already running as a background service on your machine, you do not need to start it again.
- Pull a local model
Balanced default:
ollama pull ministral-3:8b
Lighter fallback for smaller machines:
ollama pull ministral-3:3b
- Link Ollama to IINTS and verify
If you use the default local endpoint, the SDK finds Ollama automatically. If you want to make the link explicit, set:
export OLLAMA_HOST=http://127.0.0.1:11434
Then verify the connection:
iints ai local-check --model ministral-3:8b
You can also override the endpoint per command:
iints ai local-check \
--model ministral-3:8b \
--ollama-host http://127.0.0.1:11434
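The precedence described above (an explicit `OLLAMA_HOST` wins, otherwise the local default applies) can be sketched as a small shell helper. This only illustrates the documented behaviour, not the SDK's internal resolution code:

```shell
# Sketch: honour OLLAMA_HOST when set, otherwise fall back to
# the documented local default endpoint.
resolve_ollama_endpoint() {
    echo "${OLLAMA_HOST:-http://127.0.0.1:11434}"
}

resolve_ollama_endpoint
```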
Important:
- `OLLAMA_HOST` is the normal way to point the SDK at a non-default local Ollama endpoint.
- Remote Ollama endpoints are blocked by default for safety. Only enable them intentionally.
- If `ministral-3:8b` is unstable on your hardware, try `ministral-3:3b` first.
- Full AI usage guide: `docs/AI_ASSISTANT.md`
Optional: Turn A Raspberry Pi Into A Live Digital Patient
If you want a persistent expo or classroom rig, the SDK now includes:
- `iints patient start`
- `iints patient status`
- `iints patient inject-meal`
- `iints patient expo-reset`
- `iints patient review`
Recommended setup:
- Raspberry Pi 5
- Raspberry Pi OS Desktop (Bookworm or newer)
- Raspberry Pi Connect enabled
Fastest flow:
python3 -m venv .venv
source .venv/bin/activate
python -m pip install -U pip
python -m pip install -U "iints-sdk-python35[edge,mdmp]"
iints quickstart --project-name iints_pi_demo
cd iints_pi_demo
iints patient start \
--algo algorithms/example_algorithm.py \
--workspace patient_runtime \
--scenario-profile normal_day \
--mode demo-time \
--speed 60x
Then open:
http://127.0.0.1:8765/dashboard
Use Raspberry Pi Connect screen sharing from your laptop to present the live dashboard.
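As a rule of thumb for planning a demo slot (assuming `--speed 60x` means simulated time runs sixty times faster than the wall clock, which is the natural reading of the flag), the playback length works out like this:

```shell
# Sketch: wall-clock playback time for a simulated day.
# demo_minutes SIM_HOURS SPEED -> real minutes of playback
demo_minutes() {
    echo $(( $1 * 60 / $2 ))
}

echo "a 24h simulated day plays back in $(demo_minutes 24 60) real minutes at 60x"
```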
Remote presentation note:
- Raspberry Pi Connect does not require `--allow-remote-api`
- the safest demo path is still to keep the API on `127.0.0.1`
- only use remote API binding if another machine must talk to the dashboard directly
If you really do need a remote API bind, use a token-backed start command:
export IINTS_PATIENT_API_TOKEN="replace-this-with-a-random-secret"
iints patient start \
--algo algorithms/example_algorithm.py \
--workspace patient_runtime \
--scenario-profile normal_day \
--mode demo-time \
--speed 60x \
--api-host 0.0.0.0 \
--allow-remote-api \
--api-token-env IINTS_PATIENT_API_TOKEN
When token protection is enabled:
- browser reads of `/dashboard`, `/kiosk`, `/status`, and glucose history also require the token
- the simplest browser form is `http://<host>:8765/dashboard?token=<your-token>`
- command-line or scripted access should prefer `Authorization: Bearer <token>`
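For scripted access, building the preferred Bearer header can be sketched like this. The hostname is a placeholder for your own machine, and the `curl` line is commented out so the sketch runs without a live dashboard:

```shell
# Sketch: construct the Authorization header for scripted API access.
bearer_header() {
    echo "Authorization: Bearer $1"
}

TOKEN="${IINTS_PATIENT_API_TOKEN:-replace-this-with-a-random-secret}"
bearer_header "$TOKEN"
# curl -fsS -H "$(bearer_header "$TOKEN")" "http://pi.local:8765/status"
```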
Dataset Fetch Verification
`iints data fetch` is stricter now.
If a public source does not publish a pinned SHA-256, the SDK will no longer call that a verified download.
So this can now happen intentionally:
iints data fetch aide_t1d --output-dir data_packs/public/aide_t1d
If the registry entry has no pinned hash yet, the secure fallback is:
iints data fetch aide_t1d \
--output-dir data_packs/public/aide_t1d \
--no-verify
Use --no-verify only when:
- you trust the upstream source
- the dataset entry still lacks a published checksum
- you understand this is a trust decision, not a cryptographic verification
The long-term fix is to add a pinned SHA-256 to src/iints/data/datasets.json.
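For context, pinned-hash verification boils down to a comparison like the following sketch. A real pin would live in `src/iints/data/datasets.json` next to its dataset entry; here the check is shown against a throwaway file (assumes GNU `sha256sum` is available):

```shell
# Sketch: compare a file's SHA-256 against a pinned value.
verify_sha256() {
    actual=$(sha256sum "$1" | awk '{print $1}')
    if [ "$actual" = "$2" ]; then
        echo "checksum ok"
    else
        echo "checksum mismatch: expected $2, got $actual"
    fi
}

tmp=$(mktemp)
printf 'hello\n' > "$tmp"
verify_sha256 "$tmp" "5891b5b522d5df086d0ff0b110fbd9d21bb4fc7163af34d08286a2e846f6be03"
rm -f "$tmp"
```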
Safer Nightscout And Tidepool Secrets
Prefer environment variables or files over plain CLI secrets.
Nightscout:
export IINTS_NIGHTSCOUT_SECRET="replace-me"
export IINTS_NIGHTSCOUT_TOKEN="replace-me"
iints import-nightscout \
--url https://your-nightscout.example \
--api-secret-env IINTS_NIGHTSCOUT_SECRET \
--token-env IINTS_NIGHTSCOUT_TOKEN \
--output-dir results/nightscout_import
Tidepool:
export IINTS_TIDEPOOL_TOKEN="replace-me"
iints import-tidepool \
--base-url https://api.tidepool.org \
--token-env IINTS_TIDEPOOL_TOKEN
Plain --token and --api-secret still work for compatibility, but the CLI now warns because those values can leak into shell history and process lists.
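The `--*-env` flags above follow a fail-fast pattern you can sketch in plain shell: refuse to continue when the named variable is unset, rather than accepting the secret as a plain argument. This is an illustration of the idea, not the SDK's actual implementation:

```shell
# Sketch: fail fast when a named secret variable is not set.
require_env() {
    eval "val=\${$1:-}"
    if [ -z "$val" ]; then
        echo "error: $1 is not set" >&2
        return 1
    fi
    echo "$1 is set"
}

require_env IINTS_NIGHTSCOUT_SECRET || true
```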
If this Pi will be left running unattended, export a ready-made systemd unit after the first start:
iints edge service --project-dir .
Full guide:
docs/DIGITAL_PATIENT_PI.md
Folder Map
There are three important places to keep straight:
1. SDK repository root
Example:
/path/to/IINTS-SDK
Use this for:
- `python -m pip install -e ".[full,mdmp]"`
- `./scripts/run_live_stage_demo.sh`
- `./scripts/run_booth_demo.sh`
- opening `examples/demos/07_live_stage_demo.py`
2. Generated quickstart project
Created by:
iints quickstart --project-name iints_quickstart
Example:
/path/to/where/you/ran/iints_quickstart
Use this for:
- `iints presets run --name baseline_t1d --algo algorithms/example_algorithm.py`
- editing `algorithms/example_algorithm.py`
- inspecting `results/`
3. Run bundle
A single simulation run ends up under something like:
results/20260323-123456-abcdef12-1234/
That run bundle contains files such as:
- `results.csv`
- `clinical_report.pdf`
- `audit/baseline/`
- `run_manifest.json`
- `run_metadata.json`
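Because bundle names begin with a timestamp, a lexical sort doubles as a time sort when you want the most recent run. A small sketch, assuming the naming scheme shown above:

```shell
# Sketch: pick the newest run bundle under a results directory.
# Timestamp-prefixed names mean lexical order == chronological order.
newest_bundle() {
    ls -d "$1"/*/ 2>/dev/null | sort | tail -n 1
}

newest_bundle results
```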
Fastest Working Flow
Installed SDK flow
python3 -m venv .venv
source .venv/bin/activate
python -m pip install -U pip
python -m pip install -U "iints-sdk-python35[full,mdmp]"
iints quickstart --project-name iints_quickstart
cd iints_quickstart
iints presets run --name baseline_t1d --algo algorithms/example_algorithm.py
Source repo flow
cd /path/to/IINTS-SDK
python3 -m venv .venv
source .venv/bin/activate
python -m pip install -U pip
python -m pip install -U -e ".[full,mdmp]"
./scripts/run_live_stage_demo.sh
Booth Demo Paths
If you only installed the SDK and do not have the repository checkout, export the same booth demo code with:
iints demo-export --output-dir iints_demo
cd iints_demo
python 07_live_stage_demo.py
That writes:
- `07_live_stage_demo.py`
- `RUN_ME_FIRST.txt`
If you use:
./scripts/run_live_stage_demo.sh
the default output folder is:
<repo-root>/results/booth_demo_live/
The most useful files there are:
- `booth_demo_poster.png`
- `JURY_TALK_TRACK.md`
- `BEURS_LIVE_DEMO_SCRIPT.txt`
- `run_commands.md`
The three scenario folders are:
- `01_normal_run/`
- `02_meal_stress_test/`
- `03_supervisor_override/`
Quick Troubleshooting
`iints ai` or `iints demo-booth` is missing
iints-sdk-doctor
If needed:
python -m pip uninstall -y iints iints-sdk-python35
python -m pip install -U "iints-sdk-python35[full,mdmp]"
hash -r
`pip install -e ".[full,mdmp]"` or `pip install -e ".[edge,mdmp]"` fails
Move into the SDK repository root first:
cd /path/to/IINTS-SDK
python -m pip install -e ".[full,mdmp]"
Wrong Python version
Current releases require Python >=3.10.
Check it:
python --version
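For scripts, a machine-checkable version of the same check avoids relying on a human reading the version string:

```shell
# Sketch: verify the interpreter meets the >=3.10 floor from a script.
python3 -c 'import sys; print("ok" if sys.version_info >= (3, 10) else "too old")'
```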
Where To Go Next
| If you installed... | Continue with |
|---|---|
| the full SDK | Getting Started |
| the edge profile | Raspberry Pi Digital Patient |
| local AI extras | AI Assistant |
| data certification tools | MDMP Quickstart |
| but something failed | Troubleshooting |