2.4. Windows Setup

This guide provides Windows-specific instructions for installing and using Tiny ML Tensorlab.

Note

While Tiny ML Tensorlab works on Windows, Linux is the primary development platform. Some features may have better support on Linux.

2.4.1. Option 1: Native Windows Installation

Step 1: Install Python 3.10

Using pyenv-win (Recommended):

# Run PowerShell as Administrator
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser

# Install pyenv-win
Invoke-WebRequest -UseBasicParsing -Uri "https://raw.githubusercontent.com/pyenv-win/pyenv-win/master/pyenv-win/install-pyenv-win.ps1" -OutFile "./install-pyenv-win.ps1"
& "./install-pyenv-win.ps1"

# Restart PowerShell, then:
pyenv install 3.10.14
pyenv global 3.10.14

# Verify
python --version
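Beyond `python --version`, you can verify the interpreter programmatically. This is a minimal standard-library sketch (not part of Tiny ML Tensorlab); it checks the major.minor version pinned by this guide:

```python
# Confirm the running interpreter is the Python 3.10 this guide pins.
# Run it with the interpreter pyenv just installed.
import sys

def is_supported(version=sys.version_info):
    """Return True when the interpreter is a 3.10.x release (any patch level)."""
    return version[:2] == (3, 10)

if __name__ == "__main__":
    status = "OK" if is_supported() else "unsupported, this guide expects 3.10"
    print(f"Python {sys.version.split()[0]}: {status}")
```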

Step 2: Install Git

Download and install Git from https://git-scm.com/download/win

During installation:

  • Select “Git from the command line and also from 3rd-party software”

  • Select “Checkout as-is, commit Unix-style line endings”

Step 3: Clone and Install

# Clone repository
git clone https://github.com/TexasInstruments/tinyml-tensorlab.git
cd tinyml-tensorlab

# Create virtual environment
python -m venv venv
.\venv\Scripts\Activate.ps1

# Install components
cd tinyml-modelmaker
pip install -e .

cd ..\tinyml-tinyverse
pip install -e .

cd ..\tinyml-modeloptimization\torchmodelopt
pip install -e .

cd ..\..\tinyml-modelzoo
pip install -e .
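To confirm the four editable installs succeeded, you can query installed distributions with the standard library. The distribution names below are assumptions inferred from the repository directory names and may differ from the actual package metadata:

```python
# Sketch: report which of the (assumed) Tensorlab distributions pip can see.
from importlib import metadata

ASSUMED_PACKAGES = [
    "tinyml-modelmaker",
    "tinyml-tinyverse",
    "tinyml-modeloptimization",  # assumption: name of the torchmodelopt install
    "tinyml-modelzoo",
]

def report(packages):
    """Return {distribution name: version string or None if not installed}."""
    found = {}
    for name in packages:
        try:
            found[name] = metadata.version(name)
        except metadata.PackageNotFoundError:
            found[name] = None
    return found

if __name__ == "__main__":
    for name, version in report(ASSUMED_PACKAGES).items():
        print(f"{name}: {version or 'NOT INSTALLED'}")
```

Run it inside the activated venv; a `NOT INSTALLED` entry usually means the corresponding `pip install -e .` step failed or used a different distribution name.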

Step 4: Configure Environment Variables

Warning

IMPORTANT: Environment Variables Required for Model Compilation

For AI model compilation to work, you MUST set environment variables specific to your target device before running examples.

The variables you need depend on which device you’re targeting:

  • C2000 devices (F28P55, F28P65, etc.): Set C2000_CG_ROOT and C2000WARE_ROOT

  • F29 devices (F29H85X, etc.): Set CG_TOOL_ROOT

  • MSPM0 devices: Set ARM_LLVM_CGT_PATH

  • AM13E devices: Set ARM_LLVM_CGT_PATH

  • AM26x devices: Set ARM_LLVM_CGT_PATH

  • Connectivity devices (CC2755, CC1352, etc.): Set ARM_LLVM_CGT_PATH

See Environment Variables for complete device-specific setup instructions.
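The checklist above can be turned into a quick preflight script. This is an illustrative helper, not part of Tiny ML Tensorlab; the variable mapping simply mirrors the list in this section:

```python
# Sketch: given a target device family, report which required
# environment variables from the checklist above are still unset.
import os

REQUIRED_VARS = {
    "c2000": ["C2000_CG_ROOT", "C2000WARE_ROOT"],  # F28P55, F28P65, ...
    "f29": ["CG_TOOL_ROOT"],                       # F29H85X, ...
    "mspm0": ["ARM_LLVM_CGT_PATH"],
    "am13e": ["ARM_LLVM_CGT_PATH"],
    "am26x": ["ARM_LLVM_CGT_PATH"],
    "connectivity": ["ARM_LLVM_CGT_PATH"],         # CC2755, CC1352, ...
}

def missing_vars(device_family, env=os.environ):
    """Return the required variables that are not set for `device_family`."""
    required = REQUIRED_VARS.get(device_family.lower(), [])
    return [name for name in required if not env.get(name)]

if __name__ == "__main__":
    for family in REQUIRED_VARS:
        gaps = missing_vars(family)
        print(f"{family}: {'ready' if not gaps else 'missing ' + ', '.join(gaps)}")
```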

Step 5: Run Example

cd tinyml-modelzoo
run_tinyml_modelzoo.bat examples\generic_timeseries_classification\config.yaml

2.4.3. Path Configuration

Long Path Support (Native Windows)

Enable long path support for deep directory structures:

# Run as Administrator
New-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Control\FileSystem" -Name "LongPathsEnabled" -Value 1 -PropertyType DWORD -Force
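The reason this matters: without the registry flag, classic Win32 APIs reject paths longer than 260 characters (MAX_PATH). A small standard-library sketch shows how quickly a nested clone can hit that limit:

```python
# Illustrative check: would a given path hit the classic Win32
# MAX_PATH limit without long-path support enabled?
MAX_PATH = 260  # classic Win32 limit, including the terminating NUL

def exceeds_max_path(path):
    """True when `path` would need long-path support on Windows."""
    return len(path) >= MAX_PATH

if __name__ == "__main__":
    deep = "C:\\dev\\tensorlab\\" + "\\".join(["subdir"] * 40)
    print(f"{len(deep)} chars -> long paths needed: {exceeds_max_path(deep)}")
```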

Environment Variables

Set up paths for TI tools (if compiling for devices):

# Add to your profile or set in System Properties
$env:C2000_CG_ROOT = "C:\ti\ccs\tools\compiler\ti-cgt-c2000_22.6.1.LTS"
$env:C2000WARE_ROOT = "C:\ti\c2000\C2000Ware_5_03_00_00"
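As an optional sanity check, you can verify that each variable actually points at an existing directory (a frequent source of compile failures after a tool upgrade changes the versioned folder name). The helper below is illustrative only; the variable names come from this section:

```python
# Sketch: classify each TI tool variable as OK / set-but-wrong / unset.
import os

def check_tool_paths(var_names, env=os.environ):
    """Return {var: True (dir exists) / False (set, no such dir) / None (unset)}."""
    results = {}
    for name in var_names:
        value = env.get(name)
        results[name] = os.path.isdir(value) if value else None
    return results

if __name__ == "__main__":
    for var, state in check_tool_paths(["C2000_CG_ROOT", "C2000WARE_ROOT"]).items():
        label = {True: "OK", False: "set, but directory not found", None: "unset"}[state]
        print(f"{var}: {label}")
```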

2.4.4. Common Windows Issues

PowerShell Execution Policy

If you get “running scripts is disabled”:

Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser

Path Too Long Errors

Clone to a short path (e.g., C:\dev\tensorlab) or enable long paths.

Line Ending Issues

Configure Git to commit Unix-style line endings, matching the “Checkout as-is, commit Unix-style line endings” choice from Step 2:

git config --global core.autocrlf input

Virtual Environment Activation

If venv\Scripts\Activate.ps1 fails:

# Alternative activation
.\venv\Scripts\activate.bat

# Or run Python directly
.\venv\Scripts\python.exe -m tinyml_modelmaker ...
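Because activation can fail silently under a restrictive execution policy, it can help to confirm the venv is actually in use. This cross-platform, standard-library sketch is not Tensorlab-specific:

```python
# Check whether the running interpreter belongs to a virtual environment.
import sys

def in_virtualenv():
    """True when sys.prefix differs from the base interpreter's prefix."""
    return sys.prefix != getattr(sys, "base_prefix", sys.prefix)

if __name__ == "__main__":
    where = sys.prefix if in_virtualenv() else "system interpreter"
    print(f"virtualenv active: {in_virtualenv()} ({where})")
```

If this prints False after activation, fall back to invoking `.\venv\Scripts\python.exe` directly as shown above.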

2.4.5. GPU Support on Windows

For CUDA support (useful for NAS):

  1. Install NVIDIA drivers from https://www.nvidia.com/drivers

  2. Install CUDA Toolkit from https://developer.nvidia.com/cuda-downloads

  3. Verify with:

python -c "import torch; print(torch.cuda.is_available())"

2.4.6. WSL2 vs Native Windows Comparison

Feature             Native Windows         WSL2
------------------  ---------------------  ----------------------
Setup Complexity    Moderate               Higher initially
Shell Scripts       Requires .bat files    Full bash support
Compilation         Full support           Full support
GPU Support         Native CUDA            WSL2 CUDA (newer)
CCS Integration     Direct                 Requires file sharing
Recommended For     Quick start            Full development

2.4.7. Next Steps