Compare commits
4 commits: 93b6cd136c ... 44e71fd3a5
Commits: 44e71fd3a5, 40daf20809, ed12f04549, 1f527476e6

README.md (73 changed lines)
@@ -1,10 +1,13 @@
 # TrueMigration
 
-A Python CLI tool for migrating TrueNAS configuration from a debug archive to a live destination system. Designed for systems integration teams working in pre-production deployment environments.
+A Python CLI tool for migrating SMB and NFS share configuration to a live TrueNAS destination system. Designed for systems integration teams working in pre-production deployment environments.
 
 ## What It Does
 
-TrueMigration reads configuration data out of a TrueNAS debug archive (the `.tgz` produced by **System → Save Debug** in the TrueNAS UI) and re-creates that configuration on a destination TrueNAS system via its WebSocket API.
+TrueMigration reads share configuration from a source and re-creates it on a destination TrueNAS system via its WebSocket API. Two source types are supported:
+
+- **TrueNAS debug archive** — the `.tgz` produced by **System → Save Debug** in the TrueNAS UI
+- **CSV files** — customer-supplied spreadsheets for migrating from non-TrueNAS sources
 
 **Currently supported:**
 - SMB shares
@@ -22,19 +25,19 @@ TrueMigration reads configuration data out of a TrueNAS debug archive (the `.tgz
 
 ### Interactive Mode (recommended)
 
-Drop a TrueNAS debug archive in the current directory and run with no arguments. The wizard will guide you through archive selection, destination configuration, per-share selection, a dry run preview, and final confirmation before making any changes.
+Run with no arguments. The wizard will guide you through source selection, destination configuration, per-share filtering, a dry run preview, and final confirmation before making any changes.
 
-```bash
-python deploy.py
-```
-
-or
-
 ```bash
 python -m truenas_migrate
 ```
 
-### Command Line Mode
+or
+
+```bash
+python deploy.py
+```
+
+### Command Line Mode — Archive Source
 
 ```bash
 # Inspect the archive before doing anything
@@ -61,6 +64,47 @@ python -m truenas_migrate \
     --migrate smb
 ```
 
+### Command Line Mode — CSV Source
+
+Fill in the provided template files and pass them on the command line. You can supply one or both.
+
+```bash
+# Dry run from CSV files
+python -m truenas_migrate \
+    --smb-csv smb_shares.csv \
+    --nfs-csv nfs_shares.csv \
+    --dest 192.168.1.50 \
+    --api-key "1-xxxxxxxxxxxx" \
+    --dry-run
+
+# Live migration — SMB only
+python -m truenas_migrate \
+    --smb-csv smb_shares.csv \
+    --dest 192.168.1.50 \
+    --api-key "1-xxxxxxxxxxxx"
+```
+
+### CSV Templates
+
+Copy and fill in the templates included in this repository:
+
+| File | Purpose |
+|------|---------|
+| `smb_shares_template.csv` | One row per SMB share |
+| `nfs_shares_template.csv` | One row per NFS export |
+
+Each template includes a header row and example data rows to replace. Lines beginning with `#` are ignored by the parser, so you can add your own comment rows.
+
+**SMB columns:** `Share Name` *(required)*, `Path` *(required)*, `Description`, `Purpose`, `Read Only`, `Browsable`, `Guest Access`, `Access-Based Enumeration`, `Hosts Allow`, `Hosts Deny`, `Time Machine`, `Enabled`
+
+**NFS columns:** `Path` *(required)*, `Description`, `Read Only`, `Map Root User`, `Map Root Group`, `Map All User`, `Map All Group`, `Security`, `Allowed Hosts`, `Allowed Networks`, `Enabled`
+
+Boolean columns (`Read Only`, `Browsable`, etc.) accept `true` or `false`. List columns (`Hosts Allow`, `Hosts Deny`, `Security`, `Allowed Hosts`, `Allowed Networks`) accept space-separated values.
+
+Valid `Purpose` values: `NO_PRESET`, `DEFAULT_SHARE`, `ENHANCED_TIMEMACHINE`, `MULTI_PROTOCOL_NFS`, `PRIVATE_DATASETS`, `WORM_DROPBOX`
+
+Valid `Security` values: `SYS`, `KRB5`, `KRB5I`, `KRB5P`
+
 ### Generating an API Key
 
 In the TrueNAS UI: top-right account menu → **API Keys** → **Add**.
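The CSV value rules described above (booleans, space-separated lists, `#` comment rows) can be sketched with nothing but the standard library. This is an illustration only, not the tool's actual parser (which lives in `truenas_migrate/csv_source.py`); the helper names here are hypothetical.

```python
# Sketch of the CSV value rules: true/false booleans, space-separated
# lists, and comment rows that start with '#'. Illustrative only.
import csv
import io

def to_bool(value: str) -> bool:
    # Boolean columns accept true/false; blank or anything else is False here.
    return value.strip().lower() in ("true", "yes", "1")

def to_list(value: str) -> list[str]:
    # List columns accept space-separated values.
    return value.split()

raw = """Share Name,Path,Read Only,Hosts Allow
# comment rows starting with '#' are ignored
Accounting,/mnt/tank/accounting,false,192.168.1.10 192.168.1.11
"""

rows = []
for row in csv.DictReader(io.StringIO(raw)):
    if (row["Share Name"] or "").strip().startswith("#"):
        continue  # skip comment rows
    rows.append({
        "name": row["Share Name"],
        "path": row["Path"],
        "ro": to_bool(row["Read Only"]),
        "hostsallow": to_list(row["Hosts Allow"]),
    })

print(rows[0]["hostsallow"])  # ['192.168.1.10', '192.168.1.11']
```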
@@ -79,21 +123,24 @@ Always run with `--dry-run` first to preview what will and won't be created.
 ## Archive Compatibility
 
 | Source version | Archive format | Notes |
-|----------------|------------------------|--------------------------------------------|
+|----------------|-------------------------|---------------------------------------------|
 | SCALE 24.04+ | ixdiagnose (lowercase) | Combined JSON plugin files |
 | SCALE (older) | ixdiagnose (uppercase) | Per-query JSON files |
-| CORE | freenas-debug/fndebug | Plain-text dumps with embedded JSON blocks |
-| HA bundles | Outer .tgz + inner .txz| Active node archive selected automatically |
+| CORE | freenas-debug / fndebug | Plain-text dumps with embedded JSON blocks |
+| HA bundles | Outer .tgz + inner .txz | Active node archive selected automatically |
 
 ## Project Structure
 
 ```
 deploy.py                   # Entry point shim
+smb_shares_template.csv     # SMB CSV template for customers
+nfs_shares_template.csv     # NFS CSV template for customers
 truenas_migrate/
     __main__.py             # python -m truenas_migrate
     colors.py               # ANSI color helpers and shared logger
     summary.py              # Migration summary and report
     archive.py              # Debug archive parser
+    csv_source.py           # CSV parser for non-TrueNAS sources
     client.py               # TrueNAS WebSocket API client
     migrate.py              # SMB and NFS migration routines
     cli.py                  # Interactive wizard and argument parser
nfs_shares_template.csv (new file)
@@ -0,0 +1,3 @@
Path,Description,Read Only,Map Root User,Map Root Group,Map All User,Map All Group,Security,Allowed Hosts,Allowed Networks,Enabled
/mnt/tank/data,Primary data export,false,root,wheel,,,SYS,,192.168.1.0/24,true
/mnt/tank/media,Media files read-only,true,,,,,,,,true

smb_shares_template.csv (new file)
@@ -0,0 +1,3 @@
Share Name,Path,Description,Purpose,Read Only,Browsable,Guest Access,Access-Based Enumeration,Hosts Allow,Hosts Deny,Time Machine,Enabled
Accounting,/mnt/tank/accounting,Accounting department files,NO_PRESET,false,true,false,false,,,false,true
Public,/mnt/tank/public,Public read-only share,NO_PRESET,true,true,true,false,,,false,true
@@ -1,9 +1,9 @@
 """
 truenas_migrate – TrueNAS Share Migration Tool
 =================================================
-Reads SMB shares and NFS shares from a TrueNAS debug archive (.tar / .tgz)
-produced by the built-in "Save Debug" feature, then re-creates them on a
-destination TrueNAS system via the JSON-RPC 2.0 WebSocket API (TrueNAS 25.04+).
+Reads SMB shares and NFS shares from either a TrueNAS debug archive (.tar / .tgz)
+or customer-supplied CSV files, then re-creates them on a destination TrueNAS
+system via the JSON-RPC 2.0 WebSocket API (TrueNAS 25.04+).
 
 SAFE BY DEFAULT
 • Existing shares are never overwritten or deleted.
@@ -12,7 +12,7 @@ SAFE BY DEFAULT
 REQUIREMENTS
 Python 3.9+ (stdlib only – no external packages needed)
 
-QUICK START
+QUICK START — Archive source
 # 1. Inspect your debug archive to confirm it contains the data you need:
 python -m truenas_migrate --debug-tar debug.tgz --list-archive
 
@@ -29,12 +29,14 @@ QUICK START
     --dest 192.168.1.50 \\
     --api-key "1-xxxxxxxxxxxx"
 
-# 4. Migrate only SMB shares (skip NFS):
+QUICK START — CSV source
+# Fill in smb_shares_template.csv / nfs_shares_template.csv, then:
 python -m truenas_migrate \\
-    --debug-tar debug.tgz \\
+    --smb-csv smb_shares.csv \\
+    --nfs-csv nfs_shares.csv \\
     --dest 192.168.1.50 \\
     --api-key "1-xxxxxxxxxxxx" \\
-    --migrate smb
+    --dry-run
 
 CONFLICT POLICY
 Shares that already exist on the destination are silently skipped:
@@ -53,7 +55,8 @@ from typing import Optional
 
 from .archive import parse_archive, list_archive_and_exit
 from .client import TrueNASClient, check_dataset_paths, create_missing_datasets
-from .colors import log, _bold, _bold_cyan, _bold_yellow, _cyan, _dim, _green, _yellow
+from .colors import log, _bold, _bold_cyan, _bold_red, _bold_yellow, _cyan, _dim, _green, _yellow
+from .csv_source import parse_csv_sources
 from .migrate import migrate_smb_shares, migrate_nfs_shares
 from .summary import Summary
 
@@ -67,7 +70,13 @@ async def run(
     archive: Optional[dict] = None,
 ) -> Summary:
     if archive is None:
-        archive = parse_archive(args.debug_tar)
+        smb_csv = getattr(args, "smb_csv", None)
+        nfs_csv = getattr(args, "nfs_csv", None)
+        if smb_csv or nfs_csv:
+            archive = parse_csv_sources(smb_csv, nfs_csv)
+        else:
+            archive = parse_archive(args.debug_tar)
 
     migrate_set = set(args.migrate)
 
     if args.dry_run:
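The `getattr(args, "smb_csv", None)` calls in the hunk above are deliberate rather than plain attribute access: the interactive wizard constructs its own `argparse.Namespace`, which may simply omit the CSV attributes. A minimal sketch of why the fallback matters (the attribute names mirror the diff; the namespace contents here are made up):

```python
# A hand-built Namespace (as the interactive wizard produces) may lack
# attributes that the command-line parser would always define.
import argparse

ns = argparse.Namespace(debug_tar="debug.tgz", dest="192.168.1.50")

# ns.smb_csv would raise AttributeError; getattr with a default does not.
smb_csv = getattr(ns, "smb_csv", None)
print(smb_csv is None)  # True
```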
@@ -133,6 +142,20 @@ def _confirm(label: str) -> bool:
         return False
 
 
+def _prompt_csv_path(share_type: str) -> Optional[str]:
+    """Prompt for a CSV file path. Returns resolved path string or None if skipped."""
+    template = f"{share_type.lower()}_shares_template.csv"
+    print(f"  {_dim('(template: ' + template + ')')}")
+    while True:
+        raw = _prompt(f"  {share_type} shares CSV path (Enter to skip)")
+        if not raw:
+            return None
+        p = Path(raw)
+        if p.is_file():
+            return str(p)
+        print(f"  {_bold_red('File not found:')} {raw}")
+
+
 def _select_shares(shares: list[dict], share_type: str) -> list[dict]:
     """
     Display a numbered list of *shares* and return only those the user selects.
@@ -141,7 +164,7 @@ def _select_shares(shares: list[dict], share_type: str) -> list[dict]:
     if not shares:
         return shares
 
-    print(f"\n  {_bold(f'{share_type} shares in archive ({len(shares)}):')} \n")
+    print(f"\n  {_bold(f'{share_type} shares ({len(shares)}):')} \n")
     for i, share in enumerate(shares, 1):
         if share_type == "SMB":
             name = share.get("name", "<unnamed>")
@@ -189,13 +212,63 @@ def _select_shares(shares: list[dict], share_type: str) -> list[dict]:
 # ─────────────────────────────────────────────────────────────────────────────
 
 def interactive_mode() -> None:
-    """Interactive wizard: pick archive → configure → dry run → confirm → apply."""
+    """Interactive wizard: pick source → configure → dry run → confirm → apply."""
     print(
         f"\n{_bold_cyan(' TrueNAS Share Migration Tool')}\n"
-        f" {_dim('Migrate SMB/NFS shares from a debug archive to a live system.')}\n"
+        f" {_dim('Migrate SMB/NFS shares to a live TrueNAS system.')}\n"
     )
 
-    # 1 ── Locate debug archive ────────────────────────────────────────────────
+    # 1 ── Source type ──────────────────────────────────────────────────────────
+    print(f"  {_bold('Source type:')}")
+    print(f"    {_cyan('1.')} TrueNAS debug archive (.tgz / .tar)")
+    print(f"    {_cyan('2.')} CSV import (non-TrueNAS source)")
+    src_raw = _prompt("  Select source [1/2]", default="1")
+    use_csv = src_raw.strip() == "2"
+    print()
+
+    # 2 ── Destination ──────────────────────────────────────────────────────────
+    host = ""
+    while not host:
+        host = _prompt("Destination TrueNAS host or IP")
+        if not host:
+            print("  Host is required.")
+
+    port_raw = _prompt("WebSocket port", default="443")
+    port = int(port_raw) if port_raw.isdigit() else 443
+
+    # 3 ── API key ──────────────────────────────────────────────────────────────
+    api_key = ""
+    while not api_key:
+        try:
+            api_key = getpass.getpass("API key (input hidden): ").strip()
+        except (EOFError, KeyboardInterrupt):
+            print()
+            sys.exit(0)
+        if not api_key:
+            print("  API key is required.")
+
+    if use_csv:
+        # ── CSV source ──────────────────────────────────────────────────────────
+        print(f"\n  {_bold('CSV file paths:')}")
+        print(f"  {_dim('Press Enter to skip a share type.')}\n")
+        smb_csv_path = _prompt_csv_path("SMB")
+        print()
+        nfs_csv_path = _prompt_csv_path("NFS")
+
+        migrate: list[str] = []
+        if smb_csv_path:
+            migrate.append("smb")
+        if nfs_csv_path:
+            migrate.append("nfs")
+        if not migrate:
+            sys.exit("No CSV files provided – nothing to migrate.")
+
+        print()
+        archive_data = parse_csv_sources(smb_csv_path, nfs_csv_path)
+        extra_ns: dict = {"smb_csv": smb_csv_path, "nfs_csv": nfs_csv_path}
+
+    else:
+        # ── Archive source ──────────────────────────────────────────────────────
-    archives = _find_debug_archives()
-    if not archives:
-        sys.exit(
+        archives = _find_debug_archives()
+        if not archives:
+            sys.exit(
@@ -221,29 +294,7 @@
-                break
-            print(f"  Enter a number from 1 to {len(archives)}.")
-
-    # 2 ── Destination ─────────────────────────────────────────────────────────
-    print()
-    host = ""
-    while not host:
-        host = _prompt("Destination TrueNAS host or IP")
-        if not host:
-            print("  Host is required.")
-
-    port_raw = _prompt("WebSocket port", default="443")
-    port = int(port_raw) if port_raw.isdigit() else 443
-
-    # 3 ── API key ─────────────────────────────────────────────────────────────
-    api_key = ""
-    while not api_key:
-        try:
-            api_key = getpass.getpass("API key (input hidden): ").strip()
-        except (EOFError, KeyboardInterrupt):
-            print()
-            sys.exit(0)
-        if not api_key:
-            print("  API key is required.")
-
-    # 4 ── Migration scope ─────────────────────────────────────────────────────
-    print(f"\n  {_bold('What to migrate?')}")
-    print(f"    {_cyan('1.')} SMB shares")
-    print(f"    {_cyan('2.')} NFS shares")
+                    break
+                print(f"  Enter a number from 1 to {len(archives)}.")
+
+        # ── Migration scope ─────────────────────────────────────────────────────
+        print(f"\n  {_bold('What to migrate?')}")
+        print(f"    {_cyan('1.')} SMB shares")
+        print(f"    {_cyan('2.')} NFS shares")
@@ -251,18 +302,19 @@
-        "Selection (space-separated numbers, Enter for all)", default="1 2"
-    )
-    _sel_map = {"1": "smb", "2": "nfs"}
-    migrate: list[str] = []
-    for tok in sel_raw.split():
-        if tok in _sel_map and _sel_map[tok] not in migrate:
-            migrate.append(_sel_map[tok])
-    if not migrate:
-        migrate = ["smb", "nfs"]
-
-    # 5 ── Parse archive once (reused for dry + live runs) ────────────────────
-    print()
-    archive_data = parse_archive(str(chosen))
+            "Selection (space-separated numbers, Enter for all)", default="1 2"
+        )
+        _sel_map = {"1": "smb", "2": "nfs"}
+        migrate = []
+        for tok in sel_raw.split():
+            if tok in _sel_map and _sel_map[tok] not in migrate:
+                migrate.append(_sel_map[tok])
+        if not migrate:
+            migrate = ["smb", "nfs"]
+
+        # ── Parse archive ───────────────────────────────────────────────────────
+        print()
+        archive_data = parse_archive(str(chosen))
+        extra_ns = {"debug_tar": str(chosen)}
 
-    # 5b ── Select individual shares ───────────────────────────────────────────
+    # ── Select individual shares (common) ──────────────────────────────────────
     if "smb" in migrate and archive_data["smb_shares"]:
         archive_data["smb_shares"] = _select_shares(archive_data["smb_shares"], "SMB")
     if "nfs" in migrate and archive_data["nfs_shares"]:
@@ -270,15 +322,15 @@
     print()
 
     base_ns = dict(
-        debug_tar=str(chosen),
         dest=host,
         port=port,
         api_key=api_key,
        verify_ssl=False,
        migrate=migrate,
+        **extra_ns,
    )
 
-    # 6 ── Dry run ─────────────────────────────────────────────────────────────
+    # 6 ── Dry run ──────────────────────────────────────────────────────────────
    dry_summary = asyncio.run(
        run(argparse.Namespace(**base_ns, dry_run=True), archive_data)
    )
@@ -314,7 +366,7 @@
         print("Aborted – no changes made.")
         sys.exit(0)
 
-    # 7 ── Live run ────────────────────────────────────────────────────────────
+    # 7 ── Live run ─────────────────────────────────────────────────────────────
     print()
     live_summary = asyncio.run(
         run(argparse.Namespace(**base_ns, dry_run=False), archive_data)
@@ -336,23 +388,32 @@ def main() -> None:
     p = argparse.ArgumentParser(
         prog="truenas_migrate",
         description=(
-            "Migrate SMB and NFS shares from a TrueNAS debug archive "
-            "to a live destination system."
+            "Migrate SMB and NFS shares to a live TrueNAS destination system. "
+            "Source can be a TrueNAS debug archive or customer-supplied CSV files."
         ),
         formatter_class=argparse.RawDescriptionHelpFormatter,
         epilog=__doc__,
     )
 
     # ── Source ────────────────────────────────────────────────────────────────
-    p.add_argument(
-        "--debug-tar", required=True, metavar="FILE",
+    src = p.add_argument_group("source (choose one)")
+    src.add_argument(
+        "--debug-tar", metavar="FILE",
         help="Path to the TrueNAS debug .tar / .tgz from the SOURCE system.",
     )
+    src.add_argument(
+        "--smb-csv", metavar="FILE",
+        help="Path to a CSV file containing SMB share definitions (non-TrueNAS source).",
+    )
+    src.add_argument(
+        "--nfs-csv", metavar="FILE",
+        help="Path to a CSV file containing NFS share definitions (non-TrueNAS source).",
+    )
     p.add_argument(
         "--list-archive", action="store_true",
         help=(
             "List all JSON files found in the archive and exit. "
-            "Run this first to verify the archive contains share data."
+            "Requires --debug-tar."
         ),
     )
 
@@ -396,7 +457,7 @@ def main() -> None:
     )
     p.add_argument(
         "--dry-run", action="store_true",
-        help="Parse archive and connect to destination, but make no changes.",
+        help="Parse source and connect to destination, but make no changes.",
     )
     p.add_argument(
         "--verbose", "-v", action="store_true",
@@ -408,14 +469,32 @@ def main() -> None:
     if args.verbose:
         log.setLevel(logging.DEBUG)
 
+    has_archive = bool(args.debug_tar)
+    has_csv = bool(args.smb_csv or args.nfs_csv)
+
+    if has_archive and has_csv:
+        p.error("Cannot combine --debug-tar with --smb-csv / --nfs-csv.")
+
+    if not has_archive and not has_csv:
+        p.error(
+            "Specify a source: --debug-tar FILE or --smb-csv / --nfs-csv FILE(s)."
+        )
+
+    if has_archive:
-    if not Path(args.debug_tar).is_file():
-        p.error(f"Archive not found: {args.debug_tar}")
+        if not Path(args.debug_tar).is_file():
+            p.error(f"Archive not found: {args.debug_tar}")
 
-    if args.list_archive:
-        list_archive_and_exit(args.debug_tar)  # does not return
+        if args.list_archive:
+            list_archive_and_exit(args.debug_tar)  # does not return
+    else:
+        if args.list_archive:
+            p.error("--list-archive requires --debug-tar.")
+        if args.smb_csv and not Path(args.smb_csv).is_file():
+            p.error(f"SMB CSV not found: {args.smb_csv}")
+        if args.nfs_csv and not Path(args.nfs_csv).is_file():
+            p.error(f"NFS CSV not found: {args.nfs_csv}")
 
     if not args.dest:
-        p.error("--dest is required (or use --list-archive to inspect the archive).")
+        p.error("--dest is required.")
     if not args.api_key:
         p.error("--api-key is required.")
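The source-selection rules enforced in `main()` (exactly one of archive or CSV input) can be exercised in isolation. The sketch below mirrors that logic in a hypothetical mini-parser; it is an illustration of the validation pattern, not the actual `truenas_migrate` code.

```python
# Sketch of the "choose exactly one source" validation: argparse's
# built-in mutually-exclusive groups can't express "either --debug-tar
# OR one-or-both of --smb-csv/--nfs-csv", so it is checked by hand.
import argparse

def build_parser() -> argparse.ArgumentParser:
    p = argparse.ArgumentParser(prog="demo")
    src = p.add_argument_group("source (choose one)")
    src.add_argument("--debug-tar", metavar="FILE")
    src.add_argument("--smb-csv", metavar="FILE")
    src.add_argument("--nfs-csv", metavar="FILE")
    return p

def validate(p: argparse.ArgumentParser, args: argparse.Namespace) -> str:
    has_archive = bool(args.debug_tar)
    has_csv = bool(args.smb_csv or args.nfs_csv)
    if has_archive and has_csv:
        p.error("Cannot combine --debug-tar with --smb-csv / --nfs-csv.")
    if not has_archive and not has_csv:
        p.error("Specify a source.")
    return "archive" if has_archive else "csv"

p = build_parser()
print(validate(p, p.parse_args(["--smb-csv", "shares.csv"])))  # csv
```

Note that `p.error()` prints a usage message and exits with status 2, which is why combining sources aborts immediately.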
truenas_migrate/csv_source.py (new file, 209 lines)
@@ -0,0 +1,209 @@
"""CSV source parser – reads SMB/NFS share definitions from customer-supplied CSV files."""
from __future__ import annotations

import csv
import sys
from pathlib import Path
from typing import Any

from .colors import log


# ─────────────────────────────────────────────────────────────────────────────
# Column name mappings (human-readable header → API field name)
# Both the friendly names and the raw API names are accepted.
# ─────────────────────────────────────────────────────────────────────────────

_SMB_COL_MAP: dict[str, str] = {
    "share name": "name",
    "path": "path",
    "description": "comment",
    "purpose": "purpose",
    "read only": "ro",
    "browsable": "browsable",
    "guest access": "guestok",
    "access-based enumeration": "abe",
    "hosts allow": "hostsallow",
    "hosts deny": "hostsdeny",
    "time machine": "timemachine",
    "enabled": "enabled",
}

_NFS_COL_MAP: dict[str, str] = {
    "path": "path",
    "description": "comment",
    "read only": "ro",
    "map root user": "maproot_user",
    "map root group": "maproot_group",
    "map all user": "mapall_user",
    "map all group": "mapall_group",
    "security": "security",
    "allowed hosts": "hosts",
    "allowed networks": "networks",
    "enabled": "enabled",
}


# ─────────────────────────────────────────────────────────────────────────────
# Column type metadata (keyed by API field name)
# ─────────────────────────────────────────────────────────────────────────────

# Columns coerced to bool
_SMB_BOOL_COLS = frozenset({"ro", "browsable", "guestok", "abe", "timemachine", "enabled"})
# Columns coerced to list[str] (space-or-comma-separated in CSV)
_SMB_LIST_COLS = frozenset({"hostsallow", "hostsdeny"})
_SMB_REQUIRED = frozenset({"name", "path"})

_NFS_BOOL_COLS = frozenset({"ro", "enabled"})
_NFS_LIST_COLS = frozenset({"security", "hosts", "networks"})
_NFS_REQUIRED = frozenset({"path"})


# ─────────────────────────────────────────────────────────────────────────────
# Internal helpers
# ─────────────────────────────────────────────────────────────────────────────

def _parse_bool(value: str, col: str, row_num: int) -> bool:
    v = value.strip().lower()
    if v in ("true", "yes", "1"):
        return True
    if v in ("false", "no", "0", ""):
        return False
    log.warning("  row %d: unrecognised boolean %r for column %r – treating as False",
                row_num, value, col)
    return False


def _parse_list(value: str) -> list[str]:
    """Split space-or-comma-separated value into a list, dropping blanks."""
    return [p for p in value.replace(",", " ").split() if p]


def _coerce_row(
    row: dict[str, str],
    bool_cols: frozenset[str],
    list_cols: frozenset[str],
    required: frozenset[str],
    row_num: int,
) -> dict[str, Any] | None:
    """Validate and type-coerce one CSV row. Returns None to skip the row."""
    if not any((v or "").strip() for v in row.values()):
        return None  # blank row

    first_val = next(iter(row.values()), "") or ""
    if first_val.strip().startswith("#"):
        return None  # comment row

    result: dict[str, Any] = {}
    for col, raw in row.items():
        if col is None:
            continue
        col = col.strip()
        val = (raw or "").strip()

        if not val:
            continue  # omit empty optional fields; API uses its defaults

        if col in bool_cols:
            result[col] = _parse_bool(val, col, row_num)
        elif col in list_cols:
            result[col] = _parse_list(val)
        else:
            result[col] = val

    for req in required:
        if req not in result:
            log.warning("  row %d: missing required field %r – skipping row", row_num, req)
            return None

    return result


def _normalize_col(col: str, col_map: dict[str, str]) -> str:
    """Map a header name to its API field name; falls back to the lowercased original."""
    key = col.strip().lower()
    return col_map.get(key, key)


def _parse_csv(
    csv_path: str,
    bool_cols: frozenset[str],
    list_cols: frozenset[str],
    required: frozenset[str],
    col_map: dict[str, str],
    label: str,
) -> list[dict]:
    path = Path(csv_path)
    if not path.is_file():
        log.error("%s CSV file not found: %s", label, csv_path)
        sys.exit(1)

    shares: list[dict] = []
    try:
        with path.open(newline="", encoding="utf-8-sig") as fh:
            reader = csv.DictReader(fh)
            if reader.fieldnames is None:
                log.error("%s CSV has no header row: %s", label, csv_path)
                sys.exit(1)

            # Normalise header names using the column map
            normalised_header = {
                _normalize_col(c, col_map)
                for c in reader.fieldnames if c is not None
            }
            missing_req = required - normalised_header
            if missing_req:
                log.error(
                    "%s CSV is missing required column(s): %s",
                    label, ", ".join(sorted(missing_req)),
                )
                sys.exit(1)

            for row_num, row in enumerate(reader, start=2):
                normalised = {
                    _normalize_col(k, col_map): v
                    for k, v in row.items() if k is not None
                }
                share = _coerce_row(normalised, bool_cols, list_cols, required, row_num)
                if share is not None:
                    shares.append(share)

    except OSError as exc:
        log.error("Cannot read %s CSV: %s", label, exc)
        sys.exit(1)

    log.info("  %-14s → %s (%d share(s))", label.lower() + "_shares", csv_path, len(shares))
    return shares


# ─────────────────────────────────────────────────────────────────────────────
# Public API
# ─────────────────────────────────────────────────────────────────────────────

def parse_smb_csv(csv_path: str) -> list[dict]:
    """Parse an SMB shares CSV. Returns share dicts compatible with migrate.py."""
    return _parse_csv(csv_path, _SMB_BOOL_COLS, _SMB_LIST_COLS, _SMB_REQUIRED, _SMB_COL_MAP, "SMB")


def parse_nfs_csv(csv_path: str) -> list[dict]:
    """Parse an NFS shares CSV. Returns share dicts compatible with migrate.py."""
    return _parse_csv(csv_path, _NFS_BOOL_COLS, _NFS_LIST_COLS, _NFS_REQUIRED, _NFS_COL_MAP, "NFS")


def parse_csv_sources(smb_csv: str | None, nfs_csv: str | None) -> dict[str, Any]:
    """
    Parse one or both CSV files.
    Returns {"smb_shares": list, "nfs_shares": list} — same shape as parse_archive().
    """
    log.info("Loading shares from CSV source(s).")
    result: dict[str, Any] = {"smb_shares": [], "nfs_shares": []}
    if smb_csv:
        result["smb_shares"] = parse_smb_csv(smb_csv)
    if nfs_csv:
        result["nfs_shares"] = parse_nfs_csv(nfs_csv)
    log.info(
        "Loaded: %d SMB share(s), %d NFS share(s)",
        len(result["smb_shares"]),
        len(result["nfs_shares"]),
    )
    return result
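One non-obvious choice in `_parse_csv` above is opening files with `encoding="utf-8-sig"`: spreadsheets exported from Excel frequently begin with a UTF-8 byte-order mark, and without the `-sig` codec that BOM becomes part of the first header name, silently breaking the required-column check. A self-contained sketch of the difference (the file name here is hypothetical):

```python
# Demonstrate why utf-8-sig matters for Excel-exported CSVs: plain utf-8
# leaves the BOM glued to the first header, utf-8-sig strips it.
import csv
import tempfile
from pathlib import Path

with tempfile.TemporaryDirectory() as tmp:
    f = Path(tmp) / "nfs_shares.csv"
    # Simulate an Excel export: BOM + header + one data row
    f.write_bytes("\ufeffPath,Read Only\n/mnt/tank/data,true\n".encode("utf-8"))

    with f.open(newline="", encoding="utf-8") as fh:      # plain utf-8
        plain = csv.DictReader(fh).fieldnames
    with f.open(newline="", encoding="utf-8-sig") as fh:  # BOM-aware
        sig = csv.DictReader(fh).fieldnames

print(repr(plain[0]))  # '\ufeffPath' – BOM leaks into the header name
print(repr(sig[0]))    # 'Path'
```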