rocm-chatterbox-whisper/docker-compose.yml
scott b68bccb20f
Some checks failed: Build ROCm Image / build (push) was cancelled.
Revert to torch 2.5.1 + ROCm 6.1 (known working combination)
Installing PyTorch 2.11.0 wheels built for ROCm 7.2 on rocm/dev-ubuntu-22.04:latest
causes MIOpen version mismatches that force every convolution onto a slow
zero-workspace fallback path (41 s per synthesis). The existing working project
uses torch 2.5.1 + ROCm 6.1 successfully on the same base image.
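The revert presumably comes down to pinning the wheel index in Dockerfile.rocm. A minimal sketch (the actual Dockerfile contents are not shown here; the index URL is the standard PyTorch ROCm 6.1 wheel index):

```shell
# Sketch of the pinned install step, assuming pip wheels from the
# official PyTorch index. Matching torch's bundled ROCm/MIOpen to the
# base image avoids the MIOpen version-mismatch fallback path.
pip install --no-cache-dir torch==2.5.1 \
    --index-url https://download.pytorch.org/whl/rocm6.1
```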

Also remove the MIOPEN_FIND_ENFORCE override; it is unnecessary with matched versions.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-05 12:34:06 -04:00


services:
  chatterbox-whisper:
    image: git.sdgarren.com/scott/rocm-chatterbox-whisper:latest
    build:
      context: .
      dockerfile: Dockerfile.rocm
    restart: unless-stopped
    ports:
      - "${WYOMING_PORT:-10200}:10200"
    devices:
      - /dev/kfd
      - /dev/dri
    group_add:
      - video
      - render
    ipc: host
    shm_size: 8g
    security_opt:
      - seccomp=unconfined
    volumes:
      - ./config.yaml:/app/config.yaml
      - ./voices:/app/voices
      - ./reference_audio:/app/reference_audio
      - hf_cache:/app/hf_cache
      - miopen_cache:/app/miopen_cache
    environment:
      - HF_HUB_ENABLE_HF_TRANSFER=1
      # Persist MIOpen's kernel cache so it doesn't recompile on every start.
      - MIOPEN_USER_DB_PATH=/app/miopen_cache
      - MIOPEN_CUSTOM_CACHE_DIR=/app/miopen_cache
      # HSA_OVERRIDE_GFX_VERSION is only needed for GPUs not officially supported
      # by the ROCm version in the base image. Common values:
      #   10.3.0 = RX 5000/6000 series
      #   11.0.0 = RX 7000 series
      #   9.0.6  = Vega
      # - HSA_OVERRIDE_GFX_VERSION=10.3.0
      # - HF_TOKEN=your_token_here

volumes:
  hf_cache:
  miopen_cache:
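After `docker compose up -d`, a quick sanity check can confirm that the pinned torch build actually sees the GPU through the passed-in devices. This is a sketch: it assumes `python3` and `torch` are on the container's default PATH, which this compose file does not itself guarantee.

```shell
# Hypothetical verification inside the running container:
docker compose exec chatterbox-whisper python3 - <<'EOF'
import torch
# Expect a 2.5.1 ROCm build after the revert, e.g. "2.5.1+rocm6.1".
print(torch.__version__)
# Should be True when /dev/kfd and /dev/dri are passed through correctly.
print(torch.cuda.is_available())
EOF
```

If `is_available()` returns False, the usual suspects are the device mappings, the `video`/`render` group membership, or an unsupported GPU needing the commented-out HSA_OVERRIDE_GFX_VERSION.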