plex-playlist/Dockerfile.cicd
# CICD Complete Setup - Optimized Build Order for Maximum Caching
# OPTIMIZATION STRATEGY:
# Phase 1: Extract dependency files (package.json, pyproject.toml)
# Phase 2: Install dependencies (cached layer, only invalidates when deps change)
# Phase 3: Clone full source code (doesn't bust dependency cache)
# Phase 4-6: Install packages and verify (requires full source)
#
# BENEFITS: Dependency installation (~20-30 minutes) is cached across source code changes
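# Example build invocation (illustrative only; the real CI workflow and image
# tag may differ):
#   docker buildx build -f Dockerfile.cicd \
#     --build-arg GITHUB_SHA="$(git rev-parse HEAD)" \
#     --secret id=ssh_private_key,src="$HOME/.ssh/id_rsa" \
#     -t plex-playlist/cicd-complete:latest .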
ARG CICD_BASE_IMAGE=dogar.darkhelm.org/darkhelm.org/plex-playlist/cicd-base:latest
FROM ${CICD_BASE_IMAGE}
# Build argument for cache busting and Git checkout (no secrets here!)
ARG GITHUB_SHA
ENV GITHUB_SHA=${GITHUB_SHA}
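# GITHUB_SHA pins the Git clone steps below to the commit that triggered the
# build; when it is empty or unreachable in the shallow clone, the clones fall
# back to main branch HEAD.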
# Set working directory
WORKDIR /workspace
# OPTIMIZATION: Extract dependency files first for better layer caching
# Step 1: Clone repository minimally to get dependency files only
RUN --mount=type=secret,id=ssh_private_key \
mkdir -p ~/.ssh && \
cp /run/secrets/ssh_private_key ~/.ssh/id_rsa && \
chmod 600 ~/.ssh/id_rsa && \
echo "Host dogar.darkhelm.org" > ~/.ssh/config && \
echo " Port 2222" >> ~/.ssh/config && \
echo " StrictHostKeyChecking no" >> ~/.ssh/config && \
echo " UserKnownHostsFile /dev/null" >> ~/.ssh/config && \
chmod 600 ~/.ssh/config && \
ssh-keyscan -p 2222 dogar.darkhelm.org >> ~/.ssh/known_hosts 2>/dev/null && \
echo "=== Extracting dependency files for optimized caching ===" && \
GIT_SSH_COMMAND="ssh -F ~/.ssh/config" \
git clone --depth 1 --branch main \
ssh://git@dogar.darkhelm.org:2222/DarkHelm.org/plex-playlist.git /tmp/repo && \
if [ -n "$GITHUB_SHA" ]; then \
cd /tmp/repo && git checkout "$GITHUB_SHA" 2>/dev/null || echo "Using main branch HEAD"; \
fi && \
# Extract only dependency files for caching optimization
mkdir -p /workspace/backend /workspace/frontend && \
cp /tmp/repo/backend/pyproject.toml /workspace/backend/ 2>/dev/null || echo "No backend pyproject.toml" && \
cp /tmp/repo/frontend/package.json /workspace/frontend/ 2>/dev/null || echo "No frontend package.json" && \
cp /tmp/repo/frontend/yarn.lock /workspace/frontend/ 2>/dev/null || echo "No frontend yarn.lock" && \
cp /tmp/repo/.pre-commit-config.yaml /workspace/ 2>/dev/null || echo "No pre-commit config" && \
echo "✓ Dependency files extracted for optimized layer caching" && \
rm -rf /tmp/repo ~/.ssh
# OPTIMIZATION PHASE 1: Install backend dependencies from extracted pyproject.toml
WORKDIR /workspace/backend
ENV VIRTUAL_ENV=/workspace/backend/.venv
# Install backend dependencies first (before source code) for better caching
RUN echo "=== Installing Backend Dependencies (Phase 1: Optimized Caching) ===" && \
# Create project virtual environment
uv venv $VIRTUAL_ENV && \
# Check if base image optimization is available
echo "=== Base Image Optimization Status ===" && \
if [ -f "/opt/python-dev-tools/bin/python" ]; then \
echo "✓ Found pre-installed Python dev tools - leveraging cache" && \
uv pip list --python /opt/python-dev-tools/bin/python --format=freeze > /tmp/base-tools.txt && \
echo "Available pre-installed tools:" && \
head -10 /tmp/base-tools.txt; \
else \
echo "⚠ Pre-installed Python dev tools not found - fresh installation" && \
echo "Base image may need rebuild for optimal caching"; \
fi && \
# Install dependencies from extracted pyproject.toml (this layer will cache!)
if [ -f "pyproject.toml" ]; then \
echo "Installing project dependencies from pyproject.toml..." && \
# Create minimal source structure to satisfy package build requirements
mkdir -p src/backend && \
echo "# Temporary README for dependency caching phase" > ../README.md && \
echo "# Minimal __init__.py for build" > src/backend/__init__.py && \
# Install all dependencies including dev dependencies
uv sync --dev && \
echo "✓ Backend dependencies installed and cached"; \
else \
echo "No pyproject.toml found, skipping dependency installation"; \
fi
ENV PATH="$VIRTUAL_ENV/bin:$PATH"
# OPTIMIZATION PHASE 2: Install frontend dependencies from extracted package.json
WORKDIR /workspace/frontend
# Setup frontend environment and install dependencies (before source code) for better caching
RUN echo "=== Installing Frontend Dependencies (Phase 2: Optimized Caching) ===" && \
echo "Available global tools (installed via npm):" && \
npm list -g --depth=0 2>/dev/null | head -10 || echo "Global npm tools available" && \
which tsc && which eslint && which prettier || echo "Some global tools not found, continuing" && \
# Create temporary swap file for memory-intensive yarn install
dd if=/dev/zero of=/tmp/swapfile bs=1M count=1024 2>/dev/null && \
mkswap /tmp/swapfile && \
swapon /tmp/swapfile || echo "Swap setup failed, continuing without swap"
# Install frontend dependencies from extracted package.json (this layer will cache!)
RUN if [ -f "package.json" ]; then \
echo "Installing frontend dependencies from extracted package.json..." && \
export NODE_OPTIONS="--max-old-space-size=1024 --max-semi-space-size=64" && \
export UV_WORKERS=1 && \
echo "Memory info before install:" && \
free -h || true && \
INSTALL_SUCCESS=false && \
for i in 1 2 3; do \
echo "Attempt $i: Installing project-specific frontend dependencies..." && \
echo "(Common dev tools pre-installed globally for performance)" && \
timeout 2400 yarn install --immutable --mode=skip-build \
&& { INSTALL_SUCCESS=true; break; } || \
(echo "Attempt $i failed, cleaning up and retrying..." && \
rm -rf node_modules .yarn/cache .yarn/install-state.gz && \
yarn cache clean --all 2>/dev/null || true && \
sleep 60); \
done && \
rm -rf .yarn/cache && \
swapoff /tmp/swapfile 2>/dev/null || true && \
rm -f /tmp/swapfile && \
if [ "$INSTALL_SUCCESS" = "false" ]; then \
echo "WARNING: Frontend dependencies installation failed after 3 attempts"; \
echo "Continuing without frontend dependencies for CI/CD environment"; \
touch .frontend-deps-failed; \
else \
echo "✓ Frontend dependencies installed and cached"; \
fi; \
else \
echo "No package.json found, skipping frontend dependencies"; \
fi
# OPTIMIZATION PHASE 3: Now clone full source code (dependencies already cached above)
WORKDIR /workspace
RUN --mount=type=secret,id=ssh_private_key \
echo "=== Cloning Full Source Code (Phase 3: After Dependencies) ===" && \
mkdir -p ~/.ssh && \
cp /run/secrets/ssh_private_key ~/.ssh/id_rsa && \
chmod 600 ~/.ssh/id_rsa && \
echo "Host dogar.darkhelm.org" > ~/.ssh/config && \
echo " Port 2222" >> ~/.ssh/config && \
echo " StrictHostKeyChecking no" >> ~/.ssh/config && \
echo " UserKnownHostsFile /dev/null" >> ~/.ssh/config && \
chmod 600 ~/.ssh/config && \
ssh-keyscan -p 2222 dogar.darkhelm.org >> ~/.ssh/known_hosts 2>/dev/null && \
# Clone full repository (dependencies already installed, this won't bust cache layers)
GIT_SSH_COMMAND="ssh -F ~/.ssh/config" \
git clone --depth 1 --branch main \
ssh://git@dogar.darkhelm.org:2222/DarkHelm.org/plex-playlist.git /tmp/fullrepo && \
if [ -n "$GITHUB_SHA" ]; then \
cd /tmp/fullrepo && git checkout "$GITHUB_SHA" 2>/dev/null || echo "Using main branch HEAD"; \
fi && \
# Copy source code while preserving installed dependencies
echo "Copying source code while preserving installed dependencies..." && \
# Instead of backup/restore, copy selectively to avoid overwriting dependencies
echo "Source files in repo:" && \
ls -la /tmp/fullrepo/ && \
echo "Current workspace state:" && \
find /workspace -name "node_modules" -o -name ".venv" -o -name ".yarn" && \
# Copy source files excluding dependency directories
echo "Copying source files (excluding dependencies)..." && \
# Copy all files and directories except the ones we want to preserve
for item in /tmp/fullrepo/*; do \
basename_item=$(basename "$item"); \
target_path="/workspace/$basename_item"; \
if [ "$basename_item" = "backend" ] && [ -d "/workspace/backend/.venv" ]; then \
echo "Copying backend files while preserving .venv..."; \
# Copy backend files but skip .venv if it exists
find "$item" -mindepth 1 -maxdepth 1 ! -name ".venv" -exec cp -rf {} /workspace/backend/ \;; \
elif [ "$basename_item" = "frontend" ] && ([ -d "/workspace/frontend/node_modules" ] || [ -f "/workspace/frontend/.pnp.cjs" ]); then \
echo "Copying frontend files (will regenerate Yarn state after)..."; \
# Copy all frontend files normally - we'll regenerate Yarn state afterward
cp -rf "$item"/* /workspace/frontend/; \
else \
echo "Copying $basename_item..."; \
cp -rf "$item" /workspace/; \
fi; \
done && \
# Copy hidden files from root (like .gitignore, .dockerignore, etc.), skipping . and ..
cp -rf /tmp/fullrepo/.[!.]* /workspace/ 2>/dev/null || true && \
# Verify dependencies are still there
echo "Final dependency check:" && \
find /workspace -name "node_modules" -o -name ".venv" -o -name ".yarn" && \
echo "✓ Full source code copied, dependencies preserved" && \
rm -rf /tmp/fullrepo ~/.ssh
# PHASE 3.5: Regenerate Yarn PnP state after source code update
WORKDIR /workspace/frontend
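# NOTE: the regeneration below assumes the yarn.lock copied in Phase 3 still
# matches the lockfile used for the cached install in Phase 2; with
# --immutable, yarn aborts the build instead of silently rewriting the lockfile.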
RUN if [ -f "package.json" ] && [ -f ".pnp.cjs" ]; then \
echo "=== Regenerating Yarn PnP State After Source Code Update ===" && \
echo "Source package.json and installed dependencies may have differences..." && \
yarn install --immutable && \
echo "✓ Yarn PnP state regenerated successfully - tools should now work"; \
else \
echo " No Yarn PnP setup detected, skipping state regeneration"; \
fi
# PHASE 4: Install backend package in development mode (requires full source)
WORKDIR /workspace/backend
RUN echo "=== Installing Backend Package in Development Mode ===" && \
uv pip install -e . && \
echo "✓ Backend package installed in development mode"
# PHASE 5: Install pre-commit environments (requires full source with .pre-commit-config.yaml)
WORKDIR /workspace
RUN echo "=== Installing Pre-commit Hook Environments ===" && \
if [ -f ".pre-commit-config.yaml" ]; then \
# Use project's Python environment for pre-commit
cd backend && uv run pre-commit install-hooks && \
echo "✓ Pre-commit hook environments installed successfully"; \
else \
echo "No .pre-commit-config.yaml found, skipping hook installation"; \
fi
# PHASE 6: Playwright browsers optimization check (may be pre-installed in base image)
WORKDIR /workspace/frontend
RUN if [ -f ".frontend-deps-failed" ]; then \
echo "Frontend dependencies failed - Playwright E2E tests will be skipped"; \
elif [ -f "package.json" ] && grep -q '@playwright/test' package.json && [ -d "node_modules" ]; then \
echo "Checking Playwright browser optimization status..." && \
# Check if Playwright CLI is available via yarn (from project dependencies)
if yarn playwright --version >/dev/null 2>&1; then \
echo "✓ Playwright CLI available via yarn" && \
# Check if browsers are pre-installed from base image
if yarn playwright install --dry-run >/dev/null 2>&1; then \
echo "✓ Playwright browsers available from base image optimization"; \
else \
echo "⚠ Playwright browsers not pre-installed - will install on demand in CI"; \
fi; \
else \
echo "⚠ Playwright CLI not available - E2E setup will be handled in CI"; \
fi && \
echo "✓ Playwright environment checked"; \
else \
echo " No Playwright tests configured in this project"; \
fi
# Verify all tools are working with the project
RUN cd /workspace/backend && \
echo "=== Backend Tools Verification ===" && \
uv run ruff --version && \
uv run pyright --version && \
uv run darglint --version && \
uv run pytest --version && \
uv run yamllint --version && \
uv run toml-sort --version && \
uv run xdoctest --version && \
uv run pre-commit --version
RUN cd /workspace/frontend && \
echo "=== Frontend Tools Verification ===" && \
if [ -f ".frontend-deps-failed" ]; then \
echo "WARNING: Skipping frontend tool verification due to failed dependencies installation"; \
echo "Frontend CI/CD jobs may be limited in this environment"; \
elif [ -d "node_modules" ] || [ -f ".pnp.cjs" ]; then \
if [ -f ".pnp.cjs" ]; then \
echo "✓ Yarn PnP dependencies found (.pnp.cjs), verifying tools..."; \
else \
echo "✓ Traditional node_modules found, verifying tools..."; \
fi && \
yarn eslint --version && \
yarn prettier --version && \
yarn tsc --version && \
yarn vitest --version && \
echo "✓ All frontend tools verified successfully"; \
else \
echo "ERROR: No frontend dependencies found (neither node_modules nor .pnp.cjs)"; \
echo "Available files in frontend:"; \
ls -la .; \
exit 1; \
fi
# Set Python path for backend
ENV PYTHONPATH=/workspace/backend/src:/workspace/backend
# Make global development tools available at the end of PATH as a fallback
ENV PATH="$PATH:/opt/python-dev-tools/bin"
# Set working directory back to root
WORKDIR /workspace
# Default to bash
SHELL ["/bin/bash", "-c"]
CMD ["/bin/bash"]