Production: Python


git clone https://github.com/AEEF-AI/aeef-production.git

The Production tier for Python deploys a Dockerized FastAPI application with Celery for asynchronous task processing, comprehensive CI workflows, drift detection, incident response automation, and monitoring integration.


Application Architecture

production-python/
├── src/
│   ├── api/              # FastAPI route handlers
│   ├── core/             # Business logic and domain models
│   ├── tasks/            # Celery async task definitions
│   ├── middleware/       # Request validation, auth, structured logging
│   └── monitoring/       # Health checks, Prometheus metrics exporter
├── tests/
│   ├── unit/             # pytest unit tests
│   ├── integration/      # API contract tests with httpx
│   ├── e2e/              # End-to-end scenario tests
│   └── load/             # Locust load test scripts
├── docker/
│   ├── Dockerfile        # Multi-stage production build
│   ├── Dockerfile.dev    # Development with hot reload
│   └── Dockerfile.worker # Celery worker image
├── .github/
│   └── workflows/
│       ├── ci.yml        # Full 10-stage CI pipeline
│       ├── drift.yml     # Scheduled drift detection
│       └── incident.yml  # Incident response automation
├── docker-compose.yml
└── docker-compose.monitoring.yml

Dockerized Deployment

The multi-stage Dockerfile:

# Build stage
FROM python:3.12-slim AS builder
WORKDIR /app
RUN pip install uv
COPY pyproject.toml uv.lock ./
RUN uv sync --frozen --no-dev

# Production stage
FROM python:3.12-slim AS runner
WORKDIR /app
RUN groupadd --system aeef && useradd --system --gid aeef app
COPY --from=builder /app/.venv /app/.venv
COPY src/ ./src/
USER app
ENV PATH="/app/.venv/bin:$PATH"
EXPOSE 8000
CMD ["uvicorn", "src.api.main:app", "--host", "0.0.0.0", "--port", "8000", "--workers", "4"]

Docker Compose Stack

services:
  api:
    build:
      context: .
      dockerfile: docker/Dockerfile
    ports: ["8000:8000"]
    environment:
      - AEEF_OVERLAY=ksa
      - CELERY_BROKER_URL=redis://redis:6379/0
    depends_on: [redis, postgres]

  worker:
    build:
      context: .
      dockerfile: docker/Dockerfile.worker
    environment:
      - CELERY_BROKER_URL=redis://redis:6379/0
    depends_on: [redis]

  redis:
    image: redis:7-alpine

  postgres:
    image: postgres:16-alpine
    environment:
      POSTGRES_DB: aeef
      POSTGRES_PASSWORD_FILE: /run/secrets/db_password
    secrets:
      - db_password

secrets:
  db_password:
    file: ./secrets/db_password.txt   # adjust to your secret source

CI Pipeline

The Production Python pipeline extends the Transformation pipeline to 10 stages:

ruff-check --> ruff-format --> mypy --> pytest-unit --> mutmut -->
pytest-integration --> SAST --> SCA+license --> SBOM --> provenance
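The mutmut stage performs mutation testing: it injects small code changes and fails the pipeline if the unit suite does not catch them. The idea can be illustrated with a stdlib-only sketch (this shows the concept, not mutmut's actual mechanics):

```python
def clamp(value: int, low: int, high: int) -> int:
    """Original implementation under test."""
    return max(low, min(value, high))


def clamp_mutant(value: int, low: int, high: int) -> int:
    """Mutant: min swapped for max, the kind of change mutmut injects."""
    return max(low, max(value, high))


def suite_passes(fn) -> bool:
    """Return True if all assertions pass for the given implementation."""
    try:
        assert fn(5, 0, 10) == 5
        assert fn(-3, 0, 10) == 0
        assert fn(99, 0, 10) == 10  # this case "kills" the mutant
        return True
    except AssertionError:
        return False


assert suite_passes(clamp) is True          # original passes
assert suite_passes(clamp_mutant) is False  # mutant is detected
```

A mutant that survives (i.e. the suite still passes) indicates a gap in test coverage, which is why the stage sits between unit and integration tests.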

Integration Tests

- name: Integration Tests
  run: |
    docker compose up -d postgres redis
    uv run pytest tests/integration/ -v --tb=short
  env:
    DATABASE_URL: postgresql://postgres:test@localhost:5432/aeef_test

SBOM Generation

- name: Generate SBOM
  run: |
    uv run cyclonedx-py environment --output sbom.json --format json
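Downstream jobs can consume the generated SBOM programmatically, for example to enforce the license gate. A stdlib-only sketch (field names follow the CycloneDX JSON structure; the inline document is a stand-in for a real sbom.json):

```python
import json

# Stand-in for a generated sbom.json (CycloneDX JSON structure).
sbom = json.loads("""
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.5",
  "components": [
    {"name": "fastapi", "version": "0.110.0",
     "licenses": [{"license": {"id": "MIT"}}]},
    {"name": "celery", "version": "5.3.6",
     "licenses": [{"license": {"id": "BSD-3-Clause"}}]}
  ]
}
""")


def unlicensed(doc: dict) -> list[str]:
    """Return names of components missing license metadata."""
    return [c["name"] for c in doc.get("components", [])
            if not c.get("licenses")]


assert sbom["bomFormat"] == "CycloneDX"
assert unlicensed(sbom) == []  # gate passes when every component is licensed
```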

Load Tests (Optional Gate)

- name: Load Tests
  if: github.event_name == 'push' && github.ref == 'refs/heads/main'
  run: |
    docker compose up -d
    uv run locust -f tests/load/locustfile.py \
      --headless --users 100 --spawn-rate 10 --run-time 60s \
      --host http://localhost:8000 --check-fail-ratio 0.01

Monitoring Integration

The FastAPI application exports Prometheus metrics via the prometheus-fastapi-instrumentator library:

from prometheus_fastapi_instrumentator import Instrumentator

app = FastAPI(title="AEEF Production API")
Instrumentator().instrument(app).expose(app)

Custom AEEF metrics:

from prometheus_client import Counter, Histogram

ai_contribution_counter = Counter(
    "aeef_ai_contributions_total",
    "Total AI-assisted code contributions",
    ["tool", "agent_role"],
)

quality_gate_duration = Histogram(
    "aeef_quality_gate_seconds",
    "Time spent in quality gate checks",
    ["gate_name"],
)
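Handlers and gate runners then record against these metrics at runtime. A short usage sketch (the metric definitions are repeated so the snippet is self-contained; label values are illustrative):

```python
import time

from prometheus_client import REGISTRY, Counter, Histogram

ai_contribution_counter = Counter(
    "aeef_ai_contributions_total",
    "Total AI-assisted code contributions",
    ["tool", "agent_role"],
)
quality_gate_duration = Histogram(
    "aeef_quality_gate_seconds",
    "Time spent in quality gate checks",
    ["gate_name"],
)

# Count one AI-assisted contribution.
ai_contribution_counter.labels(tool="copilot", agent_role="implementer").inc()

# Time a gate run; .time() records the elapsed seconds on exit.
with quality_gate_duration.labels(gate_name="mypy").time():
    time.sleep(0.01)  # stand-in for the real check

assert REGISTRY.get_sample_value(
    "aeef_ai_contributions_total",
    {"tool": "copilot", "agent_role": "implementer"},
) == 1.0
```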

Drift Detection

The Python drift detection runs Ruff, mypy, and Semgrep config validation:

- name: Check Ruff Config Drift
  run: |
    uv run python scripts/drift_detect.py --category linting \
      --baseline .aeef/baselines/ruff.json \
      --current ruff.toml

- name: Check Type Strictness Drift
  run: |
    uv run python scripts/drift_detect.py --category typing \
      --baseline .aeef/baselines/mypy.json \
      --current mypy.ini
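The drift_detect.py script itself is not shown here; its core comparison could be as simple as the following stdlib sketch (the baseline format and setting names are illustrative):

```python
def detect_drift(baseline: dict, current: dict) -> list[str]:
    """Report settings that were removed or changed relative to the baseline."""
    findings = []
    for key, expected in baseline.items():
        if key not in current:
            findings.append(f"missing: {key}")
        elif current[key] != expected:
            findings.append(f"changed: {key} {expected!r} -> {current[key]!r}")
    return findings


# Illustrative baseline vs. a drifted ruff configuration.
baseline = {"line-length": 100, "select": ["E", "F", "I"]}
current = {"line-length": 120, "select": ["E", "F", "I"]}

assert detect_drift(baseline, current) == ["changed: line-length 100 -> 120"]
```

A nonempty findings list would exit nonzero, failing the scheduled workflow and surfacing the drift.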

Celery Task Governance

Async tasks processed by Celery workers follow AEEF governance:

  • Task contracts define what each task can access and produce
  • Task monitoring exports Celery metrics to Prometheus (task duration, failure rate, queue depth)
  • Dead letter handling routes failed tasks to an incident review queue

@app.task(bind=True, max_retries=3, acks_late=True)
def process_kpi_record(self, record: dict) -> None:
    """Process a KPI record with AEEF governance."""
    validate_schema(record, "kpi-record.schema.json")
    # ... processing logic
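validate_schema is referenced above but not shown; a stdlib sketch of the idea follows. The field set is hypothetical, and a production implementation would typically load the named JSON Schema and validate with a library such as jsonschema:

```python
# Hypothetical contract; the real one lives in kpi-record.schema.json.
REQUIRED_KPI_FIELDS = {"metric": str, "value": (int, float), "recorded_at": str}


def validate_schema(record: dict, schema_name: str) -> None:
    """Minimal structural check standing in for full JSON Schema validation."""
    for field, expected_type in REQUIRED_KPI_FIELDS.items():
        if field not in record:
            raise ValueError(f"{schema_name}: missing field {field!r}")
        if not isinstance(record[field], expected_type):
            raise ValueError(f"{schema_name}: bad type for {field!r}")


validate_schema(
    {"metric": "latency_p95", "value": 182.4,
     "recorded_at": "2025-01-01T00:00:00Z"},
    "kpi-record.schema.json",
)
```

Raising inside the task triggers Celery's retry policy (`max_retries=3` above), after which the record is routed to the dead-letter queue.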

Next Steps