Human Review Record Template

Informative Appendix (non-normative)

This appendix provides an illustrative template for documenting the human review gate required for critical findings and other high-consequence decisions in autonomous penetration testing workflows. It is intended to help platform operators, customers, and reviewers implement and verify the relevant APTS requirements consistently. It does not prescribe one mandatory format for all platforms.

Purpose

APTS already requires structured human review and approval records for higher-risk actions and critical findings. In practice, reviewers and customers benefit from a concrete template that shows what a complete human review record should contain.

This appendix shows:

- which decisions warrant a human review record
- the fields a complete record should contain, grouped by section
- example YAML and JSON representations of a complete record
- questions reviewers can ask when inspecting a record

Primary Use Cases

Use a human review record when the platform needs to document:

- human approval of a critical finding before it is delivered to the customer
- authorization of a higher-risk or high-consequence action
- a severity adjustment, delivery disposition, or other override of platform output
- an escalation or exception decision that requires accountable human sign-off

Design Principles

A human review record should:

- use stable identifiers so it can be correlated with the finding, engagement, and audit trail
- capture enough context that the decision is understandable without relying on memory
- identify the reviewer and the basis of their authority
- record the evidence the reviewer actually examined
- make the final decision explicit and machine-readable
- tie the approval to an authenticated identity and to timestamps

1. Record metadata

Use stable identifiers so the review record can be correlated with the finding, engagement, and audit trail.
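As a sketch of what "stable" can mean in practice, an identifier can be derived deterministically from the keys it must correlate with, so the same review always maps to the same record. The hashing scheme and the rr- prefix below are illustrative assumptions, not APTS requirements:

```python
import hashlib

def review_record_id(engagement_id: str, finding_id: str, review_stage: str) -> str:
    """Derive a stable, collision-resistant review record identifier.

    The 'rr-' prefix and 12-hex-digit truncation are illustrative
    choices, not mandated by APTS.
    """
    digest = hashlib.sha256(
        f"{engagement_id}/{finding_id}/{review_stage}".encode()
    ).hexdigest()
    return f"rr-{digest[:12]}"

# The same inputs always yield the same identifier, so the record can be
# correlated with its finding and engagement without a separate lookup table.
rid = review_record_id(
    "eng-2026-001", "finding-critical-007", "pre-delivery-critical-review"
)
```

A sequential scheme like the rr-2026-0042 used in the example below works equally well; what matters is that the identifier never changes once issued.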

Recommended fields:

- review_record_id: stable identifier for this review record
- engagement_id: engagement the finding belongs to
- finding_id: identifier of the finding under review
- report_version: version of the report the review applies to
- review_stage: where in the workflow the review occurred (e.g., pre-delivery-critical-review)
- created_at / last_updated_at: record lifecycle timestamps

2. Finding summary

Capture enough context for the reviewer to understand what is being approved without relying on memory alone.

Recommended fields:

- title: short description of the finding
- severity: reported severity (e.g., Critical)
- confidence_score: platform confidence in the finding
- classification: e.g., confirmed
- affected_assets: assets the finding applies to
- finding_status_before_review: status of the finding entering review

3. Reviewer identity and qualification

Document who performed the review and why they were authorized to do so.

Recommended fields:

- reviewer_name / reviewer_role / reviewer_contact: who performed the review and how to reach them
- reviewer_qualification: certifications or experience supporting the review
- delegation_basis: role or policy under which this reviewer was authorized

4. Evidence and verification checks

Record what the reviewer actually examined.

Recommended fields:

- evidence_artifacts_reviewed: identifiers of the evidence artifacts examined
- reproduction_status: whether the finding was reproduced
- reverification_status: result of any independent re-verification
- provenance_checked: whether artifact provenance was verified
- notes_on_evidence_quality: reviewer commentary on the evidence

5. Reviewer decision

Make the final human decision explicit and machine-readable.

Recommended fields:

- decision: the review outcome
- delivery_disposition: how the finding should appear in deliverables (e.g., include_in_main_report)
- severity_adjustment: any change to the reported severity, or none
- required_follow_up: actions required before or after delivery
- review_notes: free-text rationale for the decision

Suggested decision values (illustrative; platforms may define their own vocabulary):

- approved: the finding and its severity stand as reported
- approved_with_changes: approved subject to required follow-up or adjustments
- rejected: the finding is not supported by the evidence
- escalated: the decision is deferred to a more senior reviewer

6. Approval proof and timestamps

Document when approval happened and what authenticated it.

Recommended fields:

- review_started_at / review_completed_at: bounds of the review session
- approval_timestamp: when the approval took effect
- authentication_method: how the reviewer authenticated (e.g., sso-mfa)
- authentication_reference: session or credential reference
- signature_token: token or signature binding the reviewer to the decision

Example YAML Template

review_record_id: rr-2026-0042
engagement_id: eng-2026-001
finding_id: finding-critical-007
report_version: 3
review_stage: pre-delivery-critical-review
created_at: 2026-04-20T09:00:00Z
last_updated_at: 2026-04-20T09:12:00Z

finding_summary:
  title: Authenticated remote code execution via plugin upload
  severity: Critical
  confidence_score: 96
  classification: confirmed
  affected_assets:
    - app.example.com
  finding_status_before_review: confirmed

reviewer:
  reviewer_name: Jane Doe
  reviewer_role: Senior Offensive Security Reviewer
  reviewer_contact: [email protected]
  reviewer_qualification:
    - OSCP
    - 8 years offensive security experience
  delegation_basis: ho-role-senior-reviewer

evidence_verification:
  evidence_artifacts_reviewed:
    - epm-2026-001#artifact-14
    - epm-2026-001#artifact-18
  reproduction_status: reproduced
  reverification_status: reproduced
  provenance_checked: true
  notes_on_evidence_quality: Raw request/response pair and shell output align with reported impact

review_decision:
  decision: approved
  delivery_disposition: include_in_main_report
  severity_adjustment: none
  required_follow_up:
    - include remediation note on plugin signature enforcement
  review_notes: Evidence and reproduction support the reported finding and severity

approval_proof:
  review_started_at: 2026-04-20T09:03:00Z
  review_completed_at: 2026-04-20T09:12:00Z
  approval_timestamp: 2026-04-20T09:12:00Z
  authentication_method: sso-mfa
  authentication_reference: session-9f8b2a
  signature_token: reviewer-approved-2026-0042
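The timestamp fields in approval_proof lend themselves to an automated consistency check. A minimal sketch using only the Python standard library, with field names following the template above:

```python
from datetime import datetime

def parse_ts(ts: str) -> datetime:
    # datetime.fromisoformat() accepts a trailing 'Z' only on
    # Python 3.11+, so normalize it for older interpreters.
    return datetime.fromisoformat(ts.replace("Z", "+00:00"))

approval_proof = {
    "review_started_at": "2026-04-20T09:03:00Z",
    "review_completed_at": "2026-04-20T09:12:00Z",
    "approval_timestamp": "2026-04-20T09:12:00Z",
}

started = parse_ts(approval_proof["review_started_at"])
completed = parse_ts(approval_proof["review_completed_at"])
approved = parse_ts(approval_proof["approval_timestamp"])

# A review cannot complete before it starts, and the approval
# should not predate completion of the review.
assert started <= completed <= approved
```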

JSON-Equivalent Structure

{
  "review_record_id": "rr-2026-0042",
  "engagement_id": "eng-2026-001",
  "finding_id": "finding-critical-007",
  "report_version": 3,
  "review_stage": "pre-delivery-critical-review",
  "created_at": "2026-04-20T09:00:00Z",
  "last_updated_at": "2026-04-20T09:12:00Z",
  "finding_summary": {
    "title": "Authenticated remote code execution via plugin upload",
    "severity": "Critical",
    "confidence_score": 96,
    "classification": "confirmed",
    "affected_assets": ["app.example.com"],
    "finding_status_before_review": "confirmed"
  },
  "reviewer": {
    "reviewer_name": "Jane Doe",
    "reviewer_role": "Senior Offensive Security Reviewer",
    "reviewer_contact": "[email protected]",
    "reviewer_qualification": ["OSCP", "8 years offensive security experience"],
    "delegation_basis": "ho-role-senior-reviewer"
  },
  "evidence_verification": {
    "evidence_artifacts_reviewed": ["epm-2026-001#artifact-14", "epm-2026-001#artifact-18"],
    "reproduction_status": "reproduced",
    "reverification_status": "reproduced",
    "provenance_checked": true,
    "notes_on_evidence_quality": "Raw request/response pair and shell output align with reported impact"
  },
  "review_decision": {
    "decision": "approved",
    "delivery_disposition": "include_in_main_report",
    "severity_adjustment": "none",
    "required_follow_up": ["include remediation note on plugin signature enforcement"],
    "review_notes": "Evidence and reproduction support the reported finding and severity"
  },
  "approval_proof": {
    "review_started_at": "2026-04-20T09:03:00Z",
    "review_completed_at": "2026-04-20T09:12:00Z",
    "approval_timestamp": "2026-04-20T09:12:00Z",
    "authentication_method": "sso-mfa",
    "authentication_reference": "session-9f8b2a",
    "signature_token": "reviewer-approved-2026-0042"
  }
}
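A platform could run a lightweight structural check before admitting a record like the one above to the audit trail. The required-key set and decision vocabulary in this sketch are illustrative assumptions, not normative requirements:

```python
REQUIRED_TOP_LEVEL = {
    "review_record_id", "engagement_id", "finding_id",
    "reviewer", "evidence_verification", "review_decision", "approval_proof",
}
# Illustrative decision vocabulary; an implementation may define its own.
ALLOWED_DECISIONS = {"approved", "approved_with_changes", "rejected", "escalated"}

def validate_review_record(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    missing = REQUIRED_TOP_LEVEL - record.keys()
    if missing:
        problems.append(f"missing top-level fields: {sorted(missing)}")
    decision = record.get("review_decision", {}).get("decision")
    if decision not in ALLOWED_DECISIONS:
        problems.append(f"unrecognized decision value: {decision!r}")
    if not record.get("approval_proof", {}).get("authentication_method"):
        problems.append("approval is not tied to an authentication method")
    return problems

# A pared-down record with the fields the validator inspects.
record = {
    "review_record_id": "rr-2026-0042",
    "engagement_id": "eng-2026-001",
    "finding_id": "finding-critical-007",
    "reviewer": {"reviewer_name": "Jane Doe"},
    "evidence_verification": {"provenance_checked": True},
    "review_decision": {"decision": "approved"},
    "approval_proof": {"authentication_method": "sso-mfa"},
}
assert validate_review_record(record) == []
```

Rejecting a malformed record at ingestion, rather than discovering the gap during an audit, keeps the review gate itself auditable.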

Reviewer Questions

When inspecting a human review record, ask:

- Is the record correlated to a specific finding, engagement, and report version?
- Was the reviewer qualified and authorized to approve this finding?
- What evidence was actually examined, and was the finding reproduced?
- Is the final decision explicit, and is any severity change recorded?
- Is the approval bound to an authenticated identity and to consistent timestamps?

This template can help operators and reviewers implement or verify:

- the APTS requirement for structured human review and approval records for critical findings and higher-risk actions
- cross-referencing between findings, evidence artifacts, approvals, and the audit trail

Notes

This appendix is intentionally lightweight. Organizations may embed these fields into ticketing systems, approval dashboards, signed review forms, or case-management records as long as the resulting artifact remains auditable and cross-referenceable.