AI-powered Docker security scanner that explains vulnerabilities in plain English
Quick Start • Features • Installation • Usage • Contributing
🏆 Officially recognized as an OWASP Incubator Project
Trusted by the global security community • 14,000+ downloads
DockSec is an OWASP Incubator Project that combines traditional Docker security scanners (Trivy, Hadolint, Docker Scout) with AI to provide context-aware security analysis. Instead of dumping 200 CVEs and leaving you to figure them out, DockSec prioritizes what matters and explains each finding, and its fix, in plain English.

Think of it as having a security expert review your Dockerfiles.
DockSec is recognized by OWASP as an Incubator Project. Join thousands of developers using DockSec to secure their containers.
DockSec follows a simple workflow, from scanning to actionable insights: run trusted scanners, analyze the findings with AI, and report prioritized fixes.
```bash
# Install
pip install docksec

# Scan your Dockerfile
docksec Dockerfile

# Scan with image analysis
docksec Dockerfile -i myapp:latest

# Scan without AI (no API key needed)
docksec Dockerfile --scan-only
```
Requirements: Python 3.12+; Docker (for image scanning).
```bash
pip install docksec
```
For AI features, choose your preferred LLM provider:
```bash
# OpenAI (default)
export OPENAI_API_KEY="your-key-here"

# Anthropic
export ANTHROPIC_API_KEY="your-key-here"
export LLM_PROVIDER="anthropic"
export LLM_MODEL="claude-3-5-sonnet-20241022"

# Google
export GOOGLE_API_KEY="your-key-here"
export LLM_PROVIDER="google"
export LLM_MODEL="gemini-1.5-pro"

# Ollama (local, no API key needed)
# First, install and run Ollama: https://ollama.ai
# Then pull a model: ollama pull llama3.1
export LLM_PROVIDER="ollama"
export LLM_MODEL="llama3.1"
# Optional: customize Ollama URL
export OLLAMA_BASE_URL="http://localhost:11434"
```
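The way these variables interact can be pictured with a small sketch. This is illustrative only: the function name and fallback logic are our assumptions, not DockSec's actual internals; the provider names and default models come from the examples above.

```python
import os

def resolve_llm_config() -> dict:
    """Pick an LLM provider and model from environment variables.

    Hypothetical sketch: defaults mirror the README's examples,
    not DockSec's real implementation.
    """
    provider = os.getenv("LLM_PROVIDER", "openai").lower()
    defaults = {
        "openai": "gpt-4o",
        "anthropic": "claude-3-5-sonnet-20241022",
        "google": "gemini-1.5-pro",
        "ollama": "llama3.1",
    }
    if provider not in defaults:
        raise ValueError(f"Unsupported LLM provider: {provider}")
    return {
        "provider": provider,
        # LLM_MODEL overrides the per-provider default.
        "model": os.getenv("LLM_MODEL", defaults[provider]),
        # Only Ollama has a configurable base URL.
        "base_url": os.getenv("OLLAMA_BASE_URL", "http://localhost:11434")
        if provider == "ollama"
        else None,
    }
```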
External tools (optional, for full scanning):
```bash
# Install Trivy and Hadolint
python -m docksec.setup_external_tools

# Or install manually:
# - Trivy: https://aquasecurity.github.io/trivy/
# - Hadolint: https://github.com/hadolint/hadolint
```
```bash
# Analyze Dockerfile with AI recommendations
docksec Dockerfile

# Scan Dockerfile + Docker image
docksec Dockerfile -i nginx:latest

# Get only scan results (no AI)
docksec Dockerfile --scan-only

# Scan image without Dockerfile
docksec --image-only -i nginx:latest

# Use specific LLM provider and model
docksec Dockerfile --provider anthropic --model claude-3-5-sonnet-20241022

# Use local Ollama model
docksec Dockerfile --provider ollama --model llama3.1
```
| Option | Description |
|---|---|
| `dockerfile` | Path to Dockerfile |
| `-i, --image` | Docker image to scan |
| `-o, --output` | Output file path |
| `--provider` | LLM provider (openai, anthropic, google, ollama) |
| `--model` | Model name (e.g., gpt-4o, claude-3-5-sonnet-20241022) |
| `--ai-only` | AI analysis only (no scanning) |
| `--scan-only` | Scanning only (no AI) |
| `--image-only` | Scan image without Dockerfile |
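As a rough mental model, the flag set above could be wired up with argparse as follows. This is a hypothetical mirror of the table for illustration, not DockSec's actual parser.

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    """Illustrative parser mirroring the CLI options table."""
    p = argparse.ArgumentParser(prog="docksec")
    p.add_argument("dockerfile", nargs="?", help="Path to Dockerfile")
    p.add_argument("-i", "--image", help="Docker image to scan")
    p.add_argument("-o", "--output", help="Output file path")
    p.add_argument("--provider",
                   choices=["openai", "anthropic", "google", "ollama"],
                   help="LLM provider")
    p.add_argument("--model", help="Model name, e.g. gpt-4o")
    # AI-only and scan-only are contradictory, so forbid combining them.
    mode = p.add_mutually_exclusive_group()
    mode.add_argument("--ai-only", action="store_true",
                      help="AI analysis only (no scanning)")
    mode.add_argument("--scan-only", action="store_true",
                      help="Scanning only (no AI)")
    p.add_argument("--image-only", action="store_true",
                   help="Scan image without Dockerfile")
    return p

# Example: parse one of the usage lines shown earlier.
args = build_parser().parse_args(
    ["Dockerfile", "--provider", "ollama", "--scan-only"]
)
```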
Create a .env file for advanced configuration:
```bash
# LLM Provider Configuration
LLM_PROVIDER=openai    # Options: openai, anthropic, google, ollama
LLM_MODEL=gpt-4o       # Model to use
LLM_TEMPERATURE=0.0    # Temperature (0-1)

# API Keys
OPENAI_API_KEY=your-openai-key
ANTHROPIC_API_KEY=your-anthropic-key
GOOGLE_API_KEY=your-google-key

# Ollama Configuration (for local models)
OLLAMA_BASE_URL=http://localhost:11434

# Scanning Configuration
TRIVY_SCAN_TIMEOUT=600
DOCKSEC_DEFAULT_SEVERITY=CRITICAL,HIGH
```
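One way a severity setting like `DOCKSEC_DEFAULT_SEVERITY` could be applied is sketched below. The variable name and default come from the example above; the filtering function itself is our assumption, not DockSec code.

```python
import os

def filter_findings(findings: list[dict]) -> list[dict]:
    """Keep only findings at or above the configured severities.

    Illustrative sketch: reads the comma-separated severity list
    from DOCKSEC_DEFAULT_SEVERITY (default: CRITICAL,HIGH).
    """
    allowed = {
        s.strip().upper()
        for s in os.getenv("DOCKSEC_DEFAULT_SEVERITY", "CRITICAL,HIGH").split(",")
    }
    return [f for f in findings if f.get("severity", "").upper() in allowed]
```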
See full configuration options.
```
🔍 Scanning Dockerfile...

⚠️  Security Score: 45/100

Critical Issues (3):
  • Running as root user (line 12)
  • Hardcoded API key detected (line 23)
  • Using vulnerable base image

💡 AI Recommendations:
  1. Add non-root user: RUN useradd -m appuser && USER appuser
  2. Move secrets to environment variables or build secrets
  3. Update FROM ubuntu:20.04 to ubuntu:22.04 (fixes 12 CVEs)

📊 Full report: results/nginx_latest_report.html
```
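A score like the one above could, for instance, be derived from weighted finding counts. The penalties below are made up purely for illustration; they are not DockSec's actual scoring formula.

```python
# Hypothetical per-severity penalties (not DockSec's real weights).
PENALTIES = {"CRITICAL": 15, "HIGH": 8, "MEDIUM": 3, "LOW": 1}

def security_score(severities: list[str]) -> int:
    """Start at 100 and subtract a penalty per finding, flooring at 0."""
    score = 100 - sum(PENALTIES.get(sev, 0) for sev in severities)
    return max(score, 0)
```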
```
Dockerfile → [Trivy + Hadolint + Scout] → AI Analysis → Reports
```
DockSec runs security scanners locally, then uses AI to prioritize the findings and explain them, with fixes, in plain English.

Supported AI providers: OpenAI, Anthropic, Google (Gemini), and Ollama (local models).
All scanning happens on your machine. Only scan results (not your code) are sent to the AI provider when using AI features.
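That privacy boundary can be pictured as follows: only structured findings, never file contents, go into the AI prompt. This is a sketch of the idea with made-up field names, not DockSec's actual payload format.

```python
import json

def build_ai_payload(findings: list[dict]) -> str:
    """Serialize scan findings for the LLM, deliberately dropping
    any source snippets. Field names are illustrative only."""
    sanitized = [
        {"rule": f["rule"], "severity": f["severity"], "line": f.get("line")}
        for f in findings
    ]
    return json.dumps({"findings": sanitized})
```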
See open issues or suggest features in discussions.
Contributions are welcome! Please see CONTRIBUTING.md for guidelines.
“No OpenAI API Key provided”
→ Set the API key for your provider (`OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, or `GOOGLE_API_KEY`), or use `--scan-only` mode.

“Unsupported LLM provider”
→ Valid providers: `openai`, `anthropic`, `google`, `ollama`. Set one with the `--provider` flag or the `LLM_PROVIDER` env var.

“Hadolint not found”
→ Run `python -m docksec.setup_external_tools`.

“Python version not supported”
→ DockSec requires Python 3.12+. Use `pyenv install 3.12` to upgrade.

“Connection refused” with Ollama
→ Make sure Ollama is running (`ollama serve`) and the model is pulled (`ollama pull llama3.1`).

“Where are my scan results?”
→ Results are saved to the `results/` directory in your DockSec installation.
→ Customize the location: `export DOCKSEC_RESULTS_DIR=/custom/path`
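The Ollama "connection refused" check above can also be scripted. The helper name is ours; `/api/tags` is Ollama's model-listing endpoint, so a 200 response means the server is up.

```python
import urllib.error
import urllib.request

def ollama_reachable(base_url: str = "http://localhost:11434") -> bool:
    """Return True if an Ollama server answers at base_url.

    Illustrative helper: queries Ollama's /api/tags endpoint and
    treats any connection failure as 'not reachable'.
    """
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=3) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```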
For more issues, see Troubleshooting Guide.
MIT License - see LICENSE for details.
DockSec is proud to be an OWASP Incubator Project, recognized by the Open Web Application Security Project for its contribution to application security.