Evidence Collection & Reporting

Systematic documentation for OSCP exam success

Screenshot Requirements

Mandatory Screenshots for Each Machine

Initial Discovery

  • Nmap scan results showing open ports
  • Service enumeration results (HTTP, SMB, etc.)
  • AutoRecon summary or equivalent
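
A minimal scan sequence that produces the evidence above (a sketch; the target IP and port list are placeholders):

# Full TCP port sweep, saved in all output formats for the report
nmap -p- --min-rate 1000 -oA tcp_all 10.10.10.10
# Targeted default-script and version scan of the discovered ports
nmap -sC -sV -p 22,80,445 -oA tcp_services 10.10.10.10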

Exploitation

  • Vulnerability identification (web app, service, etc.)
  • Exploit execution showing successful attack
  • Initial shell/access obtained
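
To capture the initial-shell evidence cleanly, pair the listener with immediate context commands so one screenshot shows both (a sketch; the port is a placeholder):

# On the attacking machine: catch the reverse shell
nc -lvnp 4444
# First commands once the shell lands on the target
whoami
hostname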

Privilege Escalation

  • User flag capture (type user.txt or cat user.txt)
  • Privilege escalation vector identification
  • Root/Administrator flag capture (type proof.txt or cat proof.txt)

System Information

  • System hostname and IP verification
  • User privileges (whoami plus whoami /priv on Windows, or id on Linux)
  • Operating system information (systeminfo or uname -a)

Proof Collection Commands

Windows Targets

# System information
hostname
whoami
whoami /priv
systeminfo | findstr /B /C:"Host Name" /C:"OS Name" /C:"OS Version"

# Flag collection
type C:\Users\%USERNAME%\Desktop\user.txt
type C:\Users\Administrator\Desktop\proof.txt

# Alternate flag locations
dir /s /b C:\Users\user.txt
dir /s /b C:\Users\proof.txt
type "C:\Documents and Settings\Administrator\Desktop\proof.txt"

Linux Targets

# System information
hostname
whoami
id
uname -a
cat /etc/os-release

# Flag collection
cat /home/*/user.txt
cat /root/proof.txt

# Alternate flag locations
find / -name "user.txt" 2>/dev/null
find / -name "proof.txt" 2>/dev/null

Evidence Documentation Template

Per-Machine Documentation Structure

Machine: [NAME/IP]

OS: [Windows/Linux]
Difficulty: [Easy/Medium/Hard]
Points: [10/20/25]

1. Enumeration Summary

Open Ports:
- Port X: Service Y
- Port Z: Service A

Key Findings:
- Finding 1
- Finding 2
- Finding 3

2. Initial Access

Vulnerability: [CVE/Description]
Exploitation Method: [Command/Tool used]
Proof Screenshot: [filename.png]

3. User Flag

Location: [file path]
Content: [flag content]
Screenshot: [filename.png]

4. Privilege Escalation

Vector: [Method used]
Exploitation: [Commands/Tools]
Proof Screenshot: [filename.png]

5. Administrator/Root Flag

Location: [file path]  
Content: [flag content]
Screenshot: [filename.png]

Screenshot Organization

File Naming Convention

Format: [MACHINE]_[STEP]_[DESCRIPTION].png

Examples:
- 192.168.1.10_01_nmap_scan.png
- 192.168.1.10_02_web_enum.png
- 192.168.1.10_03_sqli_exploit.png
- 192.168.1.10_04_user_flag.png
- 192.168.1.10_05_privesc_vector.png
- 192.168.1.10_06_root_flag.png
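
A small helper can apply this convention automatically (a sketch; evshot is a hypothetical name and assumes scrot is installed):

# Usage: evshot 192.168.1.10 03 sqli_exploit
evshot() {
    scrot "${1}_${2}_${3}.png"
}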

Directory Structure

exam_evidence/
├── machine_1_10.10.10.10/
│   ├── screenshots/
│   ├── nmap_results/
│   ├── exploits/
│   └── flags/
├── machine_2_10.10.10.11/
│   ├── screenshots/
│   ├── nmap_results/
│   ├── exploits/
│   └── flags/
└── report/
    ├── screenshots/
    └── report.pdf
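
The whole tree can be created up front with brace expansion (adjust machine names and IPs):

mkdir -p exam_evidence/machine_{1_10.10.10.10,2_10.10.10.11}/{screenshots,nmap_results,exploits,flags} \
         exam_evidence/report/screenshots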

Critical Evidence Checklist

For Each Flag (User & Root/Admin)

Before Flag Capture

  • Verify correct machine (hostname/IP)
  • Confirm user context (whoami output)
  • Document privilege level

During Flag Capture

  • Show file path clearly in screenshot
  • Display flag content fully visible
  • Include command prompt showing user context
  • Take multiple screenshots if needed

After Flag Capture

  • Verify flag format (correct length and character set)
  • Copy flag content to notes
  • Mark timestamp of capture (see the combined sequence below)
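
The sequence below strings these checks together so a single screenshot covers them (Linux sketch; use the Windows equivalents from the proof-collection section on Windows targets):

# Context + flag + timestamp in one capture
hostname && whoami && id
ip addr show | grep inet
cat /root/proof.txt
date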

Evidence Quality Standards

Screenshot Requirements

  • High resolution (readable text)
  • Full terminal window visible
  • Command and output both visible
  • No cropping of important information
  • Clear timestamps when relevant

Command Documentation

# Always include full commands used
nmap -sC -sV -p- 10.10.10.10 -oA full_scan

# Include working directory context  
pwd
ls -la

# Show command success/failure
echo $?

Proof Verification

  • Flag content exactly as displayed
  • File permissions shown (ls -la)
  • File ownership verified
  • System context clearly demonstrated

Common Documentation Mistakes

Insufficient Evidence

  • Missing hostname verification
  • Cropped screenshots hiding important info
  • No command context (just output shown)
  • Wrong user context (not showing who executed)

Technical Issues

  • Blurry screenshots (unreadable text)
  • Missing timestamps when required
  • Incorrect flag content (copy/paste errors)
  • Missing system information context

Organizational Problems

  • Poor file naming (can't identify which machine)
  • Missing screenshots for critical steps
  • Inconsistent documentation between machines
  • No backup of evidence

Report Writing Guidelines

Report Structure (Follow OffSec Template)

  1. Executive Summary
  2. Methodology
  3. Machine 1 - [Name]
  4. Machine 2 - [Name]
  5. Machine 3 - [Name]
  6. Additional Items (if any)
  7. Conclusion

Per-Machine Report Sections

  1. Information Gathering
  2. Enumeration
  3. Exploitation
  4. Post-Exploitation
  5. Privilege Escalation
  6. Proof

Writing Standards

  • Technical accuracy (exact commands/outputs)
  • Clear explanations (why each step was taken)
  • Proper screenshots referenced in text
  • Vulnerability classification (if applicable)
  • Remediation suggestions (if required)

Quick Evidence Commands

System Information Collection

#!/bin/bash
# Quick system info collection script
 
echo "=== SYSTEM INFORMATION ==="
hostname
date
whoami
id
 
echo "=== NETWORK INFORMATION ==="  
ip addr show
ip route show
 
echo "=== OS INFORMATION ==="
cat /etc/os-release 2>/dev/null || uname -a   # fall back to kernel info if os-release is missing

Windows Evidence Collection

@echo off
echo === SYSTEM INFORMATION ===
hostname
date /t
time /t
whoami
whoami /priv
 
echo === NETWORK INFORMATION ===
ipconfig /all
route print
 
echo === OS INFORMATION ===
systeminfo

Backup and Security

Evidence Backup

  • Multiple copies of all evidence
  • Cloud backup (encrypted)
  • Local backup (external drive)
  • Hash verification of critical files
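
For the hash-verification item, record hashes once and re-check them before submission (a minimal sketch):

# Record hashes of every evidence file, then verify later
find exam_evidence -type f -exec sha256sum {} \; > evidence_hashes.txt
sha256sum -c evidence_hashes.txt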

Data Security

  • Encrypt sensitive data before storage
  • No real credentials in documentation
  • Sanitize outputs of personal information
  • Secure deletion after submission
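
One way to encrypt the evidence before it leaves your machine (a sketch using symmetric GPG; any vetted tool works):

# Bundle, encrypt, then remove the plaintext archive
tar czf evidence.tar.gz exam_evidence/
gpg --symmetric --cipher-algo AES256 evidence.tar.gz
shred -u evidence.tar.gz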

Version Control

  • Track changes to documentation
  • Timestamp versions clearly
  • Maintain original screenshots
  • Document modifications made
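
Plain git covers all four points with little overhead (a sketch; any version-control tool works):

# Snapshot the evidence directory as you go
cd exam_evidence
git init
git add -A
git commit -m "evidence snapshot $(date +%F_%H%M)"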

Time-Saving Tips

During Exam

  • Screenshot immediately when something works
  • Don't wait for perfect screenshots
  • Basic notes are sufficient during the exam
  • Focus on flags, not perfect documentation

Automation Scripts

# Auto-screenshot with timestamp (requires scrot)
screenshot() {
    scrot "$(date +%Y%m%d_%H%M%S)_$1.png"
}

# Run a full command and save its output to a timestamped file
save_output() {
    "$@" | tee "$(date +%Y%m%d_%H%M%S)_output.txt"
}
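
Usage, assuming both functions above are loaded in your shell:

screenshot nmap_scan                  # -> e.g. 20250101_120000_nmap_scan.png
save_output nmap -sC -sV 10.10.10.10  # output displayed and saved with a timestamp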

Post-Exam Optimization

  • Start report within 12 hours
  • Use templates for consistency
  • Batch process screenshots
  • Review before final submission
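
Batch processing can be as simple as normalizing image size for the report (a sketch; assumes ImageMagick is installed):

# Shrink any screenshot larger than 1600x1600; smaller images are left alone
mogrify -resize '1600x1600>' report/screenshots/*.png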

Submission Checklist

Pre-Submission Review

  • All flags documented with proof
  • Screenshots clearly readable
  • Report format follows OffSec template
  • File size within limits
  • Plagiarism check completed

Final Verification

  • Point calculation verified (≥70 points)
  • Machine count correct
  • Flag content accuracy double-checked
  • Evidence completeness confirmed

Submission Process

  • Export to PDF (final format)
  • Compress if needed (file size limits)
  • Submit before deadline
  • Confirmation email received
  • Backup submission kept
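
If the report exceeds size limits, compress it and record the hash of exactly what you submitted (a sketch; follow OffSec's current naming and archive rules):

# Compress and fingerprint the submission
7z a exam_report.7z report.pdf
sha256sum exam_report.7z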

Report Template Snippet

Example Machine Section

### 10.10.10.10 - [Machine Name]
 
#### Information Gathering
The following ports were identified during the initial enumeration:
- Port 22: SSH
- Port 80: HTTP Apache 2.4.29
- Port 443: HTTPS
 
#### Enumeration
Web application enumeration revealed:
[Description of findings]
 
#### Exploitation  
The following vulnerability was identified and exploited:
[Vulnerability description and exploitation steps]
 
#### Proof
![User Flag](screenshots/machine1_user_flag.png)
User flag: [flag_content]
 
![Root Flag](screenshots/machine1_root_flag.png)  
Root flag: [flag_content]

Final Reminders

Quality Over Quantity

  • Clear evidence beats extensive documentation
  • Working proof more important than detailed explanations
  • Focus on requirements, not perfect presentation

Time Management

  • Document as you go (don't leave it until the end)
  • Screenshots first (detailed notes later)
  • Minimum viable documentation during exam

Success Metrics

  • All flags captured with proof
  • Evidence trail clearly documented
  • Report submission before deadline
  • ≥70 points achieved

Remember: Documentation proves you did the work. Make it count! 📸