analyzing-cloud-storage-access-patterns
SKILL.md
Instructions
- Install dependencies: pip install boto3 requests
- Query CloudTrail for S3 data events using the AWS CLI or boto3.
- Build access baselines: hourly request volume, per-user object counts, source IP history.
- Detect anomalies:
- After-hours access (outside 8am-6pm local time)
- Bulk downloads: >100 GetObject calls from single principal in 1 hour
- New source IPs not seen in the prior 30 days
- ListBucket enumeration spikes (reconnaissance indicator)
- Generate a prioritized findings report.
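The anomaly rules above can be sketched as small pure functions. This is a minimal sketch: the thresholds mirror the bullets, the function names are illustrative, and each event is assumed to be a CloudTrail record dict with the fields shown in the example below.

```python
from collections import Counter
from datetime import datetime

BULK_THRESHOLD = 100  # GetObject calls per principal per hour, per the rule above

def is_after_hours(event_time: datetime, start_hour=8, end_hour=18) -> bool:
    """Flag events outside the 8am-6pm window (local-time conversion assumed done upstream)."""
    return not (start_hour <= event_time.hour < end_hour)

def bulk_downloaders(events):
    """Return principals exceeding BULK_THRESHOLD GetObject calls in the given window."""
    counts = Counter(
        e["userIdentity"]["arn"] for e in events if e["eventName"] == "GetObject"
    )
    return {arn: n for arn, n in counts.items() if n > BULK_THRESHOLD}

def new_source_ips(events, known_ips):
    """Source IPs not present in the prior 30-day baseline set."""
    return {e["sourceIPAddress"] for e in events} - set(known_ips)
```

ListBucket enumeration spikes would follow the same Counter pattern keyed on `eventName == "ListObjects"` per principal.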
python scripts/agent.py --bucket my-sensitive-data --hours-back 24 --output s3_access_report.json
Examples
CloudTrail S3 Data Event
{"eventName": "GetObject", "requestParameters": {"bucketName": "sensitive-data", "key": "financials/q4.xlsx"},
"sourceIPAddress": "203.0.113.50", "userIdentity": {"arn": "arn:aws:iam::123456789012:user/analyst"}}