So I decided to write a tool, since I couldn't find one online. It's pretty simple: it takes an access key and secret key (the kind often found in configuration files and connection scripts), then goes through each supported service one by one and queries useful information such as DynamoDB table names, S3 bucket names, and the number of objects in each bucket.
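To give a rough idea of the approach, here is a minimal sketch of how the S3 portion could be done with boto3. This is illustrative only, not the tool's actual source; the function name, the assumed default region, and the command-line handling are all my own assumptions.

import sys
import boto3

def enumerate_s3(access_key, secret_key):
    # Build a session straight from the supplied key pair.
    # us-east-1 is an assumed default region; the real tool may handle regions differently.
    session = boto3.session.Session(
        aws_access_key_id=access_key,
        aws_secret_access_key=secret_key,
        region_name="us-east-1",
    )
    s3 = session.client("s3")

    buckets = s3.list_buckets()["Buckets"]
    print("Total # of buckets: %d" % len(buckets))
    for bucket in buckets:
        # list_objects_v2 returns at most 1000 keys per call, so a single
        # call only gives a count of "up to 1000" per bucket
        resp = s3.list_objects_v2(Bucket=bucket["Name"])
        print("Bucket: %s [%d objects]" % (bucket["Name"], resp.get("KeyCount", 0)))

if __name__ == "__main__":
    enumerate_s3(sys.argv[1], sys.argv[2])

The same session object can then be reused to create clients for the other services.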
So, for example, let's say during your recon/OSINT phase you discover some AWS API creds exposed in a gitleaks result or in some config file. (The screenshot obviously isn't from a client, just a random entry found with gitleaks.)
You take the "Access Key" and "Secret Key" and pump them into the AWSEnumerator script:
$ ./AWSEnumerator.py AKXXXXXXXXXXXXXXXXXA ENtXXXXXXXXXXXXXXXXXq
Checking for S3 buckets
Total # of buckets: 9
Bucket: bucket1 [8 objects]
Bucket: bucket2 [1000 objects]
Bucket: bucket3-dev [1000 objects]
Bucket: bucket4 [1 objects]
Bucket: bucket5 [747 objects]
Bucket: otherbucket [1000 objects]
Bucket: morebucket [54 objects]
Bucket: whereismahbucket [26 objects]
Bucket: ilikefish [95 objects]
Checking for EC2 Instances
Total # of EC2 Instances: 2
Instance: t2.nano - 52.570.576.548 - running
Instance: t2.small - stopped
Checking for Lightsail Instances
Total # of Lightsail instances: 1
Name: enumtest1, Username: ubuntu, IP: 254.244.198.296, State: running
Checking for DynamoDB Tables
Total # of DynamoDB Tables: 2
Table Name: tabletest
Table Name: test2
This information is fantastic both for reporting purposes and for potentially escalating access or obtaining sensitive information.

Currently, the script supports the following (a rough sketch of these checks appears after the list):
- EC2 Instances (type, IP, status)
- S3 Buckets (name, number of objects)
- Lightsail Instances (name, username, IP, state)
- DynamoDB (table name)
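For these other services, the checks presumably follow the same "create a client, call the obvious list/describe API, print the interesting fields" pattern. A hedged boto3 sketch of what the EC2, Lightsail, and DynamoDB queries could look like (again illustrative, not the tool's actual code; pagination is ignored for brevity):

def enumerate_other_services(session):
    # EC2: instance type, public IP (if any) and state
    ec2 = session.client("ec2")
    for reservation in ec2.describe_instances()["Reservations"]:
        for instance in reservation["Instances"]:
            print("Instance: %s - %s - %s" % (
                instance["InstanceType"],
                instance.get("PublicIpAddress", "no public IP"),
                instance["State"]["Name"],
            ))

    # Lightsail: name, default username, public IP and state
    lightsail = session.client("lightsail")
    for inst in lightsail.get_instances()["instances"]:
        print("Name: %s, Username: %s, IP: %s, State: %s" % (
            inst["name"],
            inst["username"],
            inst.get("publicIpAddress", "none"),
            inst["state"]["name"],
        ))

    # DynamoDB: table names only
    dynamodb = session.client("dynamodb")
    for table_name in dynamodb.list_tables()["TableNames"]:
        print("Table Name: %s" % table_name)

Here "session" is the boto3 session built from the supplied keys, as in the earlier sketch.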
I'm planning to add support for additional services as time goes on.
The tool can be found at https://github.com/atucom/AWSEnumerator