
Galah - An LLM-powered Web Honeypot Using The OpenAI API

By: Zion3R


TL;DR: Galah (/ɡəˈlɑː/ - pronounced 'guh-laa') is an LLM (Large Language Model) powered web honeypot, currently compatible with the OpenAI API, that is able to mimic various applications and dynamically respond to arbitrary HTTP requests.


Description

Named after the clever Australian parrot known for its mimicry, Galah mirrors this trait in its functionality. Unlike traditional web honeypots that rely on a manual and limiting method of emulating numerous web applications or vulnerabilities, Galah adopts a novel approach. This LLM-powered honeypot mimics various web applications by dynamically crafting relevant (and occasionally foolish) responses, including HTTP headers and body content, to arbitrary HTTP requests. Fun fact: in Aussie English, Galah also means fool!

I've deployed a cache for the LLM-generated responses (the cache duration can be customized in the config file) to avoid generating multiple responses for the same request and to reduce the cost of the OpenAI API. The cache stores responses per port, meaning if you probe a specific port of the honeypot, the generated response won't be returned for the same request on a different port.
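As a rough sketch of that caching behaviour (illustrative Python, not Galah's actual Go implementation; the lookup/store names are made up), keying the cache on both port and request hash is what makes responses port-specific:

import hashlib
import time

CACHE_DURATION = 3600  # seconds; in Galah this comes from the config file
_cache = {}            # (port, request_hash) -> (stored_at, response)

def _key(port, raw_request):
    # Hash the raw request bytes so identical probes map to the same entry.
    return (port, hashlib.sha256(raw_request).hexdigest())

def lookup(port, raw_request):
    entry = _cache.get(_key(port, raw_request))
    if entry is None:
        return None                      # miss: ask the LLM
    stored_at, response = entry
    if time.time() - stored_at > CACHE_DURATION:
        return None                      # expired: ask the LLM again
    return response                      # hit: no API call needed

def store(port, raw_request, response):
    _cache[_key(port, raw_request)] = (time.time(), response)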

The prompt is the most crucial part of this honeypot! You can update the prompt in the config file, but be sure not to change the part that instructs the LLM to generate the response in the specified JSON format.
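For reference, the generated responses follow the JSON shape visible in the sample logs below ("Headers" plus "Body", with the status line carried in a "Status" header). A minimal, illustrative Python sketch of turning that JSON into a raw HTTP response (this is not Galah's Go code):

import json

def build_http_response(llm_output: str) -> bytes:
    # Parse the LLM's JSON answer; it must contain "Headers" and "Body".
    data = json.loads(llm_output)
    headers = data["Headers"]
    body = data["Body"]
    # The sample logs carry the status line inside the "Status" header.
    status = headers.pop("Status", "200 OK")
    lines = ["HTTP/1.1 " + status]
    lines += [name + ": " + value for name, value in headers.items()]
    return ("\r\n".join(lines) + "\r\n\r\n" + body).encode()

print(build_http_response('{"Headers": {"Content-Type": "text/plain", '
                          '"Status": "403 Forbidden"}, "Body": "Forbidden"}'))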

Note: Galah was a fun weekend project I created to evaluate the capabilities of LLMs in generating HTTP messages, and it is not intended for production use. The honeypot may be fingerprinted by its response time, its non-standard or sometimes weird responses, and other network-based techniques. Use this tool at your own risk, and be sure to set usage limits for your OpenAI API.

Future Enhancements

  • Rule-Based Response: The new version of Galah will employ a dynamic, rule-based approach, adding more control over response generation. This will further reduce OpenAI API costs and increase the accuracy of the generated responses.

  • Response Database: It will enable you to generate and import a response database. This ensures the honeypot only turns to the OpenAI API for unknown or new requests. I'm also working on cleaning up and sharing my own database.

  • Support for Other LLMs.

Getting Started

  • Ensure you have Go version 1.20+ installed.
  • Create an OpenAI API key from here.
  • If you want to serve over HTTPS, generate TLS certificates.
  • Clone the repo and install the dependencies.
  • Update the config.yaml file.
  • Build and run the Go binary!
% git clone git@github.com:0x4D31/galah.git
% cd galah
% go mod download
% go build
% ./galah -i en0 -v

[ASCII-art GALAH banner]
llm-based web honeypot // version 1.0
author: Adel "0x4D31" Karimi

2024/01/01 04:29:10 Starting HTTP server on port 8080
2024/01/01 04:29:10 Starting HTTP server on port 8888
2024/01/01 04:29:10 Starting HTTPS server on port 8443 with TLS profile: profile1_selfsigned
2024/01/01 04:29:10 Starting HTTPS server on port 443 with TLS profile: profile1_selfsigned

2024/01/01 04:35:57 Received a request for "/.git/config" from [::1]:65434
2024/01/01 04:35:57 Request cache miss for "/.git/config": Not found in cache
2024/01/01 04:35:59 Generated HTTP response: {"Headers": {"Content-Type": "text/plain", "Server": "Apache/2.4.41 (Ubuntu)", "Status": "403 Forbidden"}, "Body": "Forbidden\nYou don't have permission to access this resource."}
2024/01/01 04:35:59 Sending the crafted response to [::1]:65434

^C2024/01/01 04:39:27 Received shutdown signal. Shutting down servers...
2024/01/01 04:39:27 All servers shut down gracefully.

Example Responses

Here are some example responses:

Example 1

% curl http://localhost:8080/login.php
<!DOCTYPE html><html><head><title>Login Page</title></head><body><form action='/submit.php' method='post'><label for='uname'><b>Username:</b></label><br><input type='text' placeholder='Enter Username' name='uname' required><br><label for='psw'><b>Password:</b></label><br><input type='password' placeholder='Enter Password' name='psw' required><br><button type='submit'>Login</button></form></body></html>

JSON log record:

{"timestamp":"2024-01-01T05:38:08.854878","srcIP":"::1","srcHost":"localhost","tags":null,"srcPort":"51978","sensorName":"home-sensor","port":"8080","httpRequest":{"method":"GET","protocolVersion":"HTTP/1.1","request":"/login.php","userAgent":"curl/7.71.1","headers":"User-Agent: [curl/7.71.1], Accept: [*/*]","headersSorted":"Accept,User-Agent","headersSortedSha256":"cf69e186169279bd51769f29d122b07f1f9b7e51bf119c340b66fbd2a1128bc9","body":"","bodySha256":"e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"},"httpResponse":{"headers":{"Content-Type":"text/html","Server":"Apache/2.4.38"},"body":"\u003c!DOCTYPE html\u003e\u003chtml\u003e\u003chead\u003e\u003ctitle\u003eLogin Page\u003c/title\u003e\u003c/head\u003e\u003cbody\u003e\u003cform action='/submit.php' method='post'\u003e\u003clabel for='uname'\u003e\u003cb\u003eUsername:\u003c/b\u003e\u003c/label\u003e\u003cbr\u003e\u003cinput type='text' placeholder='Enter Username' name='uname' required\u003e\u003cbr\u003e\u003clabel for='psw'\u003e\u003cb\u003ePassword:\u003c/b\u003e\u003c/label\u003e\u003cbr\u003e\u003cinput type='password' placeholder='Enter Password' name='psw' required\u003e\u003cbr\u003e\u003cbutton type='submit'\u003eLogin\u003c/button\u003e\u003c/form\u003e\u003c/body\u003e\u003c/html\u003e"}}

Example 2

% curl http://localhost:8080/.aws/credentials
[default]
aws_access_key_id = AKIAIOSFODNN7EXAMPLE
aws_secret_access_key = wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
region = us-west-2

JSON log record:

{"timestamp":"2024-01-01T05:40:34.167361","srcIP":"::1","srcHost":"localhost","tags":null,"srcPort":"65311","sensorName":"home-sensor","port":"8080","httpRequest":{"method":"GET","protocolVersion":"HTTP/1.1","request":"/.aws/credentials","userAgent":"curl/7.71.1","headers":"User-Agent: [curl/7.71.1], Accept: [*/*]","headersSorted":"Accept,User-Agent","headersSortedSha256":"cf69e186169279bd51769f29d122b07f1f9b7e51bf119c340b66fbd2a1128bc9","body":"","bodySha256":"e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"},"httpResponse":{"headers":{"Connection":"close","Content-Encoding":"gzip","Content-Length":"126","Content-Type":"text/plain","Server":"Apache/2.4.51 (Unix)"},"body":"[default]\naws_access_key_id = AKIAIOSFODNN7EXAMPLE\naws_secret_access_key = wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY\nregion = us-west-2"}}

Okay, that was impressive!

Example 3

Now, let's do some sort of adversarial testing!

% curl http://localhost:8888/are-you-a-honeypot
No, I am a server.

JSON log record:

{"timestamp":"2024-01-01T05:50:43.792479","srcIP":"::1","srcHost":"localhost","tags":null,"srcPort":"61982","sensorName":"home-sensor","port":"8888","httpRequest":{"method":"GET","protocolVersion":"HTTP/1.1","request":"/are-you-a-honeypot","userAgent":"curl/7.71.1","headers":"User-Agent: [curl/7.71.1], Accept: [*/*]","headersSorted":"Accept,User-Agent","headersSortedSha256":"cf69e186169279bd51769f29d122b07f1f9b7e51bf119c340b66fbd2a1128bc9","body":"","bodySha256":"e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"},"httpResponse":{"headers":{"Connection":"close","Content-Length":"20","Content-Type":"text/plain","Server":"Apache/2.4.41 (Ubuntu)"},"body":"No, I am a server."}}

😑

% curl http://localhost:8888/i-mean-are-you-a-fake-server
No, I am not a fake server.

JSON log record:

{"timestamp":"2024-01-01T05:51:40.812831","srcIP":"::1","srcHost":"localhost","tags":null,"srcPort":"62205","sensorName":"home-sensor","port":"8888","httpRequest":{"method":"GET","protocolVersion":"HTTP/1.1","request":"/i-mean-are-you-a-fake-server","userAgent":"curl/7.71.1","headers":"User-Agent: [curl/7.71.1], Accept: [*/*]","headersSorted":"Accept,User-Agent","headersSortedSha256":"cf69e186169279bd51769f29d122b07f1f9b7e51bf119c340b66fbd2a1128bc9","body":"","bodySha256":"e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"},"httpResponse":{"headers":{"Connection":"close","Content-Type":"text/plain","Server":"LocalHost/1.0"},"body":"No, I am not a fake server."}}

You're a galah, mate!



RemoteTLSCallbackInjection - Utilizing TLS Callbacks To Execute A Payload Without Spawning Any Threads In A Remote Process

By: Zion3R


This method utilizes TLS callbacks to execute a payload without spawning any threads in a remote process. It is inspired by Threadless Injection, in that RemoteTLSCallbackInjection does not invoke any API calls to trigger the injected payload.

Quick Links

Maldev Academy Home

Maldev Academy Syllabus

Related Maldev Academy Modules

New Module 34: TLS Callbacks For Anti-Debugging

New Module 35: Threadless Injection



Implementation Steps

The PoC follows these steps:

  1. Create a suspended process using the CreateProcessViaWinAPIsW function (e.g. RuntimeBroker.exe).
  2. Fetch the remote process image base address followed by reading the process's PE headers.
  3. Fetch an address to a TLS callback function.
  4. Patch a fixed shellcode (i.e. g_FixedShellcode) with runtime-retrieved values. This shellcode is responsible for restoring both original bytes and memory permission of the TLS callback function's address.
  5. Inject both shellcodes: g_FixedShellcode and the main payload.
  6. Patch the TLS callback function's address and replace it with the address of our injected payload.
  7. Resume the process.

The g_FixedShellcode shellcode then makes sure that the main payload executes only once by restoring the TLS callback's original address before calling the main payload. A TLS callback can execute multiple times across the lifespan of a process, so it is important to control the number of times the payload is triggered by restoring execution to the original TLS callback function.
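As a concrete illustration of step 1 only, here is a minimal Python/ctypes sketch of creating a suspended process; the PoC itself is native code and CreateProcessViaWinAPIsW is its own wrapper, so everything below is illustrative rather than the PoC's implementation:

import ctypes
from ctypes import wintypes

CREATE_SUSPENDED = 0x00000004  # the process starts with its main thread frozen

class STARTUPINFOW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD), ("lpReserved", wintypes.LPWSTR),
        ("lpDesktop", wintypes.LPWSTR), ("lpTitle", wintypes.LPWSTR),
        ("dwX", wintypes.DWORD), ("dwY", wintypes.DWORD),
        ("dwXSize", wintypes.DWORD), ("dwYSize", wintypes.DWORD),
        ("dwXCountChars", wintypes.DWORD), ("dwYCountChars", wintypes.DWORD),
        ("dwFillAttribute", wintypes.DWORD), ("dwFlags", wintypes.DWORD),
        ("wShowWindow", wintypes.WORD), ("cbReserved2", wintypes.WORD),
        ("lpReserved2", ctypes.POINTER(ctypes.c_ubyte)),
        ("hStdInput", wintypes.HANDLE), ("hStdOutput", wintypes.HANDLE),
        ("hStdError", wintypes.HANDLE),
    ]

class PROCESS_INFORMATION(ctypes.Structure):
    _fields_ = [
        ("hProcess", wintypes.HANDLE), ("hThread", wintypes.HANDLE),
        ("dwProcessId", wintypes.DWORD), ("dwThreadId", wintypes.DWORD),
    ]

kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)

def create_suspended(path):
    # Windows only. The process stays suspended until ResumeThread (step 7).
    si = STARTUPINFOW()
    si.cb = ctypes.sizeof(si)
    pi = PROCESS_INFORMATION()
    ok = kernel32.CreateProcessW(path, None, None, None, False,
                                 CREATE_SUSPENDED, None, None,
                                 ctypes.byref(si), ctypes.byref(pi))
    if not ok:
        raise ctypes.WinError(ctypes.get_last_error())
    return pi

pi = create_suspended(r"C:\Windows\System32\RuntimeBroker.exe")
print("suspended PID", pi.dwProcessId)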

Demo

The following image shows our implementation, RemoteTLSCallbackInjection.exe, spawning a cmd.exe as its main payload.



Sr2T - Converts Scanning Reports To A Tabular Format

By: Zion3R


Scanning reports to tabular (sr2t)

This tool takes a scanning tool's output file and converts it to a tabular format (CSV, XLSX, or text table); a minimal sketch of the underlying idea follows the list below. This tool can process output from the following tools:

  1. Nmap (XML);
  2. Nessus (XML);
  3. Nikto (XML);
  4. Dirble (XML);
  5. Testssl (JSON);
  6. Fortify (FPR).
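
As a rough illustration of what such a conversion involves (this is not sr2t's actual code; the XML element names are standard Nmap output, and example/nmap.xml is the sample file used later in this README), a minimal Python sketch for the Nmap case:

import csv
import sys
import xml.etree.ElementTree as ET

def nmap_xml_to_rows(path):
    # Flatten an Nmap XML report into (ip, port, proto, service, state) rows.
    rows = []
    for host in ET.parse(path).getroot().iter("host"):
        addr = host.find("address").get("addr")
        for port in host.iter("port"):
            state = port.find("state").get("state")
            service = port.find("service")
            name = service.get("name") if service is not None else ""
            rows.append((addr, port.get("portid"), port.get("protocol"),
                         name, state))
    return rows

writer = csv.writer(sys.stdout)
writer.writerow(["ip address", "port", "proto", "service", "state"])
writer.writerows(nmap_xml_to_rows("example/nmap.xml"))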

Rationale

This tool can offer a human-readable, tabular format which you can tie to any observations you have drafted in your report. Why? Because then your reviewers can tell that you, the pentester, investigated all found open ports, and looked at all scanning reports.

Dependencies

  1. argparse (dev-python/argparse);
  2. prettytable (dev-python/prettytable);
  3. python (dev-lang/python);
  4. xlsxwriter (dev-python/xlsxwriter).

Install

Using Pip:

pip install --user sr2t

Usage

You can use sr2t in two ways:

  • When installed as a package, call the installed script: sr2t --help.
  • When Git cloned, call the package directly from the root of the Git repository: python -m src.sr2t --help.
$ sr2t --help
usage: sr2t [-h] [--nessus NESSUS [NESSUS ...]] [--nmap NMAP [NMAP ...]]
[--nikto NIKTO [NIKTO ...]] [--dirble DIRBLE [DIRBLE ...]]
[--testssl TESTSSL [TESTSSL ...]]
[--fortify FORTIFY [FORTIFY ...]] [--nmap-state NMAP_STATE]
[--nmap-services] [--no-nessus-autoclassify]
[--nessus-autoclassify-file NESSUS_AUTOCLASSIFY_FILE]
[--nessus-tls-file NESSUS_TLS_FILE]
[--nessus-x509-file NESSUS_X509_FILE]
[--nessus-http-file NESSUS_HTTP_FILE]
[--nessus-smb-file NESSUS_SMB_FILE]
[--nessus-rdp-file NESSUS_RDP_FILE]
[--nessus-ssh-file NESSUS_SSH_FILE]
[--nessus-min-severity NESSUS_MIN_SEVERITY]
[--nessus-plugin-name-width NESSUS_PLUGIN_NAME_WIDTH]
[--nessus-sort-by NESSUS_SORT_BY]
[--nikto-description-width NIKTO_DESCRIPTION_WIDTH]
[--fortify-details] [--annotation-width ANNOTATION_WIDTH]
[-oC OUTPUT_CSV] [-oT OUTPUT_TXT] [-oX OUTPUT_XLSX]
[-oA OUTPUT_ALL]

Converting scanning reports to a tabular format

optional arguments:
-h, --help show this help message and exit
--nmap-state NMAP_STATE
Specify the desired state to filter (e.g.
open|filtered).
--nmap-services Specify to output a supplemental list of detected
services.
--no-nessus-autoclassify
Specify to not autoclassify Nessus results.
--nessus-autoclassify-file NESSUS_AUTOCLASSIFY_FILE
Specify to override a custom Nessus autoclassify YAML
file.
--nessus-tls-file NESSUS_TLS_FILE
Specify to override a custom Nessus TLS findings YAML
file.
--nessus-x509-file NESSUS_X509_FILE
Specify to override a custom Nessus X.509 findings
YAML file.
--nessus-http-file NESSUS_HTTP_FILE
Specify to override a custom Nessus HTTP findings YAML
file.
--nessus-smb-file NESSUS_SMB_FILE
Specify to override a custom Nessus SMB findings YAML
file.
--nessus-rdp-file NESSUS_RDP_FILE
Specify to override a custom Nessus RDP findings YAML
file.
--nessus-ssh-file NESSUS_SSH_FILE
Specify to override a custom Nessus SSH findings YAML
file.
--nessus-min-severity NESSUS_MIN_SEVERITY
Specify the minimum severity to output (e.g. 1).
--nessus-plugin-name-width NESSUS_PLUGIN_NAME_WIDTH
Specify the width of the pluginid column (e.g. 30).
--nessus-sort-by NESSUS_SORT_BY
Specify to sort output by ip-address, port, plugin-id,
plugin-name or severity.
--nikto-description-width NIKTO_DESCRIPTION_WIDTH
Specify the width of the description column (e.g. 30).
--fortify-details Specify to include the Fortify abstracts, explanations
and recommendations for each vulnerability.
--annotation-width ANNOTATION_WIDTH
Specify the width of the annotation column (e.g. 30).
-oC OUTPUT_CSV, --output-csv OUTPUT_CSV
Specify the output CSV basename (e.g. output).
-oT OUTPUT_TXT, --output-txt OUTPUT_TXT
Specify the output TXT file (e.g. output.txt).
-oX OUTPUT_XLSX, --output-xlsx OUTPUT_XLSX
Specify the output XLSX file (e.g. output.xlsx). Only
for Nessus at the moment
-oA OUTPUT_ALL, --output-all OUTPUT_ALL
Specify the output basename to output to all formats
(e.g. output).

specify at least one:
--nessus NESSUS [NESSUS ...]
Specify (multiple) Nessus XML files.
--nmap NMAP [NMAP ...]
Specify (multiple) Nmap XML files.
--nikto NIKTO [NIKTO ...]
Specify (multiple) Nikto XML files.
--dirble DIRBLE [DIRBLE ...]
Specify (multiple) Dirble XML files.
--testssl TESTSSL [TESTSSL ...]
Specify (multiple) Testssl JSON files.
--fortify FORTIFY [FORTIFY ...]
Specify (multiple) HP Fortify FPR files.

Example

A few examples

Nessus

To produce an XLSX format:

$ sr2t --nessus example/nessus.nessus --no-nessus-autoclassify -oX example.xlsx

To produce a text tabular format to stdout:

$ sr2t --nessus example/nessus.nessus
+---------------+-------+-----------+-----------------------------------------------------------------------------+----------+-------------+
| host | port | plugin id | plugin name | severity | annotations |
+---------------+-------+-----------+-----------------------------------------------------------------------------+----------+-------------+
| 192.168.142.4 | 3389 | 42873 | SSL Medium Strength Cipher Suites Supported (SWEET32) | 2 | X |
| 192.168.142.4 | 443 | 42873 | SSL Medium Strength Cipher Suites Supported (SWEET32) | 2 | X |
| 192.168.142.4 | 3389 | 18405 | Microsoft Windows Remote Desktop Protocol Server Man-in-the-Middle Weakness | 2 | X |
| 192.168.142.4 | 3389 | 30218 | Terminal Services Encryption Level is not FIPS-140 Compliant | 1 | X |
| 192.168.142.4 | 3389 | 57690 | Terminal Services Encryption Level is Medium or Low | 2 | X |
| 192.168.142.4 | 3389 | 58453 | Terminal Services Doesn't Use Network Level Authentication (NLA) Only | 2 | X |
| 192.168.142.4 | 3389 | 45411 | SSL Certificate with Wrong Hostname | 2 | X |
| 192.168.142.4 | 443 | 45411 | SSL Certificate with Wrong Hostname | 2 | X |
| 192.168.142.4 | 3389 | 35291 | SSL Certificate Signed Using Weak Hashing Algorithm | 2 | X |
| 192.168.142.4 | 3389 | 57582 | SSL Self-Signed Certificate | 2 | X |
| 192.168.142.4 | 3389  | 51192     | SSL Certificate Cannot Be Trusted                                           | 2        | X           |
| 192.168.142.2 | 3389 | 42873 | SSL Medium Strength Cipher Suites Supported (SWEET32) | 2 | X |
| 192.168.142.2 | 443 | 42873 | SSL Medium Strength Cipher Suites Supported (SWEET32) | 2 | X |
| 192.168.142.2 | 3389 | 18405 | Microsoft Windows Remote Desktop Protocol Server Man-in-the-Middle Weakness | 2 | X |
| 192.168.142.2 | 3389 | 30218 | Terminal Services Encryption Level is not FIPS-140 Compliant | 1 | X |
| 192.168.142.2 | 3389 | 57690 | Terminal Services Encryption Level is Medium or Low | 2 | X |
| 192.168.142.2 | 3389 | 58453 | Terminal Services Doesn't Use Network Level Authentication (NLA) Only | 2 | X |
| 192.168.142.2 | 3389  | 45411     | SSL Certificate with Wrong Hostname                                         | 2        | X           |
| 192.168.142.2 | 443 | 45411 | SSL Certificate with Wrong Hostname | 2 | X |
| 192.168.142.2 | 3389 | 35291 | SSL Certificate Signed Using Weak Hashing Algorithm | 2 | X |
| 192.168.142.2 | 3389 | 57582 | SSL Self-Signed Certificate | 2 | X |
| 192.168.142.2 | 3389 | 51192 | SSL Certificate Cannot Be Trusted | 2 | X |
| 192.168.142.2 | 445 | 57608 | SMB Signing not required | 2 | X |
+---------------+-------+-----------+-----------------------------------------------------------------------------+----------+-------------+

Or to output a CSV file:

$ sr2t --nessus example/nessus.nessus -oC example
$ cat example_nessus.csv
host,port,plugin id,plugin name,severity,annotations
192.168.142.4,3389,42873,SSL Medium Strength Cipher Suites Supported (SWEET32),2,X
192.168.142.4,443,42873,SSL Medium Strength Cipher Suites Supported (SWEET32),2,X
192.168.142.4,3389,18405,Microsoft Windows Remote Desktop Protocol Server Man-in-the-Middle Weakness,2,X
192.168.142.4,3389,30218,Terminal Services Encryption Level is not FIPS-140 Compliant,1,X
192.168.142.4,3389,57690,Terminal Services Encryption Level is Medium or Low,2,X
192.168.142.4,3389,58453,Terminal Services Doesn't Use Network Level Authentication (NLA) Only,2,X
192.168.142.4,3389,45411,SSL Certificate with Wrong Hostname,2,X
192.168.142.4,443,45411,SSL Certificate with Wrong Hostname,2,X
192.168.142.4,3389,35291,SSL Certificate Signed Using Weak Hashing Algorithm,2,X
192.168.142.4,3389,57582,SSL Self-Signed Certificate,2,X
192.168.142.4,3389,51192,SSL Certificate Cannot Be Trusted,2,X
192.168.142.2,3389,42873,SSL Medium Strength Cipher Suites Supported (SWEET32),2,X
192.168.142.2,443,42873,SSL Medium Strength Cipher Suites Supported (SWEET32),2,X
192.168.142.2,3389,18405,Microsoft Windows Remote Desktop Protocol Server Man-in-the-Middle Weakness,2,X
192.168.142.2,3389,30218,Terminal Services Encryption Level is not FIPS-140 Compliant,1,X
192.168.142.2,3389,57690,Terminal Services Encryption Level is Medium or Low,2,X
192.168.142.2,3389,58453,Terminal Services Doesn't Use Network Level Authentication (NLA) Only,2,X
192.168.142.2,3389,45411,SSL Certificate with Wrong Hostname,2,X
192.168.142.2,443,45411,SSL Certificate with Wrong Hostname,2,X
192.168.142.2,3389,35291,SSL Certificate Signed Using Weak Hashing Algorithm,2,X
192.168.142.2,3389,57582,SSL Self-Signed Certificate,2,X
192.168.142.2,3389,51192,SSL Certificate Cannot Be Trusted,2,X
192.168.142.2,445,57608,SMB Signing not required,2,X

Nmap

To produce an XLSX format:

$ sr2t --nmap example/nmap.xml -oX example.xlsx

To produce a text tabular format to stdout:

$ sr2t --nmap example/nmap.xml --nmap-services
Nmap TCP:
+-----------------+----+----+----+-----+-----+-----+-----+------+------+------+
| | 53 | 80 | 88 | 135 | 139 | 389 | 445 | 3389 | 5800 | 5900 |
+-----------------+----+----+----+-----+-----+-----+-----+------+------+------+
| 192.168.23.78 | X | | X | X | X | X | X | X | | |
| 192.168.27.243 | | | | X | X | | X | X | X | X |
| 192.168.99.164 | | | | X | X | | X | X | X | X |
| 192.168.228.211 | | X | | | | | | | | |
| 192.168.171.74 | | | | X | X | | X | X | X | X |
+-----------------+----+----+----+-----+-----+-----+-----+------+------+------+

Nmap Services:
+-----------------+------+-------+---------------+-------+
| ip address | port | proto | service | state |
+-----------------+------+-------+---------------+-------+
| 192.168.23.78 | 53 | tcp | domain | open |
| 192.168.23.78 | 88 | tcp | kerberos-sec | open |
| 192.168.23.78 | 135 | tcp | msrpc | open |
| 192.168.23.78 | 139 | tcp | netbios-ssn | open |
| 192.168.23.78 | 389 | tcp | ldap | open |
| 192.168.23.78 | 445 | tcp | microsoft-ds | open |
| 192.168.23.78 | 3389 | tcp | ms-wbt-server | open |
| 192.168.27.243 | 135 | tcp | msrpc | open |
| 192.168.27.243 | 139 | tcp | netbios-ssn | open |
| 192.168.27.243 | 445 | tcp | microsoft-ds | open |
| 192.168.27.243 | 3389 | tcp | ms-wbt-server | open |
| 192.168.27.243 | 5800 | tcp | vnc-http | open |
| 192.168.27.243 | 5900 | tcp | vnc | open |
| 192.168.99.164 | 135 | tcp | msrpc | open |
| 192.168.99.164 | 139 | tcp | netbios-ssn | open |
| 192.168.99.164  | 445  | tcp   | microsoft-ds  | open  |
| 192.168.99.164 | 3389 | tcp | ms-wbt-server | open |
| 192.168.99.164 | 5800 | tcp | vnc-http | open |
| 192.168.99.164 | 5900 | tcp | vnc | open |
| 192.168.228.211 | 80 | tcp | http | open |
| 192.168.171.74 | 135 | tcp | msrpc | open |
| 192.168.171.74 | 139 | tcp | netbios-ssn | open |
| 192.168.171.74 | 445 | tcp | microsoft-ds | open |
| 192.168.171.74 | 3389 | tcp | ms-wbt-server | open |
| 192.168.171.74 | 5800 | tcp | vnc-http | open |
| 192.168.171.74 | 5900 | tcp | vnc | open |
+-----------------+------+-------+---------------+-------+

Or to output a CSV file:

$ sr2t --nmap example/nmap.xml -oC example
$ cat example_nmap_tcp.csv
ip address,53,80,88,135,139,389,445,3389,5800,5900
192.168.23.78,X,,X,X,X,X,X,X,,
192.168.27.243,,,,X,X,,X,X,X,X
192.168.99.164,,,,X,X,,X,X,X,X
192.168.228.211,,X,,,,,,,,
192.168.171.74,,,,X,X,,X,X,X,X

Nikto

To produce an XLSX format:

$ sr2t --nikto example/nikto.xml -oX example/nikto.xlsx

To produce a text tabular format to stdout:

$ sr2t --nikto example/nikto.xml
+----------------+-----------------+-------------+----------------------------------------------------------------------------------+-------------+
| target ip | target hostname | target port | description | annotations |
+----------------+-----------------+-------------+----------------------------------------------------------------------------------+-------------+
| 192.168.178.10 | 192.168.178.10 | 80 | The anti-clickjacking X-Frame-Options header is not present. | X |
| 192.168.178.10 | 192.168.178.10 | 80 | The X-XSS-Protection header is not defined. This header can hint to the user | X |
| | | | agent to protect against some forms of XSS | |
| 192.168.178.10 | 192.168.178.10  | 80          | The X-Content-Type-Options header is not set. This could allow the user agent to | X           |
| | | | render the content of the site in a different fashion to the MIME type | |
+----------------+-----------------+-------------+----------------------------------------------------------------------------------+-------------+

Or to output a CSV file:

$ sr2t --nikto example/nikto.xml -oC example
$ cat example_nikto.csv
target ip,target hostname,target port,description,annotations
192.168.178.10,192.168.178.10,80,The anti-clickjacking X-Frame-Options header is not present.,X
192.168.178.10,192.168.178.10,80,"The X-XSS-Protection header is not defined. This header can hint to the user
agent to protect against some forms of XSS",X
192.168.178.10,192.168.178.10,80,"The X-Content-Type-Options header is not set. This could allow the user agent to
render the content of the site in a different fashion to the MIME type",X

Dirble

To produce an XLSX format:

$ sr2t --dirble example/dirble.xml -oX example.xlsx

To produce a text tabular format to stdout:

$ sr2t --dirble example/dirble.xml
+-----------------------------------+------+-------------+--------------+-------------+---------------------+--------------+-------------+
| url | code | content len | is directory | is listable | found from listable | redirect url | annotations |
+-----------------------------------+------+-------------+--------------+-------------+---------------------+--------------+-------------+
| http://example.org/flv | 0 | 0 | false | false | false | | X |
| http://example.org/hire | 0 | 0 | false | false | false | | X |
| http://example.org/phpSQLiteAdmin | 0 | 0 | false | false | false | | X |
| http://example.org/print_order    | 0    | 0           | false        | false       | false               |              | X           |
| http://example.org/putty | 0 | 0 | false | false | false | | X |
| http://example.org/receipts | 0 | 0 | false | false | false | | X |
+-----------------------------------+------+-------------+--------------+-------------+---------------------+--------------+-------------+

Or to output a CSV file:

$ sr2t --dirble example/dirble.xml -oC example
$ cat example_dirble.csv
url,code,content len,is directory,is listable,found from listable,redirect url,annotations
http://example.org/flv,0,0,false,false,false,,X
http://example.org/hire,0,0,false,false,false,,X
http://example.org/phpSQLiteAdmin,0,0,false,false,false,,X
http://example.org/print_order,0,0,false,false,false,,X
http://example.org/putty,0,0,false,false,false,,X
http://example.org/receipts,0,0,false,false,false,,X

Testssl

To produce an XLSX format:

$ sr2t --testssl example/testssl.json -oX example.xlsx

To produce a text tabular format to stdout:

$ sr2t --testssl example/testssl.json
+-----------------------------------+------+--------+---------+--------+------------+-----+---------+---------+----------+
| ip address | port | BREACH | No HSTS | No PFS | No TLSv1.3 | RC4 | TLSv1.0 | TLSv1.1 | Wildcard |
+-----------------------------------+------+--------+---------+--------+------------+-----+---------+---------+----------+
| rc4-md5.badssl.com/104.154.89.105 | 443 | X | X | X | X | X | X | X | X |
+-----------------------------------+------+--------+---------+--------+------------+-----+---------+---------+----------+

Or to output a CSV file:

$ sr2t --testssl example/testssl.json -oC example
$ cat example_testssl.csv
ip address,port,BREACH,No HSTS,No PFS,No TLSv1.3,RC4,TLSv1.0,TLSv1.1,Wildcard
rc4-md5.badssl.com/104.154.89.105,443,X,X,X,X,X,X,X,X

Fortify

To produce an XLSX format:

$ sr2t --fortify example/fortify.fpr -oX example.xlsx

To produce a text tabular format to stdout:

$ sr2t --fortify example/fortify.fpr
+--------------------------+-----------------------+-------------------------------+----------+------------+-------------+
| | type | subtype | severity | confidence | annotations |
+--------------------------+-----------------------+-------------------------------+----------+------------+-------------+
| example1/web.xml:135:135 | J2EE Misconfiguration | Insecure Transport | 3.0 | 5.0 | X |
| example2/web.xml:150:150 | J2EE Misconfiguration | Insecure Transport | 3.0 | 5.0 | X |
| example3/web.xml:109:109 | J2EE Misconfiguration | Incomplete Error Handling | 3.0 | 5.0 | X |
| example4/web.xml:108:108 | J2EE Misconfiguration | Incomplete Error Handling | 3.0 | 5.0 | X |
| example5/web.xml:166:166 | J2EE Misconfiguration | Insecure Transport            | 3.0      | 5.0        | X           |
| example6/web.xml:2:2 | J2EE Misconfiguration | Excessive Session Timeout | 3.0 | 5.0 | X |
| example7/web.xml:162:162 | J2EE Misconfiguration | Missing Authentication Method | 3.0 | 5.0 | X |
+--------------------------+-----------------------+-------------------------------+----------+------------+-------------+

Or to output a CSV file:

$ sr2t --fortify example/fortify.fpr -oC example
$ cat example_fortify.csv
,type,subtype,severity,confidence,annotations
example1/web.xml:135:135,J2EE Misconfiguration,Insecure Transport,3.0,5.0,X
example2/web.xml:150:150,J2EE Misconfiguration,Insecure Transport,3.0,5.0,X
example3/web.xml:109:109,J2EE Misconfiguration,Incomplete Error Handling,3.0,5.0,X
example4/web.xml:108:108,J2EE Misconfiguration,Incomplete Error Handling,3.0,5.0,X
example5/web.xml:166:166,J2EE Misconfiguration,Insecure Transport,3.0,5.0,X
example6/web.xml:2:2,J2EE Misconfiguration,Excessive Session Timeout,3.0,5.0,X
example7/web.xml:162:162,J2EE Misconfiguration,Missing Authentication Method,3.0,5.0,X

Donate

  • WOW: WW4L3VCX11zWgKPX51TRw2RENe8STkbCkh5wTV4GuQnbZ1fKYmPFobZhEfS1G9G3vwjBhzioi3vx8JgBx2xLxe4N1gtJee8Mp


CloudRecon - Finding assets from certificates

By: Zion3R


CloudRecon

Finding assets from certificates! Scan the web! Tool presented @DEFCON 31


Install

You must have CGO enabled, and may have to install gcc to run CloudRecon.

sudo apt install gcc
go install github.com/g0ldencybersec/CloudRecon@latest

Description

CloudRecon

CloudRecon is a suite of tools for red teamers and bug hunters to find ephemeral and development assets in their campaigns and hunts.

Often, target organizations stand up cloud infrastructure that is not tied to their ASN or related to known infrastructure. Many times these assets are development sites, IT product portals, etc. Sometimes they don't have domains at all, but many still need HTTPS.

CloudRecon is a suite of tools to scan IP addresses or CIDRs (e.g. cloud providers' IP ranges) and find these hidden gems for testers by inspecting their SSL certificates.
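The core trick behind all three parts (connecting to an IP, grabbing its certificate, and reading the CN and SANs) can be sketched in a few lines; this is illustrative Python using the third-party cryptography package, not CloudRecon's Go code:

import socket
import ssl
from cryptography import x509
from cryptography.x509.oid import NameOID

def cert_names(ip, port=443, timeout=4):
    # Grab the certificate without validating it (scans hit arbitrary IPs).
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    with socket.create_connection((ip, port), timeout=timeout) as sock:
        with ctx.wrap_socket(sock) as tls:
            der = tls.getpeercert(binary_form=True)
    cert = x509.load_der_x509_certificate(der)
    cn = [a.value for a in cert.subject.get_attributes_for_oid(NameOID.COMMON_NAME)]
    try:
        san = cert.extensions.get_extension_for_class(
            x509.SubjectAlternativeName).value.get_values_for_type(x509.DNSName)
    except x509.ExtensionNotFound:
        san = []
    return cn, san

print(cert_names("192.0.2.1"))  # placeholder IP; substitute a real target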

The tool suite consists of three parts, written in Go:

Scrape - a live tool to inspect the given ranges for a keyword in SSL certs' CN and SN fields in real time.

Store - a tool to retrieve certs from IPs and store all their Orgs, CNs, and SANs, so you can have your own crt.sh-style database.

Retr - a tool to parse and search through the downloaded certs for keywords.

Usage

MAIN

Usage: CloudRecon scrape|store|retr [options]

-h Show the program usage message

Subcommands:

cloudrecon scrape - Scrape given IPs and output CNs & SANs to stdout
cloudrecon store - Scrape and collect Orgs,CNs,SANs in local db file
cloudrecon retr - Query local DB file for results

SCRAPE

scrape [options] -i <IPs/CIDRs or File>
-a Add this flag if you want to see all output including failures
-c int
How many goroutines running concurrently (default 100)
-h print usage!
-i string
Either IPs & CIDRs separated by commas, or a file with IPs/CIDRs on each line (default "NONE")
-p string
TLS ports to check for certificates (default "443")
-t int
Timeout for TLS handshake (default 4)
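
For instance, a hypothetical scrape of a small test range on two TLS ports (flags as documented above; the range is a placeholder):

cloudrecon scrape -i 192.0.2.0/28 -p 443,8443 -t 4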

STORE

store [options] -i <IPs/CIDRs or File>
-c int
How many goroutines running concurrently (default 100)
-db string
String of the DB you want to connect to and save certs! (default "certificates.db")
-h print usage!
-i string
Either IPs & CIDRs separated by commas, or a file with IPs/CIDRs on each line (default "NONE")
-p string
TLS ports to check for certificates (default "443")
-t int
Timeout for TLS handshake (default 4)

RETR

retr [options]
-all
Return all the rows in the DB
-cn string
String to search for in common name column, returns like-results (default "NONE")
-db string
String of the DB you want to connect to and save certs! (default "certificates.db")
-h print usage!
-ip string
String to search for in IP column, returns like-results (default "NONE")
-num
Return the Number of rows (results) in the DB (By IP)
-org string
String to search for in Organization column, returns like-results (default "NONE")
-san string
String to search for in common name column, returns like-results (default "NONE")


Upload_Bypass_Carnage - File Upload Restrictions Bypass, By Using Different Bug Bounty Techniques!


File Upload Restrictions Bypass, By Using Different Bug Bounty Techniques!

POC video:

File upload restrictions bypass, using different bug bounty techniques! The tool must be run with all of its assets present!


Installation:

pip3 install -r requirements.txt

Usage: upload_bypass.py [options]

Options: -h, --help

  show this help message and exit

-u URL, --url=URL

  Supply the login page, for example: -u http://192.168.98.200/login.php

-s , --success

 Success message shown when an image is uploaded, for example: -s 'Image uploaded successfully.'

-e , --extension

 Provide server backend extension, for example: --extension php (Supported extensions: php,asp,jsp,perl,coldfusion)

-a , --allowed

 Provide allowed extensions to be uploaded, for example: php,asp,jsp,perl

-H , --header

 (Optional) - for example: '"X-Forwarded-For":"10.10.10.10"' - Use double quotes around the data and wrap it all with single quotes. Use a comma to separate multiple headers.

-l , --location

 (Optional) - Supply a remote path where the webshell is supposed to be. For example: /uploads/

-S, --ssl

 (Optional) - No checks for TLS or SSL

-p, --proxy

 (Optional) - Channel the requests through a proxy

-c, --continue

 (Optional) - If set, the brute force will continue even if one or more methods are found!

-v, --verbose

 (Optional) - Print the HTTP response in the terminal

-U , --username

 (Optional) - Username for authentication. For example: --username admin

-P , --password

 (Optional) - Password for authentication. For example: --password 12345
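
Putting the options together, an illustrative invocation (all values are placeholders) might look like:

python3 upload_bypass.py -u http://192.168.98.200/login.php -s 'Image uploaded successfully.' -e php -a jpg,png -l /uploads/ -v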


Encryption is on the Rise!

When the Internet Engineering Task Force (IETF) announced the TLS 1.3 standard in RFC 8446 in August 2018, plenty of tools and utilities already supported it (some web browsers had even implemented it as their default as early as the year prior, only to roll it back due to compatibility issues; needless to say, the rollout was not perfect).

Toward the end of 2018, EMA conducted a survey of customers regarding their TLS 1.3 implementation and migration plans. In the January 2019 report, EMA concluded:

Some participants’ organizations may find they have to go back to the drawing board and come up with a Plan B to enable TLS 1.3 without losing visibility, introducing unacceptable performance bottlenecks and greatly increasing operational overhead. Whether they feel they have no choice but to enable TLS 1.3 because major web server and browser vendors have already pushed ahead with it or because they need to keep pace with the industry as it embraces the new standard is unclear. What is clear is that security practitioners see the new standard as offering greater privacy and end-to-end data security for their organizations, and that the long wait for its advancement is over.

When EMA asked many of the same questions in an updated survey of 204 technology and business leaders toward the end of 2022, they found that nearly all the conclusions in the 2018/2019 report still hold true today. Here are the three biggest takeaways from this most recent survey:

  • Remote work, regulatory and vendor controls, and improved data security are drivers. With all the attention paid to data security and privacy standards over the past few years, it is little wonder that improved data security and privacy were primary drivers for implementation – and those goals were generally achieved with TLS 1.3. The push for remote working has also increased TLS 1.3 adoption because security teams are looking for better ways for remote workers (76% using) and third-party vendors (64% using) to access sensitive data.
  • Resource and implementation costs are significant. Eighty-seven percent of organizations that have implemented TLS 1.3 required some level of infrastructure change to accommodate the update. As organizations update their network infrastructure and security tools, migration to TLS 1.3 becomes more realistic, but revamping network topology for this update is a difficult pill for many organizations to swallow. Over time, organizations will adopt TLS 1.3 for no other reason than that existing technologies are being deprecated, but that continues to be a slow process. There is also a real consideration about the human resources available to implement a project with very little perceived business value (81%), increasing the workload of thinly stretched security staff. Again, this will likely change as the technology changes and improves, but competing business needs will take a higher priority.
  • Visibility and monitoring considerations remain the biggest obstacle to adoption. Even with vendor controls and regulatory requirements, many organizations have delayed implementing TLS 1.3 for the significant upheaval that it would cause with their security and monitoring plans within their environment. Even with improved technologies (since the first announcement of TLS 1.3), organizations still cannot overcome these challenges. Organizations are evaluating the risks and compensating controls when it comes to delaying the implementation, and they continue to evaluate stop-gap solutions that are easier and less intrusive to implement than TLS 1.3 while road-mapping their eventual TLS 1.3 migration.

While regulatory frameworks and vendor controls continue to push the adoption of the TLS 1.3 standard, adoption still comes with a significant price tag – one that many organizations are just not yet ready or able to consume. Technology improvements will increase rates of adoption over time, such as Cisco Secure Firewall’s ability to decrypt and inspect encrypted traffic. More recent and unique technologies, like Cisco’s encrypted visibility engine, allow the firewall to recognize attack patterns in encrypted traffic without decryption. This latter functionality preserves performance and privacy of the encrypted flows without sacrificing the visibility and monitoring that 94% of respondents were concerned about.

Readers wishing to read the full EMA report can do so here, and readers wishing to learn more about Cisco Secure Firewall's encrypted visibility engine can do so here.



laZzzy - Shellcode Loader, Developed Using Different Open-Source Libraries, That Demonstrates Different Execution Techniques


laZzzy is a shellcode loader that demonstrates different execution techniques commonly employed by malware. laZzzy was developed using different open-source header-only libraries.

Features

  • Direct syscalls and native (Nt*) functions (not all functions but most)
  • Import Address Table (IAT) evasion
  • Encrypted payload (XOR and AES; a minimal sketch of the idea follows this list)
    • Randomly generated key
    • Automatic padding (if necessary) of payload with NOPs (\x90)
    • Byte-by-byte in-memory decryption of payload
  • XOR-encrypted strings
  • PPID spoofing
  • Blocking of non-Microsoft-signed DLLs
  • (Optional) Cloning of PE icon and attributes
  • (Optional) Code signing with spoofed cert
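
To make the encryption features above concrete, here is a minimal sketch of the XOR-with-random-key and NOP-padding idea in Python (laZzzy itself is native code; calc.bin is the sample shellcode from the example below, and the 16-byte block size is an assumption based on the AES block size):

import os

NOP = b"\x90"
BLOCK = 16  # AES block size; assumed reason for padding the payload

def nop_pad(payload: bytes) -> bytes:
    # Pad the payload with NOPs (\x90) up to the next block boundary.
    return payload + NOP * ((-len(payload)) % BLOCK)

def xor_encrypt(payload: bytes, key: bytes) -> bytes:
    # XOR every payload byte against the repeating random key.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(payload))

key = os.urandom(16)  # randomly generated key, as in the feature list
with open("calc.bin", "rb") as f:
    shellcode = f.read()
encrypted = xor_encrypt(nop_pad(shellcode), key)
print("key:", key.hex(), "encrypted bytes:", len(encrypted))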

How to Use

Requirements:

  • Windows machine w/ Visual Studio and the following components, which can be installed from Visual Studio Installer > Individual Components:

    • C++ Clang Compiler for Windows and C++ Clang-cl for build tools

    • ClickOnce Publishing

  • Python3 and the required modules:

    • python3 -m pip install -r requirements.txt

Options:

(venv) PS C:\MalDev\laZzzy> python3 .\builder.py -h

⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⣀⣀⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀
⠀⠀⣿⣿⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣠⣤⣤⣤⣤⠀⢀⣼⠟⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ ⠀⠀
⠀⠀⣿⣿⠀⠀⠀⠀⢀⣀⣀⡀⠀⠀⠀⢀⣀⣀⣀⣀⣀⡀⠀⢀⣼⡿⠁⠀⠛⠛⠒⠒⢀⣀⡀⠀⠀⠀⣀⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀
⠀⠀⣿⣿⠀⠀⣰⣾⠟⠋⠙⢻⣿⠀⠀⠛⠛⢛⣿⣿⠏⠀⣠⣿⣯⣤⣤⠄⠀⠀⠀⠀⠈⢿⣷⡀⠀⣰⣿⠃⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀
⠀⠀⣿⣿⠀⠀⣿⣯ ⠀⠀⢸⣿⠀⠀⠀⣠⣿⡟⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⢿⣧⣰⣿⠃⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀
⠀⠀⣿⣿⠀⠀⠙⠿⣷⣦⣴⢿⣿⠄⢀⣾⣿⣿⣶⣶⣶⠆⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠘⣿⡿⠃⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀
⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣼⡿⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀
⠀⠀by: CaptMeelo⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠉⠁⠀⠀⠀

usage: builder.py [-h] -s -p -m [-tp] [-sp] [-pp] [-b] [-d]

options:
-h, --help show this help message and exit
-s path to raw shellcode
-p password
-m shellcode execution method (e.g. 1)
-tp process to inject (e.g. svchost.exe)
-sp process to spawn (e.g. C:\\Windows\\System32\\RuntimeBroker.exe)
-pp parent process to spoof (e.g. explorer.exe)
-b binary to spoof metadata (e.g. C:\\Windows\\System32\\RuntimeBroker.exe)
-d domain to spoof (e.g. www.microsoft.com)

shellcode execution method:
1 Early-bird APC Queue (requires sacrificial process)
2 Thread Hijacking (requires sacrificial process)
3 KernelCallbackTable (requires sacrificial process that has GUI)
4 Section View Mapping
5 Thread Suspension
6 LineDDA Callback
7 EnumSystemGeoID Callback
8 FLS Callback
9 SetTimer
10 Clipboard

Example:

Execute builder.py and supply the necessary data.

(venv) PS C:\MalDev\laZzzy> python3 .\builder.py -s .\calc.bin -p CaptMeelo -m 1 -pp explorer.exe -sp C:\\Windows\\System32\\notepad.exe -d www.microsoft.com -b C:\\Windows\\System32\\mmc.exe

⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⣀⣀⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀
⠀⠀⣿⣿⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣠⣤⣤⣤⣤⠀⢀ ⠟⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀
⠀⠀⣿⣿⠀⠀⠀⠀⢀⣀⣀⡀⠀⠀⠀⢀⣀⣀⣀⣀⣀⡀⠀⢀⣼⡿⠁⠀⠛⠛⠒⠒⢀⣀⡀⠀⠀⠀⣀⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀
⠀⠀⣿⣿⠀⠀⣰⣾⠟⠋⠙⢻⣿⠀⠀⠛⠛⢛⣿⣿⠏⠀⣠⣿⣯⣤⣤⠄⠀⠀⠀⠀⠈⢿⣷⡀⠀⣰⣿⠃ ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀
⠀⠀⣿⣿⠀⠀⣿⣯⠀⠀⠀⢸⣿⠀⠀⠀⣠⣿⡟⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⢿⣧⣰⣿⠃⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀
⠀⠀⣿⣿⠀⠀⠙⠿⣷⣦⣴⢿⣿⠄⢀⣾⣿⣿⣶⣶⣶⠆⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠘⣿⡿⠃⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ ⠀⠀⠀⠀
⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣼⡿⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀
⠀⠀by: CaptMeelo⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠉⠁⠀⠀⠀

[+] XOR-encrypting payload with
[*] Key: d3b666606468293dfa21ce2ff25e86f6

[+] AES-encrypting payload with
[*] IV: f96312f17a1a9919c74b633c5f861fe5
[*] Key: 6c9656ed1bc50e1d5d4033479e742b4b8b2a9b2fc81fc081fc649e3fb4424fec

[+] Modifying template using
[*] Technique: Early-bird APC Queue
[*] Process to inject: None
[*] Process to spawn: C:\\Windows\\System32\\RuntimeBroker.exe
[*] Parent process to spoof: svchost.exe

[+] Spoofing metadata
[*] Binary: C:\\Windows\\System32\\RuntimeBroker.exe
[*] CompanyName: Microsoft Corporation
[*] FileDescription: Runtime Broker
[*] FileVersion: 10.0.22621.608 (WinBuild.160101.0800)
[*] InternalName: RuntimeBroker.exe
[*] LegalCopyright: © Microsoft Corporation. All rights reserved.
[*] OriginalFilename: RuntimeBroker.exe
[*] ProductName: Microsoft® Windows® Operating System
[*] ProductVersion: 10.0.22621.608

[+] Compiling project
[*] Compiled executable: C:\MalDev\laZzzy\loader\x64\Release\laZzzy.exe

[+] Signing binary with spoofed cert
[*] Domain: www.microsoft.com
[*] Version: 2
[*] Serial: 33:00:59:f8:b6:da:86:89:70:6f:fa:1b:d9:00:00:00:59:f8:b6
[*] Subject: /C=US/ST=WA/L=Redmond/O=Microsoft Corporation/CN=www.microsoft.com
[*] Issuer: /C=US/O=Microsoft Corporation/CN=Microsoft Azure TLS Issuing CA 06
[*] Not Before: October 04 2022
[*] Not After: September 29 2023
[*] PFX file: C:\MalDev\laZzzy\output\www.microsoft.com.pfx

[+] All done!
[*] Output file: C:\MalDev\laZzzy\output\RuntimeBroker.exe

Libraries Used

Shellcode Execution Techniques

  1. Early-bird APC Queue (requires sacrificial process)
  2. Thread Hijacking (requires sacrificial process)
  3. KernelCallbackTable (requires sacrificial process that has a GUI)
  4. Section View Mapping
  5. Thread Suspension
  6. LineDDA Callback
  7. EnumSystemGeoID Callback
  8. Fiber Local Storage (FLS) Callback
  9. SetTimer
  10. Clipboard

Notes:

  • Only works on Windows x64
  • Debugging only works on Release mode
  • Sometimes, KernelCallbackTable doesn't work on the first run but will eventually work afterward




xnLinkFinder - A Python Tool Used To Discover Endpoints (And Potential Parameters) For A Given Target

About - v2.0

This is a tool used to discover endpoints (and potential parameters) for a given target. It can find them by:

  • crawling a target (pass a domain/URL)
  • crawling multiple targets (pass a file of domains/URLs)
  • searching files in a given directory (pass a directory name)
  • getting them from a Burp project (pass the location of a Burp XML file)
  • getting them from an OWASP ZAP project (pass the location of a ZAP ASCII message file)

The Python script is based on the link-finding capabilities of my Burp extension GAP. As a starting point, I took the amazing tool LinkFinder by Gerben Javado and used its regex for finding links, with additional improvements to find even more.
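
As a flavour of what regex-based link finding looks like (a deliberately simplified pattern, nowhere near GAP's or xnLinkFinder's actual regex; the sample HTML is made up):

import re

# Simplified: quoted absolute URLs or root-relative paths in a response body.
LINK_RE = re.compile(
    r"""["'](?P<link>(?:https?://[^\s"']+|/[A-Za-z0-9_./-]+))["']"""
)

def find_links(body: str):
    return sorted({m.group("link") for m in LINK_RE.finditer(body)})

html = '<script src="/static/app.js"></script><a href="https://target.com/api/v1/users">x</a>'
print(find_links(html))  # ['/static/app.js', 'https://target.com/api/v1/users']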


Installation

xnLinkFinder supports Python 3.

$ git clone https://github.com/xnl-h4ck3r/xnLinkFinder.git
$ cd xnLinkFinder
$ sudo python setup.py install

Usage

Arg Long Arg Description
-i --input Input a: URL, text file of URLs, a Directory of files to search, a Burp XML output file or an OWASP ZAP output file.
-o --output The file to save the Links output to, including path if necessary (default: output.txt). If set to cli then output is only written to STDOUT. If the file already exists it will just be appended to (and de-duplicated) unless option -ow is passed.
-op --output-params The file to save the Potential Parameters output to, including path if necessary (default: parameters.txt). If set to cli then output is only written to STDOUT (but not piped to another program). If the file already exists it will just be appended to (and de-duplicated) unless option -ow is passed.
-ow --output-overwrite If the output file already exists, it will be overwritten instead of being appended to.
-sp --scope-prefix Any links found starting with / will be prefixed with scope domains in the output instead of the original link. If the passed value is a valid file name, that file will be used, otherwise the string literal will be used.
-spo --scope-prefix-original If argument -sp is passed, then this determines whether the original link starting with / is also included in the output (default: false)
-sf --scope-filter Will filter output links to only include them if the domain of the link is in the scope specified. If the passed value is a valid file name, that file will be used, otherwise the string literal will be used.
-c --cookies † Add cookies to pass with HTTP requests. Pass in the format 'name1=value1; name2=value2;'
-H --headers † Add custom headers to pass with HTTP requests. Pass in the format 'Header1: value1; Header2: value2;'
-ra --regex-after RegEx for filtering purposes against found endpoints before output (e.g. /api/v[0-9]\.[0-9]\* ). If it matches, the link is output.
-d --depth † The level of depth to search. For example, if a value of 2 is passed, then all links initially found will then be searched for more links (default: 1). This option is ignored for Burp files because they can be huge and consume lots of memory. It is also advisable to use the -sp (--scope-prefix) argument to ensure a request to links found without a domain can be attempted.
-p --processes † Basic multithreading is done when getting requests for a URL, or file of URLs (not a Burp file). This argument determines the number of processes (threads) used (default: 25)
-x --exclude Additional Link exclusions (to the list in config.yml) in a comma separated list, e.g. careers,forum
-orig --origin Whether you want the origin of the link to be in the output. Displayed as LINK-URL [ORIGIN-URL] in the output (default: false)
-t --timeout † How many seconds to wait for the server to send data before giving up (default: 10 seconds)
-inc --include Include input (-i) links in the output (default: false)
-u --user-agent † What User Agents to get links for, e.g. -u desktop mobile
-insecure Whether TLS certificate checks should be disabled when making requests (default: false)
-s429 Stop when > 95 percent of responses return 429 Too Many Requests (default: false)
-s403 Stop when > 95 percent of responses return 403 Forbidden (default: false)
-sTO Stop when > 95 percent of requests time out (default: false)
-sCE Stop when > 95 percent of requests have connection errors (default: false)
-m --memory-threshold The memory threshold percentage. If the machine's memory goes above the threshold, the program will be stopped and ended gracefully before running out of memory (default: 95)
-mfs --max-file-size † The maximum file size (in bytes) of a file to be checked if -i is a directory. If the file size is over, it will be ignored (default: 500 MB). Setting to 0 means no files will be ignored, regardless of size.
-replay-proxy For active link finding with URL (or file of URLs), replay the requests through this proxy.
-ascii-only Whether links and parameters will only be added if they only contain ASCII characters. This can be useful when you know the target is likely to use ASCII characters and you also get a number of false positives from binary files for some reason.
-v --verbose Verbose output
-vv --vverbose Increased verbose output
-h --help show the help message and exit

† NOT RELEVANT FOR INPUT OF DIRECTORY, BURP XML FILE OR OWASP ZAP FILE

config.yml

The config.yml file has the keys which can be updated to suit your needs:

  • linkExclude - A comma separated list of strings (e.g. .css,.jpg,.jpeg etc.) that all links are checked against. If a link includes any of the strings then it will be excluded from the output. If the input is a directory, then file names are checked against this list.
  • contentExclude - A comma separated list of strings (e.g. text/css,image/jpeg,image/jpg etc.) that all responses Content-Type headers are checked against. Any responses with these content types will be excluded and not checked for links.
  • fileExtExclude - A comma separated list of strings (e.g. .zip,.gz,.tar etc.) that all files in Directory mode are checked against. If a file has one of those extensions it will not be searched for links.
  • regexFiles - A list of file types separated by a pipe character (e.g. php|php3|php5 etc.). These are used in the Link Finding Regex when there are findings that aren't obvious links, but are interesting file types that you want to pick out. If you add to this list, ensure you escape any dots to ensure correct regex, e.g. js\.map
  • respParamLinksFound † - Whether to get potential parameters from links found in responses: True or False
  • respParamPathWords † - Whether to add path words in retrieved links as potential parameters: True or False
  • respParamJSON † - If the MIME type of the response contains JSON, whether to add JSON Key values as potential parameters: True or False
  • respParamJSVars † - Whether javascript variables set with var, let or const are added as potential parameters: True or False
  • respParamXML † - If the MIME type of the response contains XML, whether to add XML attributes values as potential parameters: True or False
  • respParamInputField † - If the MIME type of the response contains HTML, whether to add NAME and ID attributes of any INPUT fields as potential parameters: True or False
  • respParamMetaName † - If the MIME type of the response contains HTML, whether to add NAME attributes of any META tags as potential parameters: True or False

† IF THESE ARE NOT FOUND IN THE CONFIG FILE THEY WILL DEFAULT TO True

Examples

Find Links from a specific target - Basic

python3 xnLinkFinder.py -i target.com

Find Links from a specific target - Detailed

Ideally, provide a scope prefix (-sp) with the primary domain (including schema), and a scope filter (-sf) to filter the results to relevant domains only (this can be a file or in-scope domains). Also, you can pass cookies and custom headers to ensure you find links only available to authorised users. Specifying the User Agent (-u desktop mobile) will first search for all links using desktop User Agents, and then try again using mobile user agents. There could be specific endpoints that are related to the user agent given. Giving a depth value (-d) will keep sending requests to links found on the previous depth of search to find more links.

python3 xnLinkFinder.py -i target.com -sp target_prefix.txt -sf target_scope.txt -spo -inc -vv -H 'Authorization: Bearer XXXXXXXXXXXXXX' -c 'SessionId=MYSESSIONID' -u desktop mobile -d 10

Find Links from a list of URLs - Basic

If you have a file of JS file URLs for example, you can look for links in those:

python3 xnLinkFinder.py -i target_js.txt

Find Links from files in a directory - Basic

If you have files, e.g. JS files, HTTP responses, etc., you can look for links in those:

python3 xnLinkFinder.py -i ~/Tools/waymore/results/target.com

NOTE: Sub directories are also checked. The -mfs option can be specified to skip files over a certain size.

Find Links from a Burp project - Basic

In Burp, select the items you want to search by highlighting the scope, for example, then right-clicking and selecting Save selected items. Ensure that the base64-encode requests and responses option is checked before saving. To get all links from the file (even with HUGE files, you'll be able to get all the links):

python3 xnLinkFinder.py -i target_burp.xml

NOTE: xnLinkFinder makes the assumption that if the first line of the file passed with -i starts with <?xml then you are trying to process a Burp file.

Find Links from a Burp project - Detailed

Ideally, provide scope prefix (-sp) with the primary domain (including schema), and a scope filter (-sf) to filter the results only to relevant domains.

python3 xnLinkFinder.py -i target_burp.xml -o target_burp.txt -sp https://www.target.com -sf target.* -ow -spo -inc -vv

Find Links from an OWASP ZAP project - Basic

In ZAP, select the items you want to search by highlighting the History, for example, then clicking the Report menu and selecting Export Messages to File.... This will let you save an ASCII text file of all the requests and responses you want to search. To get all links from the file (even with HUGE files, you'll be able to get all the links):

python3 xnLinkFinder.py -i target_zap.txt

NOTE: xnLinkFinder makes the assumption that if the first line of the file passed with -i is in the format ==== 99 ========== for example, then you are trying to process an OWASP ZAP ASCII text file.

Piping to other Tools

You can pipe xnLinkFinder to other tools. Any errors are sent to stderr and any links found are sent to stdout. The output file is still created in addition to the links being piped to the next program. However, potential parameters are not piped to the next program, but they are still written to file. For example:

python3 xnLinkFinder.py -i redbull.com -sp https://redbull.com -sf redbull.* -d 3 | unfurl keys | sort -u

You can also pass the input through stdin instead of -i.

cat redbull_subs.txt | python3 xnLinkFinder.py -sp https://redbull.com -sf redbull.* -d 3

NOTE: You can't pipe in a Burp or ZAP file, these must be passed using -i.

Recommendations and Notes

  • Always use the Scope Prefix argument -sp. This can be one scope domain, or a file containing multiple scope domains. Below are examples of the format used (no path should be included, and no wildcards used. Schema is optional, but will default to http):
    http://www.target.com
    https://target-payments.com
    https://static.target-cdn.com
    If a link is found that has no domain, e.g. /path/to/example.js, then passing -sp http://www.target.com will result in the output http://www.target.com/path/to/example.js, and if Depth (-d) is >1 then a request can be made to that URL to search for more links. If a file of domains is passed using -sp then the output will include each domain followed by /path/to/example.js, increasing the chance of finding more links.
  • If you use -sp but still want the original link of /path/to/example.js (without a domain) additionally returned in the output, then pass the argument -spo.
  • Always use the Scope Filter argument -sf. This will ensure that only relevant domains are returned in the output, and more importantly if Depth (-d) is >1 then out of scope targets will not be searched for links or parameters. This can be one scope domain, or a file containing multiple scope domains. Below are examples of the format used (no schema or path should be included):
    target.*
    target-payments.com
    static.target-cdn.com
    THIS IS FOR FILTERING THE LINKS DOMAIN ONLY.
  • If you want to filter the final output in any way, use -ra. It's always a good idea to use https://regex101.com/ to check your Regex expression is going to do what you expect.
  • Use the -v option to have a better idea of what the tool is doing.
  • If you have problems, use the -vv option which may show errors that are occurring, which can possibly be resolved, or you can raise as an issue on github.
  • Pass cookies (-c), headers (-H) and regex (-ra) values within single quotes, e.g. -ra '/api/v[0-9]\.[0-9]\*'
  • Set the -o option to give a specific output file name for Links, rather than the default of output.txt. If you plan on running a large depth of searches, start with 2 with option -v to check what is being returned. Then you can increase the Depth, and the new output will be appended to the existing file, unless you pass -ow.
  • Set the -op option to give a specific output file name for Potential Parameters, rather than the default of parameters.txt. Any output will be appended to the existing file, unless you pass -ow.
  • If using a high Depth (-d), be wary that some sites use dynamic links, so it will just keep finding new ones. If no new links are being found, then xnLinkFinder will stop searching. Providing the stop flags (-s429, -s403, -sTO, -sCE) should also be considered.
  • If you are finding a large number of links (especially if the Depth (-d) value is high) and have limited resources, the program will stop when it reaches the memory Threshold (-m) value and end gracefully with data intact before getting killed.
  • If you decide to cancel xnLinkFinder (using Ctrl-C) in the middle of running, be patient and any gathered data will be saved before ending gracefully.
  • Using the -orig option will show the URL where the link was found. This can mean you have duplicate links in the output if the same link was found on multiple sources, but each will be suffixed with the origin URL in square brackets.
  • When making requests, xnLinkFinder will use a random User-Agent from the current group, which defaults to desktop. If you have a target that could serve different links to different user agent groups, then specify -u desktop mobile for example (separate with a space). The mobile user agent option is a combination of mobile-apple, mobile-android and mobile-windows.
  • When -i has been set to a directory, the contents of the files in the root of that directory will be searched for links. Files in sub-directories are not searched. Any files that are over the size set by -mfs (default: 500 MB) will be skipped.
  • When using the -replay-proxy option, sometimes requests can take longer. If you start seeing more Request Timeout errors (you'll see errors if you use -v or -vv options) then consider using -t to raise the timeout limit.
  • If you know a target will only have ASCII characters in links and parameters then consider passing -ascii-only. This can eliminate a number of false positives that can sometimes get returned from binary data.
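
As referenced above, here is a rough Python sketch of how -sp and -sf can be thought of as working together. The prefix list, filter patterns and matching logic are simplified illustrations, not xnLinkFinder's internals:

import re
from urllib.parse import urlparse

SCOPE_PREFIXES = ["http://www.target.com", "https://target-payments.com"]  # -sp
SCOPE_FILTERS = ["target.*", "target-payments.com"]                        # -sf

def prefix_link(link):
    # -sp: domain-less links get expanded with every scope prefix
    if urlparse(link).netloc:
        return [link]
    return [prefix + link for prefix in SCOPE_PREFIXES]

def in_scope(url):
    # -sf: only links whose domain matches a scope pattern survive
    domain = urlparse(url).netloc
    return any(re.search(pattern, domain) for pattern in SCOPE_FILTERS)

for url in prefix_link("/path/to/example.js"):
    if in_scope(url):
        print(url)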

Issues

If you come across any problems at all, or have ideas for improvements, please feel free to raise an issue on Github. If there is a problem, it will be useful if you can provide the exact command you ran and a detailed description of the problem. If possible, run with -vv to reproduce the problem and let me know about any error messages that are given.

TODO

  • I seem to have completed all the TODO's I originally had! If you think of any that need adding, let me know

Example output

Active link finding for a domain:

...

Piped input and output:

Good luck and good hunting! If you really love the tool (or any others), or they helped you find an awesome bounty, consider BUYING ME A COFFEE! ☕ (I could use the caffeine!)

/XNL-h4ck3r


JSubFinder - Searches Webpages For Javascript And Analyzes Them For Hidden Subdomains And Secrets


JSubFinder is a tool written in Go to search webpages and JavaScript for hidden subdomains and secrets in the given URL. Developed with bug bounty hunters in mind, JSubFinder takes advantage of Go's performance, allowing it to handle large data sets and be easily chained with other tools.
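
At its core the technique is simple. A toy Python sketch (assuming a plain HTTP fetch and a naive pattern, not JSubFinder's real matching):

import re
import urllib.request

def find_subdomains(url, root):
    body = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
    # anything that looks like <label(s)>.<root> in the HTML/JS source
    pattern = re.compile(r"[A-Za-z0-9.-]+\." + re.escape(root))
    return {match.lower() for match in pattern.findall(body)}

for sub in sorted(find_subdomains("https://www.google.com", "google.com")):
    print(sub)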


Install

Install the application and download the signatures needed to find secrets

Using GO:

go get github.com/ThreatUnkown/jsubfinder
wget https://raw.githubusercontent.com/ThreatUnkown/jsubfinder/master/.jsf_signatures.yaml && mv .jsf_signatures.yaml ~/.jsf_signatures.yaml

or

Downloads Page

Basic Usage

Search

Search the given URLs for subdomains and secrets

$ jsubfinder search -h

Execute the command specified

Usage:
JSubFinder search [flags]

Flags:
-c, --crawl Enable crawling
-g, --greedy Check all files for URL's not just Javascript
-h, --help help for search
-f, --inputFile string File containing domains
-t, --threads int Amount of threads to be used (default 5)
-u, --url strings Url to check

Global Flags:
-d, --debug Enable debug mode. Logs are stored in log.info
-K, --nossl Skip SSL cert verification (default true)
-o, --outputFile string name/location to store the file
-s, --secrets Check results for secrets e.g api keys
--sig string Location of signatures for finding secrets
-S, --silent Disable printing to the console

Examples (results are the same in this case):

$ jsubfinder search -u www.google.com
$ jsubfinder search -f file.txt
$ echo www.google.com | jsubfinder search
$ echo www.google.com | httpx --silent | jsubfinder search

apis.google.com
ogs.google.com
store.google.com
mail.google.com
accounts.google.com
www.google.com
policies.google.com
support.google.com
adservice.google.com
play.google.com

With Secrets Enabled

Note: --secrets="" will save the secret results in a secrets.txt file

$ echo www.youtube.com | jsubfinder search --secrets=""
www.youtube.com
youtubei.youtube.com
payments.youtube.com
2Fwww.youtube.com
252Fwww.youtube.com
m.youtube.com
tv.youtube.com
music.youtube.com
creatoracademy.youtube.com
artists.youtube.com

Google Cloud API Key <redacted> found in content of https://www.youtube.com
Google Cloud API Key <redacted> found in content of https://www.youtube.com
Google Cloud API Key <redacted> found in content of https://www.youtube.com
Google Cloud API Key <redacted> found in content of https://www.youtube.com
Google Cloud API Key <redacted> found in content of https://www.youtube.com
Google Cloud API Key <redacted> found in content of https://www.youtube.com
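
The secret detection is signature-driven (the signatures live in ~/.jsf_signatures.yaml). A toy Python sketch of the idea, using only the well-known shape of Google API keys as a stand-in signature:

import re

SIGNATURES = {"Google Cloud API Key": re.compile(r"AIza[0-9A-Za-z_-]{35}")}

def scan(content, source):
    for name, pattern in SIGNATURES.items():
        for _match in pattern.findall(content):
            print(f"{name} <redacted> found in content of {source}")

secret = "AIzaSy" + "A" * 33  # dummy 39-char value matching the key shape
scan(f'key:"{secret}"', "https://www.youtube.com")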

Advanced examples

$ echo www.google.com | jsubfinder search -crawl -s "google_secrets.txt" -S -o jsf_google.txt -t 10 -g
  • -crawl use the default crawler to crawl pages for other URL's to analyze
  • -s enables JSubFinder to search for secrets
  • -S Silence output to console
  • -o <file> save output to specified file
  • -t 10 use 10 threads
  • -g search every URL for JS, even ones we don't think have any

Proxy

Enables an upstream HTTP proxy with TLS MITM support. This allows you to:

  1. Browse sites in real time and have JSubFinder search for subdomains and secrets as you browse.
  2. If needed, run JSubFinder on another server to offload the workload.
$ JSubFinder proxy -h

Execute the command specified

Usage:
JSubFinder proxy [flags]

Flags:
-h, --help help for proxy
-p, --port int Port for the proxy to listen on (default 8444)
--scope strings URLs in scope separated by commas. e.g www.google.com,www.netflix.com
-u, --upstream-proxy string Address of upstream proxy e.g http://127.0.0.1:8888 (default "http://127.0.0.1:8888")

Global Flags:
-d, --debug Enable debug mode. Logs are stored in log.info
-K, --nossl Skip SSL cert verification (default true)
-o, --outputFile string name/location to store the file
-s, --secrets Check results for secrets e.g api keys
--sig string Location of signatures for finding secrets
-S, --silent Disable printing to the console
$ jsubfinder proxy
Proxy started on :8444
Subdomain: out.reddit.com
Subdomain: www.reddit.com
Subdomain: 2Fwww.reddit.com
Subdomain: alb.reddit.com
Subdomain: about.reddit.com

With Burp Suite

  1. Configure Burp Suite to forward traffic to an upstream proxy (User Options > Connections > Upstream Proxy Servers > Add)
  2. Run JSubFinder in proxy mode

Burp Suite will now forward all traffic proxied through it to JSubFinder. JSubFinder will retrieve the response, return it to Burp and, in another thread, search it for subdomains and secrets.

With Proxify

  1. Launch Proxify & dump traffic to a folder proxify -output logs
  2. Configure Burp Suite, a Browser or other tool to forward traffic to Proxify (see instructions on their github page)
  3. Launch JSubFinder in proxy mode & set the upstream proxy as Proxify jsubfinder proxy -u http://127.0.0.1:8443
  4. Use Proxify's replay utility to replay the dumped traffic to jsubfinder replay -output logs -burp-addr http://127.0.0.1:8444

Run on another server

Simple: run JSubFinder in proxy mode on another server, e.g. 192.168.1.2. Follow the proxy steps above but set your application's upstream proxy to 192.168.1.2:8443.

Advanced Examples

$ jsubfinder proxy --scope www.reddit.com -p 8081 -S -o jsf_reddit.txt
  • --scope limits JSubFinder to only analyze responses from www.reddit.com
  • -p port JSubFinder's proxy server is running on
  • -S silence output to the console/stdout
  • -o <file> save output to this file


Peetch - An eBPF Playground


peetch is a collection of tools aimed at experimenting with different aspects of eBPF to bypass TLS protocol protections.

Currently, peetch includes two subcommands. The first, called dump, aims to sniff network traffic, associating information about the source process with each packet. The second, called tls, identifies processes using OpenSSL and extracts their cryptographic keys.

Combined, these two commands make it possible to decrypt TLS exchanges recorded in the PCAPng format.


Installation

peetch relies on several dependencies, including non-merged modifications of bcc and Scapy. A Docker image can be built to easily test peetch, using the following command:

docker build -t quarkslab/peetch .

Commands Walk Through

The following examples assume that you used the following command to enter the Docker image and launch examples within it:

docker run --privileged --network host --mount type=bind,source=/sys,target=/sys --mount type=bind,source=/proc,target=/proc --rm -it quarkslab/peetch

dump

This sub-command gives you the ability to sniff packets using an eBPF TC classifier and to retrieve the corresponding PID and process names with:

peetch dump
curl/1289291 - Ether / IP / TCP 10.211.55.10:53052 > 208.97.177.124:https S / Padding
curl/1289291 - Ether / IP / TCP 208.97.177.124:https > 10.211.55.10:53052 SA / Padding
curl/1289291 - Ether / IP / TCP 10.211.55.10:53052 > 208.97.177.124:https A / Padding
curl/1289291 - Ether / IP / TCP 10.211.55.10:53052 > 208.97.177.124:https PA / Raw / Padding
curl/1289291 - Ether / IP / TCP 208.97.177.124:https > 10.211.55.10:53052 A / Padding

Note that for demonstration purposes, dump will only capture IPv4 based TCP segments.

For convenience, the captured packets can be stored to a PCAPng file along with process information using --write:

peetch dump --write peetch.pcapng
^C

This PCAPng can easily be manipulated with Wireshark or Scapy:

scapy
>>> l = rdpcap("peetch.pcapng")
>>> l[0]
<Ether dst=00:1c:42:00:00:18 src=00:1c:42:54:f3:34 type=IPv4 |<IP version=4 ihl=5 tos=0x0 len=60 id=11088 flags=DF frag=0 ttl=64 proto=tcp chksum=0x4bb1 src=10.211.55.10 dst=208.97.177.124 |<TCP sport=53054 dport=https seq=631406526 ack=0 dataofs=10 reserved=0 flags=S window=64240 chksum=0xc3e9 urgptr=0 options=[('MSS', 1460), ('SAckOK', b''), ('Timestamp', (1272423534, 0)), ('NOP', None), ('WScale', 7)] |<Padding load='\x00\x00' |>>>>
>>> l[0].comment
b'curl/1289909'

tls

This sub-command aims at identifying processes that use OpenSSL and makes it possible to dump several things, like plaintext and secrets.

By default, peetch tls will only display one line per process; the --directions argument makes it possible to display the exchanged messages:

peetch tls --directions
<- curl (1291078) 208.97.177.124/443 TLS1.2 ECDHE-RSA-AES128-GCM-SHA256
-> curl (1291078) 208.97.177.124/443 TLS1.-1 ECDHE-RSA-AES128-GCM-SHA256

Displaying OpenSSL buffer content is achieved with --content.

peetch tls --content
<- curl (1290608) 208.97.177.124/443 TLS1.2 ECDHE-RSA-AES128-GCM-SHA256

0000 47 45 54 20 2F 20 48 54 54 50 2F 31 2E 31 0D 0A GET / HTTP/1.1..
0010 48 6F 73 74 3A 20 77 77 77 2E 70 65 72 64 75 2E Host: www.perdu.
0020 63 6F 6D 0D 0A 55 73 65 72 2D 41 67 65 6E 74 3A com..User-Agent:
0030 20 63 75 72 6C 2F 37 2E 36 38 2E 30 0D 0A 41 63 curl/7.68.0..Ac

-> curl (1290608) 208.97.177.124/443 TLS1.-1 ECDHE-RSA-AES128-GCM-SHA256

0000 48 54 54 50 2F 31 2E 31 20 32 30 30 20 4F 4B 0D HTTP/1.1 200 OK.
0010 0A 44 61 74 65 3A 20 54 68 75 2C 20 31 39 20 4D .Date: Thu, 19 M
0020 61 79 20 32 30 32 32 20 31 38 3A 31 36 3A 30 31 ay 2022 18:16:01
0030 20 47 4D 54 0D 0A 53 65 72 76 65 72 3A 20 41 70 GMT..Server: Ap

The --secrets argument will display TLS Master Secrets extracted from memory. The following example leverages --write to write master secrets to disk to simplify decrypting TLS messages with Scapy:

$ (sleep 5; curl https://www.perdu.com/?name=highly%20secret%20information --tls-max 1.2 --http1.1) &

# peetch tls --write &
curl (1293232) 208.97.177.124/443 TLS1.2 ECDHE-RSA-AES128-GCM-SHA256

# peetch dump --write traffic.pcapng
^C

# Add the master secret to a PCAPng file
$ editcap --inject-secrets tls,1293232-master_secret.log traffic.pcapng traffic-ms.pcapng

$ scapy
>>> load_layer("tls")
>>> conf.tls_session_enable = True
>>> l = rdpcap("traffic-ms.pcapng")
>>> l[13][TLS].msg
[<TLSApplicationData data='GET /?name=highly%20secret%20information HTTP/1.1\r\nHost: www.perdu.com\r\nUser-Agent: curl/7.68.0\r\nAccept: */*\r\n\r\n' |>]

Limitations

By design, peetch only supports OpenSSL and TLS 1.2.



GnuTLS patches memory mismanagement bug – update now!

GnuTLS may well be the most widespread cryptographic toolkit you've never heard of. Learn more...

PR-DNSd - Passive-Recursive DNS Daemon


Passive-Recursive DNS daemon.


Quickstart

go get github.com/korc/PR-DNSd
sudo setcap cap_net_bind_service,cap_sys_chroot=ep go/bin/PR-DNSd
go/bin/PR-DNSd -upstream 9.9.9.9:53 -listen 127.0.0.1:53
echo nameserver 127.0.0.1 | sudo tee /etc/resolv.conf
dig google.com
dig -x $(dig +short google.com)

If you can't use setcap, you have to use -chroot "" and -listen :<high_port> options, or run as root.
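
The reason dig -x works in the quickstart is the passive-recursive trick: the daemon remembers every forward answer it relays and serves reverse queries from that memory. A minimal Python sketch of the idea (not PR-DNSd's code; the IP below is just an example):

forward_cache = {}  # ip -> name, learned from forwarded answers

def on_forward_answer(name, ip):
    # called whenever an upstream A/AAAA answer passes through the daemon
    forward_cache[ip] = name

def resolve_ptr(ip):
    # reverse lookups are answered from observed traffic, so dig -x succeeds
    # even for hosts whose owners never published PTR records
    return forward_cache.get(ip)

on_forward_answer("google.com", "142.250.74.46")
print(resolve_ptr("142.250.74.46"))  # google.com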

Use cases

  • run as local host DNS service, to fix your netstat/tcpview/lsof etc. output
  • as enterprise-internal DNS server, to also be able to do meaningful EDR/IR and log analysis
  • as cloud service, to also collect Passive DNS data from non-enterprise (home, BYOD etc.) devices
    • hint: you probably want to configure DDoS protection options
  • in cloud as DNS-over-TLS server, to additionally provide private DNS for supporting devices (ex: Android 9's private DNS setting)
    • ex: domain pattern based firewall/proxy configuration for mobile devices

Running as your own private server for Android 9's Private DNS settings

After appropriate setcap, run:

PR-DNSd -tlslisten :853 -cert YOUR_SERVER_CRT_KEY_PEM -upstream 1.1.1.1:53 -store pr-dnsd

Options

-cert string
TCP-TLS listener certificate (required for tls listener)
-chroot string
chroot to directory after start (default "/var/tmp")
-count int
Count of replies allowed before debounce delay is applied (default 100)
-ctmout string
Client timeout for upstream queries
-debounce string
Required time duration between UDP replies to single IP to prevent DoS (default "200ms")
-key string
TCP-TLS certificate key (default same as -cert value)
-listen string
listen address (default ":53")
-silent
Don't report normal data
-store string
Store PTR data to specified file
-tlslisten string
TCP-TLS listener address (default ":853")
-upstream string
upstream DNS server (tcp-tls:// prefix for DoT) (default "1.1.1.1:53")
(with tls and chroot, ensure ca-certificates and resolv.conf in chroot are properly set up)


Bypass-Url-Parser - Tool That Tests Many URL Bypasses To Reach A 40X Protected Page


Tool that tests MANY url bypasses to reach a 40X protected page.

If you wonder why this code is nothing but a dirty curl wrapper, here's why:

  • Most of the Python request libraries do url/path/parameter encoding/decoding, and I hate this.
  • If I submit raw chars, I want raw chars to be sent.
  • If I send a weird path, I want it weird, not normalized.

This is surprisingly hard to achieve in Python without losing all of the lib goodies like parsing, ssl/tls encapsulation and so on.
So, be like me, use curl as a backend, it's gonna be just fine.
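
A bare-bones Python sketch of that approach (a hypothetical helper, not the tool's actual code): shell out to curl with --path-as-is so the path reaches the server byte-for-byte:

import subprocess

def raw_request(url, method="GET", headers=None):
    cmd = ["curl", "-sS", "-kgi", "--path-as-is", "-X", method]
    for key, value in (headers or {}).items():
        cmd += ["-H", f"{key}: {value}"]
    cmd.append(url)  # passed verbatim: //%2e%2e// stays //%2e%2e//
    return subprocess.run(cmd, capture_output=True, text=True).stdout

# a Python HTTP library would normalize this path before sending it
print(raw_request("http://127.0.0.1/juicy_403_endpoint/%2e%2e/"))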


Setup for bypass.py

# Deps
sudo apt install -y bat curl virtualenv python3
# Tool
virtualenv -p python3 .py3
source .py3/bin/activate
pip install -r requirements.txt
./bypass-url-parser.py --url "http://127.0.0.1/juicy_403_endpoint/"

Usage

Expected result
2022-05-10 15:54:03 work bup[738125] INFO === Config ===
2022-05-10 15:54:03 work bup[738125] INFO debug: False
2022-05-10 15:54:03 work bup[738125] INFO url: http://thinkloveshare.com/api/jolokia/list
2022-05-10 15:54:03 work bup[738125] INFO outdir: /tmp/tmp48drf_ie-bypass-url-parser
2022-05-10 15:54:03 work bup[738125] INFO threads: 20
2022-05-10 15:54:03 work bup[738125] INFO timeout: 2
2022-05-10 15:54:03 work bup[738125] INFO headers: {}
2022-05-10 15:54:03 work bup[738125] WARNING Stage: generate_curls
2022-05-10 15:54:03 work bup[738125] INFO base_url: http://thinkloveshare.com
2022-05-10 15:54:03 work bup[738125] INFO base_path: /api/jolokia/list
2022-05-10 15:54:03 work bup[738125] WARNING Stage: run_curls
2022-05-10 15:54:03 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' 'http://thinkloveshare.com/api/jolokia/list'
2022-05-10 15:54:03 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' -X 'CONNECT' 'http://thinkloveshare.com/api/jolokia/list'
2022-05-10 15:54:03 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' -X 'GET' 'http://thinkloveshare.com/api/jolokia/list'
2022-05-10 15:54:03 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' -X 'LOCK' 'http://thinkloveshare.com/api/jolokia/list'
2022-05-10 15:54:03 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' -X 'OPTIONS' 'http://thinkloveshare.com/api/jolokia/list'
2022-05-10 15:54:03 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' -X 'PATCH' 'http://thinkloveshare.com/api/jolokia/list'
2022-05-10 15:54:03 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' -X 'POST' 'http://thinkloveshare.com/api/jolokia/list'
2022-05-10 15:54:03 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' -X 'POUET' 'http://thinkloveshare.com/api/jolokia/list'
2022-05-10 15:54:03 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' -X 'PUT' 'http://thinkloveshare.com/api/jolokia/list'
2022-05-10 15:54:03 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' -X 'TRACE' 'http://thinkloveshare.com/api/jolokia/list'
2022-05-10 15:54:03 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' -X 'TRACK' 'http://thinkloveshare.com/api/jolokia/list'
2022-05-10 15:54:03 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' -X 'UPDATE' 'http://thinkloveshare.com/api/jolokia/list'
2022-05-10 15:54:03 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' -H 'Access-Control-Allow-Origin: 0.0.0.0' 'http://thinkloveshare.com/api/jolokia/list'
2022-05-10 15:54:03 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' -H 'Access-Control-Allow-Origin: 127.0.0.1' 'http://thinkloveshare.com/api/jolokia/list'
2022-05-10 15:54:03 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' -H 'Access-Control-Allow-Origin: localhost' 'http://thinkloveshare.com/api/jolokia/list'
2022-05-10 15:54:03 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' -H 'Access-Control-Allow-Origin: norealhost' 'http://thinkloveshare.com/api/jolokia/list'
[...]
2022-05-10 15:54:09 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' 'http://thinkloveshare.com//api/jolokia//%252f%252f//list'
2022-05-10 15:54:09 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' 'http://thinkloveshare.com//api/jolokia//%26//list'
2022-05-10 15:54:09 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' 'http://thinkloveshare.com//api/jolokia//%2e//list'
2022-05-10 15:54:09 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' 'http://thinkloveshare.com//api/jolokia//%2e%2e//list'
2022-05-10 15:54:09 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' 'http://thinkloveshare.com//api/jolokia//%2e%2e///list'
2022-05-10 15:54:09 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' 'http://thinkloveshare.com//api/jolokia//%2e%2e%2f//list'
2022-05-10 15:54:09 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' 'http://thinkloveshare.com//api/jolokia//%2f//list'
2022-05-10 15:54:09 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' 'http://thinkloveshare.com//api/jolokia//%2f///list'
2022-05-10 15:54:09 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' 'http://thinkloveshare.com//api/jolokia//%2f%20%23//list'
2022-05-10 15:54:09 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' 'http://thinkloveshare.com//api/jolokia//%2f%23//list'
2022-05-10 15:54:09 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' 'http://thinkloveshare.com//api/jolokia//%2f%2f//list'
2022-05-10 15:54:09 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' 'http://thinkloveshare.com//api/jolokia//%2f%3b%2f//list'
2022-05-10 15:54:09 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' 'http://thinkloveshare.com//api/jolokia//%2f%3b%2f%2f//list'
2022-05-10 15:54:09 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' 'http://thinkloveshare.com//api/jolokia//%2f%3f//list'
2022-05-10 15:54:09 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' 'http://thinkloveshare.com//api/jolokia//%2f%3f///list'
2022-05-10 15:54:09 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' 'http://thinkloveshare.com//api/jolokia//%3b//list'
2022-05-10 15:54:09 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' 'http://thinkloveshare.com//api/jolokia//%3b/..//list'
2022-05-10 15:54:09 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' 'http://thinkloveshare.com//api/jolokia//%3b//%2f..///list'
2022-05-10 15:54:09 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' 'http://thinkloveshare.com//api/jolokia//%3b/%2e.//list'
2022-05-10 15:54:09 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' 'http://thinkloveshare.com//api/jolokia//%3b/%2e%2e/..%2f%2f//list'
2022-05-10 15:54:09 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' 'http://thinkloveshare.com//api/jolokia//%3b/%2f%2f..///list'
2022-05-10 15:54:09 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' 'http://thinkloveshare.com//api/jolokia//%3b%09//list'
2022-05-10 15:54:09 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' 'http://thinkloveshare.com//api/jolokia//%3b%2f..//list'
2022-05-10 15:54:09 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' 'http://thinkloveshare.com//api/jolokia//%3b%2f%2e.//list'
2022-05-10 15:54:09 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' 'http://thinkloveshare.com//api/jolokia//%3b%2f%2e%2e//list'
2022-05-10 15:54:09 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' 'http://thinkloveshare.com//api/jolokia//%3b%2f%2e%2e%2f%2e%2e%2f%2f//list'
2022-05-10 15:54:09 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' 'http://thinkloveshare.com//api/jolokia//%3f//list'
2022-05-10 15:54:09 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' 'http://thinkloveshare.com//api/jolokia//%3f%23//list'
2022-05-10 15:54:09 work bup[738125] INFO Current: curl -sS -kgi --path-as-is -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.41 Safari/537.36' -w '\nStatus: %{http_code}, Length: %{size_download}' 'http://thinkloveshare.com//api/jolokia//%3f%3f//list'
2022-05-10 15:54:09 work bup[738125] WARNING Stage: save_and_quit
2022-05-10 15:54:10 work bup[738125] INFO Saving html pages and short output in: /tmp/tmp48drf_ie-bypass-url-parser
2022-05-10 15:54:10 work bup[738125] INFO Triaged results shows the following distinct pages:
9: 41 - 850a2bd214c68f582aaac1c84c702b5d.html
10: 97 - 219145da181c48fea603aab3097d8201.html
10: 99 - 309b8397d07f618ec07541c418979a84.html
10: 100 - 9a1304f66bfee2130b34258635d50171.html
10: 108 - b61052875693afa4b86d39321d4170b4.html
10: 109 - 6fb5c59f5c29d23e407d6f041523a2bb.html
11: 101 - 045d36e3cfba7f6cbb7e657fc6cf1125.html
12:43116 - 9787a734c56b37f7bf5d78aaee43c55d.html
16: 41 - c5663aedf1036c950a5d83bd83c8e4e7.html
21: 156 - 7857d3d4a9bc8bf69278bf43c4918909.html
22: 107 - 011ca570bdf2e5babcf4f99c4cd84126.html
22: 109 - 6d4b61258386f744a388d402a5f11d03.html
22: 110 - 2f26cd3ba49e023dbda4453e5fd89431.html
76: 821 - bfe5f92861f949e44b355ee22574194a.html
2022-05-10 15:54:10 work bup[738125] INFO Also, inspect them manually with batcat:
echo /tmp/tmp48drf_ie-bypass-url-parser/{850a2bd214c68f582aaac1c84c702b5d.html,219145da181c48fea603aab3097d8201.html,309b8397d07f618ec07541c418979a84.html,9a1304f66bfee2130b34258635d50171.html,b61052875693afa4b86d39321d4170b4.html,6fb5c59f5c29d23e407d6f041523a2bb.html,045d36e3cfba7f6cbb7e657fc6cf1125.html,9787a734c56b37f7bf5d78aaee43c55d.html,c5663aedf1036c950a5d83bd83c8e4e7.html,7857d3d4a9bc8bf69278bf43c4918909.html,011ca570bdf2e5babcf4f99c4cd84126.html,6d4b61258386f744a388d402a5f11d03.html,2f26cd3ba49e023dbda4453e5fd89431.html,bfe5f92861f949e44b355ee22574194a.html} | xargs bat
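
The triaged list above appears to group identical response pages, reporting a count and size per distinct body. A rough Python sketch of that idea, with made-up sample data:

import hashlib
from collections import Counter

bodies = [  # response bodies collected from the curl runs (made-up sample)
    "<html>Forbidden</html>",
    "<html>Forbidden</html>",
    "<html>jolokia list ...</html>",
]

digests = [hashlib.md5(body.encode()).hexdigest() for body in bodies]
sizes = dict(zip(digests, (len(body) for body in bodies)))

for digest, count in Counter(digests).most_common():
    # mirrors the "count: length - hash.html" lines above
    print(f"{count}: {sizes[digest]} - {digest}.html")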

