SQLMC (SQL Injection Massive Checker) is a tool designed to scan a domain for SQL injection vulnerabilities. It crawls the given URL up to a specified depth, checks each link for SQL injection vulnerabilities, and reports its findings.
pip3 install sqlmc
Run sqlmc with the following command-line arguments:
-u, --url: The URL to scan (required)
-d, --depth: The depth to scan (required)
-o, --output: The output file to save the results
Example usage:
sqlmc -u http://example.com -d 2
Replace http://example.com with the URL you want to scan and 2 with the desired depth of the scan. You can also specify an output file using the -o or --output flag followed by the desired filename.
The tool will then perform the scan and display the results.
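Conceptually, the scan is a breadth-first crawl bounded by the depth argument, with a simple injection probe per discovered URL. The sketch below illustrates the idea only; it is not SQLMC's actual implementation, and the error signatures are a small illustrative sample.

import requests
from urllib.parse import urljoin
from html.parser import HTMLParser

# Illustrative sample of database error strings an error-based check might look for.
SQL_ERRORS = ("you have an error in your sql syntax", "unclosed quotation mark")

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links += [v for k, v in attrs if k == "href" and v]

def looks_injectable(url):
    # Append a single quote and look for database error strings in the response.
    try:
        body = requests.get(url + "'", timeout=10).text.lower()
    except requests.RequestException:
        return False
    return any(err in body for err in SQL_ERRORS)

def crawl(start, depth):
    seen, frontier = {start}, [start]
    for _ in range(depth):
        next_frontier = []
        for url in frontier:
            if looks_injectable(url):
                print("possible SQLi:", url)
            try:
                parser = LinkExtractor()
                parser.feed(requests.get(url, timeout=10).text)
            except requests.RequestException:
                continue
            for href in parser.links:
                link = urljoin(url, href)
                if link not in seen:
                    seen.add(link)
                    next_frontier.append(link)
        frontier = next_frontier

crawl("http://example.com", 2)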
This project is licensed under the GNU Affero General Public License v3.0.
mapXplore is a modular application that imports data extracted by sqlmap into a PostgreSQL or SQLite database.
Its main features are:
Automatic export of information stored in base64
Filtering of tables and columns by criteria
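The base64 export feature can be pictured as a round-trip heuristic: a value is only decoded and exported if it parses as valid base64 and yields readable text. A minimal sketch of that idea (illustrative, not mapXplore's actual code):

import base64, binascii

def try_decode_base64(value: str):
    # Return decoded text if the value is plausibly base64-encoded UTF-8, else None.
    try:
        return base64.b64decode(value, validate=True).decode("utf-8")
    except (binascii.Error, ValueError, UnicodeDecodeError):
        return None

print(try_decode_base64("aGVsbG8gd29ybGQ="))  # hello world
print(try_decode_base64("plain text"))        # None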
git clone https://github.com/daniel2005d/mapXplore
cd mapXplore
pip install -r requirements.txt
It is a modular application consisting of the following components. A default configuration can be loaded when the program starts:
python engine.py [--config config.json]
SqliSniper is a robust Python tool designed to detect time-based blind SQL injection in HTTP request headers. It speeds up the security assessment process by rapidly scanning with multiple threads, and unlike many scanners it is designed to eliminate false positives and can send alerts upon detection through built-in Discord notification functionality.
git clone https://github.com/danialhalo/SqliSniper.git
cd SqliSniper
chmod +x sqlisniper.py
pip3 install -r requirements.txt
Running the script with -h will display help for the tool; here are all the options it supports:
ubuntu:~/sqlisniper$ ./sqlisniper.py -h
[ASCII-art banner: SqliSniper]
-: By Muhammad Danial :-
usage: sqlisniper.py [-h] [-u URL] [-r URLS_FILE] [-p] [--proxy PROXY] [--payload PAYLOAD] [--single-payload SINGLE_PAYLOAD] [--discord DISCORD] [--headers HEADERS] [--threads THREADS]
Detect SQL injection by sending malicious queries
options:
-h, --help show this help message and exit
-u URL, --url URL Single URL for the target
-r URLS_FILE, --urls_file URLS_FILE
File containing a list of URLs
-p, --pipeline Read from pipeline
--proxy PROXY Proxy for intercepting requests (e.g., http://127.0.0.1:8080)
--payload PAYLOAD File containing malicious payloads (default is payloads.txt)
--single-payload SINGLE_PAYLOAD
Single payload for testing
--discord DISCORD Discord Webhook URL
--headers HEADERS File containing headers (default is headers.txt)
--threads THREADS Number of threads
For a single-site scan, the URL can be provided with the -u flag:
./sqlisniper.py -u http://example.com
The -r flag allows SqliSniper to read a file containing multiple URLs for simultaneous scanning:
./sqlisniper.py -r url.txt
SqliSniper can also work with pipeline input via the -p flag:
cat url.txt | ./sqlisniper.py -p
The pipeline feature facilitates seamless integration with other tools. For instance, you can utilize tools like subfinder and httpx, and then pipe their output to SqliSniper for mass scanning.
subfinder -silent -d google.com | sort -u | httpx -silent | ./sqlisniper.py -p
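Under the hood, pipeline mode amounts to reading newline-separated URLs from standard input; a minimal sketch of that pattern (assumed behavior, not SqliSniper's exact code):

import sys

# Read one URL per line from stdin, skipping blanks.
urls = [line.strip() for line in sys.stdin if line.strip()]
for url in urls:
    print("queued for scanning:", url)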
By default, SqliSniper uses the payloads.txt file. However, the --payload flag can be used to provide a custom payloads file:
./sqlisniper.py -u http://example.com --payload mssql_payloads.txt
When using a custom payloads file, make sure to substitute the sleep time with %__TIME_OUT__%. SqliSniper adjusts the sleep time iteratively to mitigate potential false positives; a sketch of this timing check follows the example payloads below. The payloads file should look like this:
ubuntu:~/sqlisniper$ cat payloads.txt
0\"XOR(if(now()=sysdate(),sleep(%__TIME_OUT__%),0))XOR\"Z
"0"XOR(if(now()=sysdate()%2Csleep(%__TIME_OUT__%)%2C0))XOR"Z"
0'XOR(if(now()=sysdate(),sleep(%__TIME_OUT__%),0))XOR'Z
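The timing check sketched below shows why the adjustable sleep value matters: a target is only flagged when the response is delayed by at least the injected sleep time, and requiring two different delays to both hold filters out endpoints that are merely slow. This is a simplified illustration with assumed header names and delay values, not SqliSniper's exact logic.

import time
import requests

def delays_as_expected(url, header, payload_template, delay):
    # Inject the payload with a concrete sleep value and measure the response time.
    payload = payload_template.replace("%__TIME_OUT__%", str(delay))
    start = time.monotonic()
    try:
        requests.get(url, headers={header: payload}, timeout=delay + 10)
    except requests.RequestException:
        return False
    return time.monotonic() - start >= delay

def is_vulnerable(url, header, payload_template):
    # Two different delays must both hold, reducing false positives.
    return all(delays_as_expected(url, header, payload_template, d) for d in (5, 10))

template = "0'XOR(if(now()=sysdate(),sleep(%__TIME_OUT__%),0))XOR'Z"
print(is_vulnerable("http://example.com", "User-Agent", template))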
If you want to test with only a single payload, the --single-payload flag can be used. Make sure to replace the sleep time with %__TIME_OUT__%:
./sqlisniper.py -r url.txt --single-payload "0'XOR(if(now()=sysdate(),sleep(%__TIME_OUT__%),0))XOR'Z"
Headers are read from the headers.txt file. To scan a custom header, add the custom HTTP request header to the headers.txt file:
ubuntu:~/sqlisniper$ cat headers.txt
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64)
X-Forwarded-For: 127.0.0.1
SqliSniper also offers Discord alert notifications, enhancing its functionality by providing real-time alerts through Discord webhooks. This feature proves invaluable during large-scale scans, allowing prompt notifications upon detection.
./sqlisniper.py -r url.txt --discord <web_hookurl>
Threads can be defined with the --threads flag:
./sqlisniper.py -r url.txt --threads 10
Note: Keep in mind that a higher number of threads might lead to false positives or missed valid issues. Due to the nature of time-based SQL injection, a lower thread count is recommended for more accurate detection.
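Thread count maps onto a standard worker-pool pattern; a minimal sketch of the assumed structure follows. The trade-off from the note above is visible here: more workers mean more concurrent requests competing for bandwidth, which skews the response-time measurements that time-based detection depends on.

from concurrent.futures import ThreadPoolExecutor

def scan(url):
    # Placeholder for the per-URL timing checks sketched earlier.
    print("scanning", url)

urls = ["http://example.com", "http://example.org"]
# max_workers corresponds to the --threads setting.
with ThreadPoolExecutor(max_workers=10) as pool:
    list(pool.map(scan, urls))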
SqliSniper is made in Python with lots of <3 by @Muhammad Danial.
Logsensor is a powerful sensor tool to discover login panels and perform POST-form SQLi scanning.
Features
Multi-threaded scanning, so the script is fast at scanning many URLs
A quick tutorial & screenshots are shown at the bottom
Project contribution tips are at the bottom
Installation
git clone https://github.com/Mr-Robert0/Logsensor.git
cd Logsensor && sudo chmod +x logsensor.py install.sh
pip install -r requirements.txt
./install.sh
Dependencies
1. Multiple hosts scanning to detect login panels
python3 logsensor.py -f <subdomains-list>
python3 logsensor.py -f <subdomains-list> -t 50
python3 logsensor.py -f <subdomains-list> --login
2. Targeted SQLi form scanning
python logsensor.py -u www.example.com/login --sqli
python logsensor.py -u www.example.com/login -s --proxy http://127.0.0.1:8080
python logsensor.py -u www.example.com/login -s --inputname email
View help
python logsensor.py --help
usage: logsensor.py [-h --help] [--file ] [--url ] [--proxy] [--login] [--sqli] [--threads]
optional arguments:
-u , --url Target URL (e.g. http://example.com/ )
-f , --file Select a target hosts list file (e.g. list.txt )
--proxy Proxy (e.g. http://127.0.0.1:8080)
-l, --login run only Login panel Detector Module
-s, --sqli run only POST Form SQLi Scanning Module with provided Login panels Urls
-n , --inputname Customize actual username input for SQLi scan (e.g. 'username' or 'email')
-t , --threads Number of threads (default 30)
-h, --help Show this help message and exit
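Login-panel detection of this kind typically boils down to probing common paths and checking each response for a password field. A rough sketch of that idea (the path list and marker are illustrative assumptions, not Logsensor's actual detection logic):

import requests

COMMON_PATHS = ("/login", "/admin", "/signin", "/wp-login.php")

def find_login_panels(host):
    found = []
    for path in COMMON_PATHS:
        url = f"http://{host}{path}"
        try:
            resp = requests.get(url, timeout=5, allow_redirects=True)
        except requests.RequestException:
            continue
        # A password input field is a strong hint that this page is a login panel.
        if resp.status_code == 200 and 'type="password"' in resp.text.lower():
            found.append(url)
    return found

print(find_login_panels("www.example.com"))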
TODO
KnowsMore officially supports Python 3.8+.
knowsmore --stats
This command will produce several statistics about the passwords, like the output below:
KnowsMore v0.1.4 by Helvio Junior
Active Directory, BloodHound, NTDS hashes and Password Cracks correlation tool
https://github.com/helviojunior/knowsmore
[+] Startup parameters
command line: knowsmore --stats
module: stats
database file: knowsmore.db
[+] start time 2023-01-11 03:59:20
[?] General Statistics
+-------+----------------+-------+
| top | description | qty |
|-------+----------------+-------|
| 1 | Total Users | 95369 |
| 2 | Unique Hashes | 74299 |
| 3 | Cracked Hashes | 23177 |
| 4 | Cracked Users | 35078 |
+-------+----------------+-------+
[?] General Top 10 passwords
+-------+-------------+-------+
| top | password | qty |
|-------+-------------+-------|
| 1 | password | 1111 |
| 2 | 123456 | 824 |
| 3 | 123456789 | 815 |
| 4 | guest | 553 |
| 5 | qwerty | 329 |
| 6 | 12345678 | 277 |
| 7 | 111111 | 268 |
| 8 | 12345 | 202 |
| 9 | secret | 170 |
| 10 | sec4us | 165 |
+-------+-------------+-------+
[?] Top 10 weak passwords by company name similarity
+-------+--------------+---------+----------------------+-------+
| top | password | score | company_similarity | qty |
|-------+--------------+---------+----------------------+-------|
| 1 | company123 | 7024 | 80 | 1111 |
| 2 | Company123 | 5209 | 80 | 824 |
| 3 | company | 3674 | 100 | 553 |
| 4 | Company@10 | 2080 | 80 | 329 |
| 5 | company10 | 1722 | 86 | 268 |
| 6 | Company@2022 | 1242 | 71 | 202 |
| 7 | Company@2024 | 1015 | 71 | 165 |
| 8 | Company2022 | 978 | 75 | 157 |
| 9 | Company10 | 745 | 86 | 116 |
| 10 | Company21 | 707 | 86 | 110 |
+-------+--------------+---------+----------------------+-------+
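The company_similarity column can be understood as fuzzy string matching between each cracked password and the company name. The snippet below illustrates the concept with difflib; it is not KnowsMore's exact scoring, but it produces figures in the same ballpark as the table above.

from difflib import SequenceMatcher

def company_similarity(password: str, company: str) -> int:
    # Ratio of matching characters between password and company name, as a percentage.
    ratio = SequenceMatcher(None, password.lower(), company.lower()).ratio()
    return round(ratio * 100)

print(company_similarity("company123", "company"))  # 82 -- near the table's 80
print(company_similarity("company", "company"))     # 100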
pip3 install --upgrade knowsmore
Note: If you face a problem with dependency versions, check the Virtual ENV file.
There is no obligatory order for importing data, but to get better data correlation we suggest the following execution flow:
All data are stored in a SQLite database, created with:
knowsmore --create-db
KnowsMore can import full BloodHound files, correlate the data, and sync it to the Neo4j BloodHound database. This means you can use KnowsMore alone to import JSON files directly into the Neo4j database instead of using the extremely slow BloodHound user interface.
# Bloodhound ZIP File
knowsmore --bloodhound --import-data ~/Desktop/client.zip
# Bloodhound JSON File
knowsmore --bloodhound --import-data ~/Desktop/20220912105336_users.json
Note: KnowsMore can import both BloodHound ZIP files and JSON files, but we recommend using the ZIP file, because KnowsMore will automatically order the files for better data correlation.
# Sync to Neo4j BloodHound database
knowsmore --bloodhound --sync 10.10.10.10:7687 -d neo4j -u neo4j -p 12345678
Note: KnowsMore's BloodHound importer was inspired by Fox-IT's BloodHound Import implementation. Several changes were made to save all data in the KnowsMore SQLite database and then perform an incremental sync to the Neo4j database. This strategy brings several benefits, such as being at least 10x faster than the original BloodHound user interface.
Import hashes and clear-text passwords directly from NTDS.dit and the SYSTEM registry hive:
knowsmore --secrets-dump -target LOCAL -ntds ~/Desktop/ntds.dit -system ~/Desktop/SYSTEM
Note: First use secretsdump to extract the NTDS hashes with the command below:
secretsdump.py -ntds ntds.dit -system system.reg -hashes lmhash:ntlmhash LOCAL -outputfile ~/Desktop/client_name
After that, import the hashes:
knowsmore --ntlm-hash --import-ntds ~/Desktop/client_name.ntds
knowsmore --word-list -o "~/Desktop/Wordlist/my_custom_wordlist.txt" --batch --name company_name
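A company-seeded wordlist is essentially the name expanded through common mutations (case changes, years, suffixes). A tiny illustration of that idea follows; the mutation rules are assumptions, not KnowsMore's actual generator.

import itertools

def company_wordlist(name: str):
    # Combine casings of the company name with common suffixes.
    bases = {name.lower(), name.capitalize(), name.upper()}
    suffixes = ("", "123", "2023", "@2023", "!")
    return sorted(b + s for b, s in itertools.product(bases, suffixes))

print(company_wordlist("company"))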
First, extract all hashes to a text file:
# Extract NTLM hashes to file
knowsmore --ntlm-hash --export-hashes "~/Desktop/ntlm_hash.txt"
# Or, extract NTLM hashes from NTDS file
cat ~/Desktop/client_name.ntds | cut -d ':' -f4 > ntlm_hashes.txt
To crack the hashes, I usually use hashcat with the commands below:
# Wordlist attack
hashcat -m 1000 -a 0 -O -o "~/Desktop/cracked.txt" --remove "~/Desktop/ntlm_hash.txt" "~/Desktop/Wordlist/*"
# Mask attack
hashcat -m 1000 -a 3 -O --increment --increment-min 4 -o "~/Desktop/cracked.txt" --remove "~/Desktop/ntlm_hash.txt" ?a?a?a?a?a?a?a?a
knowsmore --ntlm-hash --company clientCompanyName --import-cracked ~/Desktop/cracked.txt
Note: Change clientCompanyName to the name of your company.
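Correlating cracked passwords back to users works because the NT hash is deterministic: MD4 over the UTF-16LE encoding of the password. A quick sketch of that computation (note that MD4 is disabled in some newer OpenSSL builds, so hashlib may reject it there):

import hashlib

def nt_hash(password: str) -> str:
    # NT hash = MD4 digest of the password encoded as UTF-16LE.
    # MD4 availability depends on the local OpenSSL build.
    return hashlib.new("md4", password.encode("utf-16le")).hexdigest()

print(nt_hash("Sec4US@2023"))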
As passwords and their hashes are extremely sensitive data, there is a module to remove the clear-text passwords and their respective hashes.
Note: This command will keep all generated statistics and imported user data.
knowsmore --wipe
During an assessment you may find user passwords in several ways; you can add these to the KnowsMore database:
knowsmore --user-pass --username administrator --password Sec4US@2023
# or adding the company name
knowsmore --user-pass --username administrator --password Sec4US@2023 --company sec4us
Integrate all cracked credentials into the Neo4j BloodHound database:
knowsmore --bloodhound --mark-owned 10.10.10.10 -d neo4j -u neo4j -p 123456
For remote connections, make sure the Neo4j database server accepts remote connections: change the line below in the config file /etc/neo4j/neo4j.conf and restart the service.
server.bolt.listen_address=0.0.0.0:7687
HBSQLI is an automated command-line tool for performing header-based blind SQL injection attacks on web applications. It automates the process of detecting header-based blind SQL injection vulnerabilities, making it easier for security researchers, penetration testers & bug bounty hunters to test the security of web applications.
This tool is intended for authorized penetration testing and security assessment purposes only. Any unauthorized or malicious use of this tool is strictly prohibited and may result in legal action.
The authors and contributors of this tool do not take any responsibility for any damage, legal issues, or other consequences caused by the misuse of this tool. The use of this tool is solely at the user's own risk.
Users are responsible for complying with all applicable laws and regulations regarding the use of this tool, including but not limited to, obtaining all necessary permissions and consents before conducting any testing or assessment.
By using this tool, users acknowledge and accept these terms and conditions and agree to use this tool in accordance with all applicable laws and regulations.
Install HBSQLI with the following steps:
$ git clone https://github.com/SAPT01/HBSQLI.git
$ cd HBSQLI
$ pip3 install -r requirements.txt
usage: hbsqli.py [-h] [-l LIST] [-u URL] -p PAYLOADS -H HEADERS [-v]
options:
-h, --help show this help message and exit
-l LIST, --list LIST To provide list of urls as an input
-u URL, --url URL To provide single url as an input
-p PAYLOADS, --payloads PAYLOADS
To provide payload file having Blind SQL Payloads with delay of 30 sec
-H HEADERS, --headers HEADERS
To provide header file having HTTP Headers which are to be injected
-v, --verbose Run on verbose mode
$ python3 hbsqli.py -u "https://target.com" -p payloads.txt -H headers.txt -v
$ python3 hbsqli.py -l urls.txt -p payloads.txt -H headers.txt -v
There are basically two modes: verbose, which shows the whole process and the status of each test, and non-verbose, which only prints the vulnerable findings on screen. To enable verbose mode, just add -v to your command.
You can use the provided payload file or a custom one; just remember that the delay in each payload in the file should be set to 30 seconds.
You can use the provided headers file, or add more custom headers to that file according to your needs.
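Mechanically, a header-based blind SQLi check injects each payload into each candidate header and flags the target when the response is held for roughly the payload's sleep time. A simplified sketch under the 30-second convention above (assumed behavior, not HBSQLI's exact logic):

import time
import requests

def probe(url, header_name, payload, delay=30):
    # A vulnerable target delays the response by about the payload's sleep time.
    start = time.monotonic()
    try:
        requests.get(url, headers={header_name: payload}, timeout=delay + 15)
    except requests.RequestException:
        return False
    return time.monotonic() - start >= delay

with open("headers.txt") as f:
    header_names = [line.split(":")[0].strip() for line in f if line.strip()]
with open("payloads.txt") as f:
    payloads = [line.strip() for line in f if line.strip()]

for name in header_names:
    for payload in payloads:
        if probe("https://target.com", name, payload):
            print(f"[VULN] header {name} with payload {payload}")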
ICMP Packet Sniffer is a Python program that allows you to capture and analyze ICMP (Internet Control Message Protocol) packets on a network interface. It provides detailed information about the captured packets, including source and destination IP addresses, MAC addresses, ICMP type, payload data, and more. The program can also store the captured packets in a SQLite database and save them in a pcap format.
git clone https://github.com/HalilDeniz/ICMPWatch.git
pip install -r requirements.txt
python ICMPWatch.py [-h] [-v] [-t TIMEOUT] [-f FILTER] [-o OUTPUT] [--type {0,8}] [--src-ip SRC_IP] [--dst-ip DST_IP] -i INTERFACE [-db] [-c CAPTURE]
-v or --verbose: Show verbose packet details.
-t or --timeout: Sniffing timeout in seconds (default is 300 seconds).
-f or --filter: BPF filter for packet sniffing (default is "icmp").
-o or --output: Output file to save captured packets.
--type: ICMP packet type to filter (0: Echo Reply, 8: Echo Request).
--src-ip: Source IP address to filter.
--dst-ip: Destination IP address to filter.
-i or --interface: Network interface to capture packets (required).
-db or --database: Store captured packets in an SQLite database.
-c or --capture: Capture file to save packets in pcap format.
Press Ctrl+C to stop the sniffing process.
python icmpwatch.py -i eth0
python icmpwatch.py -i eth0 -o icmp_results.txt
python icmpwatch.py -i eth0 --src-ip 192.168.1.10 --dst-ip 192.168.1.20
python icmpwatch.py -i eth0 --type 8
python icmpwatch.py -i eth0 -c captured_packets.pcap
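The capture loop behind these examples maps naturally onto Scapy's sniff() with a BPF filter; a minimal sketch of the core idea (illustrative, covering only a slice of the program's options):

from scapy.all import ICMP, IP, sniff, wrpcap

captured = []

def handle(pkt):
    # Print a one-line summary for each ICMP packet and keep it for the pcap.
    if pkt.haslayer(ICMP) and pkt.haslayer(IP):
        captured.append(pkt)
        print(f"{pkt[IP].src} -> {pkt[IP].dst} type={pkt[ICMP].type}")

# "icmp" is the BPF filter; timeout ends the capture after 300 seconds.
sniff(iface="eth0", filter="icmp", prn=handle, timeout=300)
wrpcap("captured_packets.pcap", captured)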