FreshRSS


KnowsMore - A Swiss Army Knife Tool For Pentesting Microsoft Active Directory (NTLM Hashes, BloodHound, NTDS And DCSync)

By: Zion3R


KnowsMore officially supports Python 3.8+.

Main features

  • Import NTLM Hashes from .ntds output txt file (generated by CrackMapExec or secretsdump.py)
  • Import NTLM Hashes from NTDS.dit and SYSTEM
  • Import Cracked NTLM hashes from hashcat output file
  • Import BloodHound ZIP or JSON file
  • BloodHound importer (import JSON to Neo4J without BloodHound UI)
  • Analyse password quality (length, lower case, upper case, digits, special and Latin characters); a minimal sketch follows this list
  • Analyse the similarity of each password to the company name and user name
  • Search for users, passwords and hashes
  • Export all cracked credentials directly to the BloodHound Neo4j database as 'owned' objects
  • Other amazing features...
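
A minimal sketch of what these two analyses involve, using only the Python standard library; this is an illustration of the idea, not KnowsMore's actual scoring code, and the weights will not match the scores shown in the statistics below.

from difflib import SequenceMatcher
import string

def quality(password: str) -> int:
    # Rough quality score: length plus one point per character class in use.
    classes = [
        any(c.islower() for c in password),
        any(c.isupper() for c in password),
        any(c.isdigit() for c in password),
        any(c in string.punctuation for c in password),
    ]
    return len(password) + sum(classes)

def company_similarity(password: str, company: str) -> int:
    # Similarity (0-100) between a cracked password and the company name.
    ratio = SequenceMatcher(None, password.lower(), company.lower()).ratio()
    return round(ratio * 100)

print(quality("Company@2024"), company_similarity("Company@2024", "company"))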

Getting stats

knowsmore --stats

This command produces several statistics about the passwords, like the output below:

KnowsMore v0.1.4 by Helvio Junior
Active Directory, BloodHound, NTDS hashes and Password Cracks correlation tool
https://github.com/helviojunior/knowsmore

[+] Startup parameters
command line: knowsmore --stats
module: stats
database file: knowsmore.db

[+] start time 2023-01-11 03:59:20
[?] General Statistics
+-------+----------------+-------+
| top | description | qty |
|-------+----------------+-------|
| 1 | Total Users | 95369 |
| 2 | Unique Hashes | 74299 |
| 3 | Cracked Hashes | 23177 |
| 4 | Cracked Users | 35078 |
+-------+----------------+-------+

[?] General Top 10 passwords
+-------+-------------+-------+
| top | password | qty |
|-------+-------------+-------|
| 1 | password | 1111 |
| 2 | 123456 | 824 |
| 3 | 123456789 | 815 |
| 4 | guest | 553 |
| 5 | qwerty | 329 |
| 6 | 12345678 | 277 |
| 7 | 111111 | 268 |
| 8 | 12345 | 202 |
| 9 | secret | 170 |
| 10 | sec4us | 165 |
+-------+-------------+-------+

[?] Top 10 weak passwords by company name similarity
+-------+--------------+---------+----------------------+-------+
| top | password | score | company_similarity | qty |
|-------+--------------+---------+----------------------+-------|
| 1 | company123 | 7024 | 80 | 1111 |
| 2 | Company123 | 5209 | 80 | 824 |
| 3 | company | 3674 | 100 | 553 |
| 4 | Company@10 | 2080 | 80 | 329 |
| 5 | company10 | 1722 | 86 | 268 |
| 6 | Company@2022 | 1242 | 71 | 202 |
| 7 | Company@2024 | 1015 | 71 | 165 |
| 8 | Company2022 | 978 | 75 | 157 |
| 9 | Company10 | 745 | 86 | 116 |
| 10 | Company21 | 707 | 86 | 110 |
+-------+--------------+---------+----------------------+-------+

Installation

Simple

pip3 install --upgrade knowsmore

Note: If you face problems with dependency versions, check the Virtual ENV file.

Execution Flow

There is no mandatory order for importing data, but to get better correlation we suggest the following execution flow:

  1. Create database file
  2. Import BloodHound files
    1. Domains
    2. GPOs
    3. OUs
    4. Groups
    5. Computers
    6. Users
  3. Import NTDS file
  4. Import cracked hashes
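
Putting the suggested flow together, a typical session chains the commands detailed in the sections below (the file paths are placeholders for your own data):

knowsmore --create-db
knowsmore --bloodhound --import-data ~/Desktop/client.zip
knowsmore --secrets-dump -target LOCAL -ntds ~/Desktop/ntds.dit -system ~/Desktop/SYSTEM
knowsmore --ntlm-hash --company clientCompanyName --import-cracked ~/Desktop/cracked.txt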

Create database file

All data is stored in a SQLite database:

knowsmore --create-db

Importing BloodHound files

We can import full BloodHound files into KnowsMore, correlate the data, and sync it to the Neo4j BloodHound database. This way you can use KnowsMore alone to import JSON files directly into the Neo4j database instead of using the extremely slow BloodHound user interface.

# Bloodhound ZIP File
knowsmore --bloodhound --import-data ~/Desktop/client.zip

# Bloodhound JSON File
knowsmore --bloodhound --import-data ~/Desktop/20220912105336_users.json

Note: KnowsMore can import both BloodHound ZIP and JSON files, but we recommend the ZIP file, because KnowsMore will automatically order the files for better data correlation.

Sync data to Neo4j BloodHound database

# Bloodhound ZIP File
knowsmore --bloodhound --sync 10.10.10.10:7687 -d neo4j -u neo4j -p 12345678

Note: The KnowsMore BloodHound importer was inspired by the Fox-IT BloodHound Import implementation. We made several changes to save all data in the KnowsMore SQLite database and then perform an incremental sync to the Neo4j database. This strategy brings several benefits, such as being at least 10x faster than the original BloodHound user interface.

Importing NTDS file

Option 1

Note: Import hashes and clear-text passwords directly from NTDS.dit and the SYSTEM registry hive

knowsmore --secrets-dump -target LOCAL -ntds ~/Desktop/ntds.dit -system ~/Desktop/SYSTEM

Option 2

Note: First use secretsdump to extract the NTDS hashes with the command below

secretsdump.py -ntds ntds.dit -system system.reg -hashes lmhash:ntlmhash LOCAL -outputfile ~/Desktop/client_name

After that, import the generated output file:

knowsmore --ntlm-hash --import-ntds ~/Desktop/client_name.ntds

Generating a custom wordlist

knowsmore --word-list -o "~/Desktop/Wordlist/my_custom_wordlist.txt" --batch --name company_name
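
To give an idea of the kind of candidates such a company-based wordlist contains (compare the weak-password statistics above), here is a short, purely illustrative Python sketch; it is not KnowsMore's generation algorithm:

# Illustration only: simple permutations of a company name, similar to the
# weak passwords seen in the statistics (company123, Company@2022, ...).
base = "company"
years = ["2022", "2023", "2024"]
suffixes = ["123", "10", "21", "@10"]
candidates = set()
for name in (base, base.capitalize(), base.upper()):
    candidates.add(name)
    candidates.update(name + s for s in suffixes)
    candidates.update(f"{name}@{y}" for y in years)
    candidates.update(name + y for y in years)
print("\n".join(sorted(candidates)))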

Importing cracked hashes

Cracking hashes

First, extract all hashes to a txt file:

# Extract NTLM hashes to file
knowsmore --ntlm-hash --export-hashes "~/Desktop/ntlm_hash.txt"

# Or, extract NTLM hashes from NTDS file
cat ~/Desktop/client_name.ntds | cut -d ':' -f4 > ntlm_hashes.txt
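
For reference, each line of the secretsdump .ntds output follows the format domain\user:RID:LM hash:NT hash:::, which is why the fourth colon-separated field cut out above is the NT hash.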

In order to crack the hashes, I usually use hashcat with the commands below

# Wordlist attack
hashcat -m 1000 -a 0 -O -o "~/Desktop/cracked.txt" --remove "~/Desktop/ntlm_hash.txt" "~/Desktop/Wordlist/*"

# Mask attack
hashcat -m 1000 -a 3 -O --increment --increment-min 4 -o "~/Desktop/cracked.txt" --remove "~/Desktop/ntlm_hash.txt" ?a?a?a?a?a?a?a?a

Importing hashcat output file

knowsmore --ntlm-hash --company clientCompanyName --import-cracked ~/Desktop/cracked.txt

Note: Change clientCompanyName to the name of the company

Wipe sensitive data

As passwords and their hashes are extremely sensitive data, there is a module to replace the clear-text passwords and their respective hashes.

Note: This command will keep all generated statistics and imported user data.

knowsmore --wipe

BloodHound Mark as owned

One User

During an assessment you may obtain user passwords in several ways; you can add them to the KnowsMore database with:

knowsmore --user-pass --username administrator --password Sec4US@2023

# or adding the company name

knowsmore --user-pass --username administrator --password Sec4US@2023 --company sec4us

Integrate all cracked credentials into the Neo4j BloodHound database

knowsmore --bloodhound --mark-owned 10.10.10.10 -d neo4j -u neo4j -p 123456
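
Under the hood, "mark as owned" boils down to setting the owned property on the matching BloodHound nodes in Neo4j. Here is a minimal sketch with the official neo4j Python driver (an illustration, not KnowsMore's code; the node name is hypothetical):

from neo4j import GraphDatabase

# Connection details taken from the command above.
driver = GraphDatabase.driver("bolt://10.10.10.10:7687", auth=("neo4j", "123456"))
with driver.session(database="neo4j") as session:
    # BloodHound node names are upper-case USER@DOMAIN strings; this one is made up.
    session.run(
        "MATCH (u:User {name: $name}) SET u.owned = true",
        name="ADMINISTRATOR@EXAMPLE.LOCAL",
    )
driver.close()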

For a remote connection, make sure that the Neo4j database server accepts remote connections. Change the line below in the config file /etc/neo4j/neo4j.conf and restart the service.

server.bolt.listen_address=0.0.0.0:7687


ICMPWatch - ICMP Packet Sniffer

By: Zion3R


ICMP Packet Sniffer is a Python program that allows you to capture and analyze ICMP (Internet Control Message Protocol) packets on a network interface. It provides detailed information about the captured packets, including source and destination IP addresses, MAC addresses, ICMP type, payload data, and more. The program can also store the captured packets in a SQLite database and save them in a pcap format.
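
As a rough idea of what such a sniffer does internally, here is a minimal scapy sketch (an illustration, not ICMPWatch's actual code); it assumes scapy is installed, sufficient privileges, and an interface named eth0:

from scapy.all import sniff, wrpcap, IP, ICMP

def show(pkt):
    # Print the basic fields reported for each ICMP packet.
    if IP in pkt and ICMP in pkt:
        kind = {0: "Echo Reply", 8: "Echo Request"}.get(pkt[ICMP].type, str(pkt[ICMP].type))
        payload = bytes(pkt[ICMP].payload)
        print(f"{pkt[IP].src} -> {pkt[IP].dst} | {kind} | {len(pkt)} bytes | payload={payload[:16]!r}")

# Capture ICMP traffic on eth0 for 60 seconds, then keep a pcap copy.
packets = sniff(iface="eth0", filter="icmp", prn=show, timeout=60)
wrpcap("captured_packets.pcap", packets)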


Features

  • Capture and analyze ICMP Echo Request and Echo Reply packets.
  • Display detailed information about each ICMP packet, including source and destination IP addresses, MAC addresses, packet size, ICMP type, and payload content.
  • Save captured packet information to a text file.
  • Store captured packet information in an SQLite database.
  • Save captured packets to a PCAP file for further analysis.
  • Support for custom packet filtering based on source and destination IP addresses.
  • Colorful console output using ANSI escape codes.
  • User-friendly command-line interface.

Requirements

  • Python 3.7+
  • scapy 2.4.5 or higher
  • colorama 0.4.4 or higher

Installation

  1. Clone this repository:
git clone https://github.com/HalilDeniz/ICMPWatch.git
  2. Install the required dependencies:
pip install -r requirements.txt

Usage

python ICMPWatch.py [-h] [-v] [-t TIMEOUT] [-f FILTER] [-o OUTPUT] [--type {0,8}] [--src-ip SRC_IP] [--dst-ip DST_IP] -i INTERFACE [-db] [-c CAPTURE]
  • -v or --verbose: Show verbose packet details.
  • -t or --timeout: Sniffing timeout in seconds (default is 300 seconds).
  • -f or --filter: BPF filter for packet sniffing (default is "icmp").
  • -o or --output: Output file to save captured packets.
  • --type: ICMP packet type to filter (0: Echo Reply, 8: Echo Request).
  • --src-ip: Source IP address to filter.
  • --dst-ip: Destination IP address to filter.
  • -i or --interface: Network interface to capture packets (required).
  • -db or --database: Store captured packets in an SQLite database.
  • -c or --capture: Capture file to save packets in pcap format.

Press Ctrl+C to stop the sniffing process.

Examples

  • Capture ICMP packets on the "eth0" interface:
python icmpwatch.py -i eth0
  • Sniff ICMP traffic on interface "eth0" and save the results to a file:
python icmpwatch.py -i eth0 -o icmp_results.txt
  • Filtering by Source and Destination IP:
python icmpwatch.py -i eth0 --src-ip 192.168.1.10 --dst-ip 192.168.1.20
  • Filtering ICMP Echo Requests:
python icmpwatch.py -i eth0 --type 8
  • Saving Captured Packets:
python icmpwatch.py -i eth0 -c captured_packets.pcap


Grepmarx - A Source Code Static Analysis Platform For AppSec Enthusiasts


Grepmarx is a web application providing a single platform to quickly understand, analyze and identify vulnerabilities in possibly large and unknown code bases.

Features

SAST (Static Analysis Security Testing) capabilities:

  • Multiple languages support: C/C++, C#, Go, HTML, Java, Kotlin, JavaScript, TypeScript, OCaml, PHP, Python, Ruby, Bash, Rust, Scala, Solidity, Terraform, Swift
  • Multiple frameworks support: Spring, Laravel, Symfony, Django, Flask, Node.js, jQuery, Express, Angular...
  • 1600+ existing analysis rules
  • Easily extend analysis rules using Semgrep syntax: https://semgrep.dev/editor
  • Manage rules in rule packs to tailor code scanning

SCA (Software Composition Analysis) capabilities:

  • Multiple package-dependency formats support: NPM, Maven, Gradle, Composer, pip, Gopkg, Gem, Cargo, NuPkg, CSProj, PubSpec, Cabal, Mix, Conan, Clojure, Docker, GitHub Actions, Jenkins HPI, Kubernetes
  • SBOM (Software Bill-of-Materials) generation (CycloneDX compliant)

Extra

  • Analysis workbench designed to efficiently browse scan results
  • Scan code that doesn't compile
  • Comprehensive LOC (Lines of Code) counter
  • Inspector: automatic application features discovery
  • ... and a Dark Mode

Screenshots

Scan customization, Analysis workbench, Rule pack edition

Execution

Grepmarx ships with configurations to run it with Docker or Gunicorn.

Docker execution


Make sure you have docker-compose installed on the system and that the Docker daemon is running. The application can then be easily executed in a Docker container. The steps:

Get the code

$ git clone https://github.com/Orange-Cyberdefense/grepmarx.git
$ cd grepmarx

Start the app in Docker

$ sudo docker-compose pull && sudo docker-compose build && sudo docker-compose up -d

Visit http://localhost:5000 in your browser. The app should be up & running.

Note: a default user account is created on first launch (user=admin / password=admin). Change the default password immediately.

Gunicorn


Gunicorn 'Green Unicorn' is a Python WSGI HTTP Server for UNIX. A supervisor configuration file is provided to start it along with the required Celery worker (used for security scans queuing).

Install using pip

$ pip install gunicorn supervisor

Start the app using gunicorn binary

$ supervisord -c supervisord.conf

Visit http://localhost:8001 in your browser. The app should be up & running.

Note: a default user account is created on first launch (user=admin / password=admin). Change the default password immediately.

Build from sources

Get the code

$ git clone https://github.com/Orange-Cyberdefense/grepmarx.git
$ cd grepmarx

Install virtualenv modules

$ virtualenv env
$ source env/bin/activate

Install Python modules

$ # SQLite Database (Development)
$ pip3 install -r requirements.txt
$ # OR with PostgreSQL connector (Production)
$ # pip install -r requirements-pgsql.txt

Install additional requirements

# Dependency scan (cdxgen / depscan) requirements
$ sudo apt install npm openjdk-17-jdk maven gradle golang composer
$ sudo npm install -g @cyclonedx/cdxgen
$ pip install appthreat-depscan

A Redis server is required to queue security scans. Install the redis package with your favorite distro package manager, then:

$ redis-server

Set the FLASK_APP environment variable

$ export FLASK_APP=run.py
$ # Set up the DEBUG environment
$ # export FLASK_ENV=development

Start the celery worker process

$ celery -A app.celery_worker.celery worker --pool=prefork --loglevel=info --detach

Start the application (development mode)

$ # --host=0.0.0.0 - expose the app on all network interfaces (default 127.0.0.1)
$ # --port=5000 - specify the app port (default 5000)
$ flask run --host=0.0.0.0 --port=5000

Access grepmarx in browser: http://127.0.0.1:5000/

Note: a default user account is created on first launch (user=admin / password=admin). Change the default password immediately.

Credits & Links



Grepmarx - Provided by Orange Cyberdefense.



โŒ