
SwaggerSpy - Automated OSINT On SwaggerHub

By: Zion3R


SwaggerSpy is a tool designed for automated Open Source Intelligence (OSINT) on SwaggerHub. This project aims to streamline the process of gathering intelligence from APIs documented on SwaggerHub, providing valuable insights for security researchers, developers, and IT professionals.


What is Swagger?

Swagger is an open-source framework that allows developers to design, build, document, and consume RESTful web services. It simplifies API development by providing a standard way to describe REST APIs using a JSON or YAML format. Swagger enables developers to create interactive documentation for their APIs, making it easier for both developers and non-developers to understand and use the API.


About SwaggerHub

SwaggerHub is a collaborative platform for designing, building, and managing APIs using the Swagger framework. It offers a centralized repository for API documentation, version control, and collaboration among team members. SwaggerHub simplifies the API development lifecycle by providing a unified platform for API design and testing.


Why OSINT on SwaggerHub?

Performing OSINT on SwaggerHub is crucial because developers, in their pursuit of efficient API documentation and sharing, may inadvertently expose sensitive information. Here are key reasons why OSINT on SwaggerHub is valuable:

  1. Developer Oversights: Developers might unintentionally include secrets, credentials, or sensitive information in API documentation on SwaggerHub. These oversights can lead to security vulnerabilities and unauthorized access if not identified and addressed promptly.

  2. Security Best Practices: OSINT on SwaggerHub helps enforce security best practices. Identifying and rectifying potential security issues early in the development lifecycle is essential to ensure the confidentiality and integrity of APIs.

  3. Preventing Data Leaks: By systematically scanning SwaggerHub for sensitive information, organizations can proactively prevent data leaks. This is especially crucial in today's interconnected digital landscape where APIs play a vital role in data exchange between services.

  4. Risk Mitigation: Understanding that developers might forget to remove or obfuscate sensitive details in API documentation underscores the importance of continuous OSINT on SwaggerHub. This proactive approach mitigates the risk of unintentional exposure of critical information.

  5. Compliance and Privacy: Many industries have stringent compliance requirements regarding the protection of sensitive data. OSINT on SwaggerHub ensures that APIs adhere to these regulations, promoting a culture of compliance and safeguarding user privacy.

  6. Educational Opportunities: Identifying oversights in SwaggerHub documentation provides educational opportunities for developers. It encourages a security-conscious mindset, fostering a culture of awareness and responsible information handling.

By recognizing that developers can inadvertently expose secrets, OSINT on SwaggerHub becomes an integral part of the overall security strategy, safeguarding against potential threats and promoting a secure API ecosystem.


How SwaggerSpy Works

SwaggerSpy obtains information from SwaggerHub and utilizes regular expressions to inspect API documentation for sensitive information, such as secrets and credentials.
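
As a rough illustration of that approach (this is not SwaggerSpy's actual code; the spec fragment and the patterns are assumptions made for the example), scanning a Swagger/OpenAPI document with regular expressions could look like the following Python sketch:

import re

# Hypothetical fragment of a Swagger/OpenAPI document in which a developer
# accidentally left a credential. The patterns below are common, illustrative
# secret signatures, not SwaggerSpy's actual rule set.
SPEC_TEXT = """
{
  "swagger": "2.0",
  "info": {"title": "Billing API", "version": "1.0"},
  "host": "api.example.com",
  "paths": {"/invoices": {"get": {"summary": "List invoices"}}},
  "x-internal-note": "use key AKIAABCDEFGHIJKLMNOP for the staging bucket"
}
"""

PATTERNS = {
    "aws_access_key_id": re.compile(r"AKIA[0-9A-Z]{16}"),
    "bearer_token": re.compile(r"(?i)bearer\s+[a-z0-9._\-]{20,}"),
    "password_assignment": re.compile(r"(?i)password['\"]?\s*[:=]\s*['\"][^'\"]{4,}['\"]"),
}

def scan(text):
    """Return (pattern_name, match) pairs found in the raw spec text."""
    findings = []
    for name, pattern in PATTERNS.items():
        for match in pattern.findall(text):
            findings.append((name, match))
    return findings

if __name__ == "__main__":
    for name, match in scan(SPEC_TEXT):
        print(f"[{name}] {match}")

In the real tool, the documents to scan come from SwaggerHub search results rather than a hard-coded string.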


Getting Started

To use SwaggerSpy, follow these steps:

  1. Installation: Clone the SwaggerSpy repository and install the required dependencies.
git clone https://github.com/UndeadSec/SwaggerSpy.git
cd SwaggerSpy
pip install -r requirements.txt
  2. Usage: Run SwaggerSpy with the target search terms (more accurate with domains).
python swaggerspy.py searchterm
  3. Results: SwaggerSpy will generate a report containing OSINT findings, including information about the API, endpoints, and secrets.

Disclaimer

SwaggerSpy is intended for educational and research purposes only. Users are responsible for ensuring that their use of this tool complies with applicable laws and regulations.


Contribution

Contributions to SwaggerSpy are welcome! Feel free to submit issues, feature requests, or pull requests to help improve this tool.


About the Author

SwaggerSpy is developed and maintained by Alisson Moretto (UndeadSec)

I'm a passionate cyber threat intelligence pro who loves sharing insights and crafting cybersecurity tools.


TODO

Regular Expressions Enhancement
  • [ ] Review and improve existing regular expressions.
  • [ ] Ensure that regular expressions adhere to best practices.
  • [ ] Check for any potential optimizations in the regex patterns.
  • [ ] Test regular expressions with various input scenarios for accuracy.
  • [ ] Document any complex or non-trivial regex patterns for better understanding.
  • [ ] Explore opportunities to modularize or break down complex patterns.
  • [ ] Verify the regular expressions against the latest specifications or requirements.
  • [ ] Update documentation to reflect any changes made to the regular expressions.
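
As an illustration of the documentation and testing items above, a pattern can carry a comment describing its intent plus a few self-checks (the pattern below is a generic example, not necessarily one of SwaggerSpy's existing expressions):

import re

# Flags PEM private key headers, optionally prefixed with the key type.
PRIVATE_KEY_HEADER = re.compile(r"-----BEGIN (?:RSA |EC |DSA |OPENSSH )?PRIVATE KEY-----")

def test_private_key_header_pattern():
    assert PRIVATE_KEY_HEADER.search("-----BEGIN RSA PRIVATE KEY-----")
    assert PRIVATE_KEY_HEADER.search("-----BEGIN PRIVATE KEY-----")
    assert not PRIVATE_KEY_HEADER.search("-----BEGIN CERTIFICATE-----")

if __name__ == "__main__":
    test_private_key_header_pattern()
    print("pattern checks passed")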

License

SwaggerSpy is licensed under the MIT License. See the LICENSE file for details.


Thanks

Special thanks to @Liodeus for providing project inspiration through swaggerHole.



Golddigger - Search Files For Gold

By: Zion3R


Gold Digger is a simple tool for quickly and recursively discovering sensitive information in files. It was originally written to assist in rapidly searching files obtained during a penetration test.


Installation

Gold Digger requires Python3.

virtualenv -p python3 .
source bin/activate
python dig.py --help

Usage

usage: dig.py [-h] [-e EXCLUDE] [-g GOLD] -d DIRECTORY [-r RECURSIVE] [-l LOG]

optional arguments:
  -h, --help            show this help message and exit
  -e EXCLUDE, --exclude EXCLUDE
                        JSON file containing extension exclusions
  -g GOLD, --gold GOLD  JSON file containing the gold to search for
  -d DIRECTORY, --directory DIRECTORY
                        Directory to search for gold
  -r RECURSIVE, --recursive RECURSIVE
                        Search directory recursively?
  -l LOG, --log LOG     Log file to save output

Example Usage

Gold Digger will recursively go through all folders and files in search of content matching items listed in the gold.json file. Additionally, you can leverage an exclusion file called exclusions.json for skipping files matching specific extensions. Provide the root folder as the --directory flag.

An example structure could be:

~/Engagements/CustomerName/data/randomfiles/
~/Engagements/CustomerName/data/randomfiles2/
~/Engagements/CustomerName/data/code/

You would provide the following command to parse all three directories:

python dig.py --gold gold.json --exclude exclusions.json --directory ~/Engagements/CustomerName/data/ --log Customer_2022-123_gold.log
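
Conceptually, that command drives a recursive walk that applies every pattern to every non-excluded file. A minimal sketch of the idea in Python (not Gold Digger's actual implementation; the gold.json and exclusions.json layouts shown are assumptions) might look like:

import json
import re
from pathlib import Path

def load_patterns(gold_path):
    # Assumed layout: {"patterns": {"name": "regex", ...}}
    data = json.loads(Path(gold_path).read_text())
    return {name: re.compile(rx) for name, rx in data["patterns"].items()}

def load_exclusions(exclude_path):
    # Assumed layout: {"extensions": [".png", ".zip", ...]}
    data = json.loads(Path(exclude_path).read_text())
    return set(data["extensions"])

def dig(directory, patterns, excluded):
    for path in Path(directory).expanduser().rglob("*"):
        if not path.is_file() or path.suffix.lower() in excluded:
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue
        for name, pattern in patterns.items():
            for match in pattern.findall(text):
                yield path, name, match

if __name__ == "__main__":
    patterns = load_patterns("gold.json")
    excluded = load_exclusions("exclusions.json")
    with open("Customer_2022-123_gold.log", "w") as log:
        for path, name, match in dig("~/Engagements/CustomerName/data/", patterns, excluded):
            log.write(f"{path}: [{name}] {match}\n")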

Results

The tool will create a log file containing the scanning results. Due to the nature of regular expressions, there may be numerous false positives. Despite this, the tool has proven to increase productivity when processing thousands of files.

Shout-outs

Shout out to @d1vious for releasing git-wild-hunt https://github.com/d1vious/git-wild-hunt! Most of the regex in GoldDigger was used from this amazing project.



Cypherhound - Terminal Application That Contains 260+ Neo4j Cyphers For BloodHound Data Sets


A Python3 terminal application that contains 260+ Neo4j cyphers for BloodHound data sets.

Why?

BloodHound is a staple tool for every red teamer. However, its design comes with some drawbacks. I will cover the biggest pain points I've experienced and what this tool aims to address:

  1. My tools think in lists - until my tools parse exported JSON graphs, I need graph results in a line-by-line format .txt file
  2. Copy/pasting graph results - this plays into the first but do we need to explain this one?
  3. Graphs can be too large to draw - the information contained in any graph can aid our goals as the attacker and we need to be able to view all data efficiently
  4. Manually running custom cyphers is time-consuming - let's automate it :)

This tool can also help blue teams reveal detailed information about their Active Directory environments.
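
The core idea — run a stored Cypher query against the BloodHound Neo4j database and write the results out line by line — can be sketched with the official neo4j Python driver (the credentials and the query below are illustrative, not one of cypherhound's built-in cyphers):

from neo4j import GraphDatabase

# Neo4j's default bolt URI; adjust the credentials for your instance.
URI = "bolt://localhost:7687"
AUTH = ("neo4j", "neo4jpassword")

# Illustrative BloodHound-style cypher: all members of a given group.
CYPHER = (
    "MATCH (u:User)-[:MemberOf*1..]->(g:Group {name: $group}) "
    "RETURN DISTINCT u.name AS name ORDER BY name"
)

def run_cypher(group, outfile):
    driver = GraphDatabase.driver(URI, auth=AUTH)
    with driver.session() as session, open(outfile, "w") as fh:
        for record in session.run(CYPHER, group=group):
            fh.write(record["name"] + "\n")  # one end object per line
    driver.close()

if __name__ == "__main__":
    run_cypher("DOMAIN ADMINS@DOMAIN.LOCAL", "domain_admins_members.txt")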


Features

Take back control of your BloodHound data with cypherhound!

  • 264 cyphers to date
    • Set cyphers to search based on user input (user, group, and computer-specific)
    • User-defined regex cyphers
  • User-defined exporting of all results
    • Default export is just the end object, ready for use as a target list with other tools
    • Raw export option available in grep/cut/awk-friendly format

Installation

Make sure to have python3 installed and run:

python3 -m pip install -r requirements.txt

Usage

Start the program with: python3 cypherhound.py -u <neo4j_username> -p <neo4j_password>

Commands

The full command menu is shown below:

Command Menu
set - used to set search parameters for cyphers, double/single quotes not required for any sub-commands
    sub-commands
        user - the user to use in user-specific cyphers (MUST include @domain.name)
        group - the group to use in group-specific cyphers (MUST include @domain.name)
        computer - the computer to use in computer-specific cyphers (SHOULD include .domain.name or @domain.name)
        regex - the regex to use in regex-specific cyphers
    example
        set user svc-test@domain.local
        set group domain admins@domain.local
        set computer dc01.domain.local
        set regex .*((?i)web).*
run - used to run cyphers
    parameters
        cypher number - the number of the cypher to run
    example
        run 7
export - used to export cypher results to txt files
    parameters
        cypher number - the number of the cypher to run and then export
        output filename - the name of the output file, extension not needed
        raw - write raw output or just end object (optional)
    example
        export 31 results
        export 42 results2 raw
list - used to show a list of cyphers
    parameters
        list type - the type of cyphers to list (general, user, group, computer, regex, all)
    example
        list general
        list user
        list group
        list computer
        list regex
        list all
q, quit, exit - used to exit the program
clear - used to clear the terminal
help, ? - used to display this help menu

Important Notes

  • The program is configured to use the default Neo4j database and URI
  • Built for BloodHound 4.2.0, certain edges will not work for previous versions
  • Windows users must run pip3 install pyreadline3
  • Shortest paths exports are all the same (raw or not) due to their unpredictable number of nodes

Future Goals

  • Add cyphers for Azure edges

Issues and Support

Please be descriptive with any issues you decide to open and, where applicable, provide output.



โŒ