JosephTLucas / HackThisAI
Adversarial Machine Learning (AML) Capture the Flag (CTF)
☆100 · Updated last year
Alternatives and similar repositories for HackThisAI:
Users who are interested in HackThisAI are comparing it to the repositories listed below:
- CTF challenges designed and implemented in machine learning applications ☆144 · Updated 7 months ago
- Learn AI security through a series of vulnerable LLM CTF challenges. No sign ups, no cloud fees, run everything locally on your system. ☆279 · Updated 8 months ago
- An LLM explicitly designed to be hacked ☆147 · Updated last year
- Data Scientists Go To Jupyter ☆62 · Updated last month
- ☆127 · Updated 5 months ago
- Tree of Attacks (TAP) Jailbreaking Implementation ☆106 · Updated last year
- Payloads for Attacking Large Language Models ☆79 · Updated 9 months ago
- Delving into the Realm of LLM Security: An Exploration of Offensive and Defensive Tools, Unveiling Their Present Capabilities. ☆162 · Updated last year
- Challenge repository for the 2023 CSAW CTF Qualifiers ☆30 · Updated last year
- XBOW Validation Benchmarks ☆84 · Updated 7 months ago
- PhD/MSc course on Machine Learning Security (Univ. Cagliari) ☆209 · Updated 4 months ago
- LLM Testing Findings Templates ☆70 · Updated last year
- ☆265 · Updated last year
- An environment for testing AI agents against networks using Metasploit. ☆42 · Updated 2 years ago
- CALDERA plugin for adversary emulation of AI-enabled systems ☆95 · Updated last year
- Using ML models for red teaming ☆43 · Updated last year
- Collection of writeups on ICS/SCADA security. ☆170 · Updated last month
- Source code for the offsecml framework ☆38 · Updated 10 months ago
- A collection of awesome resources related to AI security ☆206 · Updated this week
- A research project to add some brrrrrr to Burp ☆155 · Updated 2 months ago
- All things specific to red teaming LLMs and generative AI ☆24 · Updated 6 months ago
- Official writeups for Hack The Boo CTF 2023 ☆44 · Updated 4 months ago
- Official writeups for Business CTF 2024: The Vault Of Hope ☆146 · Updated 4 months ago
- A very simple open source implementation of Google's Project Naptime ☆141 · Updated 3 weeks ago
- Awesome products for securing AI systems; includes open source and commercial options and an infographic licensed CC-BY-SA-4.0. ☆62 · Updated 10 months ago
- The IoT Security Testing Guide (ISTG) provides a comprehensive methodology for penetration tests in the IoT field, offering flexibility t… ☆99 · Updated 7 months ago
- Code repository for "Machine Learning For Red Team Hackers". ☆32 · Updated 5 years ago
- Docker image for attacking cryptography CTF challenges ☆100 · Updated 11 months ago
- Hack-A-Sat Qualifiers Writeups ☆250 · Updated 2 years ago
- The DFRWS 2023 challenge (The Troubled Elevator) takes a deep dive into the domain of Industrial Control Systems (ICS), specifically foc… ☆46 · Updated 11 months ago