mathebell / model-watermarking
☆16 · Dec 3, 2021 · Updated 4 years ago
Alternatives and similar repositories for model-watermarking
Users that are interested in model-watermarking are comparing it to the libraries listed below
- Implementation of "Piracy Resistant Watermarks for Deep Neural Networks" in TensorFlow.☆12 · Dec 5, 2020 · Updated 5 years ago
- Implementation of "Adversarial Frontier Stitching for Remote Neural Network Watermarking" in TensorFlow.☆24 · Aug 30, 2021 · Updated 4 years ago
- This repository was created as an implementation approach for a project on "Watermarking Deep Neural Networks".☆29 · Nov 16, 2020 · Updated 5 years ago
- Implementation of IEEE TNNLS 2023 and Elsevier PR 2023 papers on backdoor watermarking for deep classification models with unambiguity an…☆19 · Jul 27, 2023 · Updated 2 years ago
- The official implementation of the paper "Free Fine-tuning: A Plug-and-Play Watermarking Scheme for Deep Neural Networks".☆19 · Apr 19, 2024 · Updated last year
- ☆93 · Mar 23, 2021 · Updated 4 years ago
- ☆20 · Aug 7, 2023 · Updated 2 years ago
- The official implementation of the IEEE S&P '22 paper "SoK: How Robust is Deep Neural Network Image Classification Watermarking".☆117 · May 24, 2023 · Updated 2 years ago
- This work corroborates a run-time Trojan detection method exploiting STRong Intentional Perturbation of inputs; it is a multi-domain Trojan …☆10 · Mar 7, 2021 · Updated 4 years ago
- ☆10 · Dec 18, 2024 · Updated last year
- The implementation of the paper "How to Prove Your Model Belongs to You: A Blind-Watermark based Framework to Protect Intellectual Property of DNN…☆25 · Jan 30, 2021 · Updated 5 years ago
- Protect your machine learning models easily and securely with watermarking 🔑☆97 · Apr 24, 2024 · Updated last year
- ☆18 · Nov 13, 2021 · Updated 4 years ago
- Watermarking against model extraction attacks in MLaaS. ACM MM 2021.☆34 · Jul 15, 2021 · Updated 4 years ago
- ☆20 · May 6, 2022 · Updated 3 years ago
- Code for identifying natural backdoors in existing image datasets.☆15 · Aug 24, 2022 · Updated 3 years ago
- Implementation of "Robust Watermarking of Neural Network with Exponential Weighting" in TensorFlow.☆13 · Dec 2, 2020 · Updated 5 years ago
- Code for the ICLR 2022 paper "Trigger Hunting with a Topological Prior for Trojan Detection"☆11 · Sep 19, 2023 · Updated 2 years ago
- [EMNLP 2022] Distillation-Resistant Watermarking (DRW) for Model Protection in NLP☆13 · Aug 17, 2023 · Updated 2 years ago
- The official implementation (greedy residuals) of the paper "Watermarking Deep Neural Networks with Greedy Residuals" (ICML 2021).☆24 · May 21, 2022 · Updated 3 years ago
- Defending against Model Stealing via Verifying Embedded External Features☆38 · Feb 19, 2022 · Updated 3 years ago
- ☆24 · Apr 14, 2019 · Updated 6 years ago
- This is the source code for HufuNet. Our paper is accepted by IEEE TDSC.☆27 · Aug 21, 2023 · Updated 2 years ago
- Official implementation of "Towards Robust Model Watermark via Reducing Parametric Vulnerability"☆16 · Jun 3, 2024 · Updated last year
- ☆50 · Feb 27, 2021 · Updated 4 years ago
- competition☆17 · Aug 1, 2020 · Updated 5 years ago
- ☆19 · Mar 26, 2022 · Updated 3 years ago
- Data-Efficient Backdoor Attacks☆20 · Jun 15, 2022 · Updated 3 years ago
- ☆19 · Jun 21, 2021 · Updated 4 years ago
- ☆22 · Sep 16, 2022 · Updated 3 years ago
- Defending Against Backdoor Attacks Using Robust Covariance Estimation☆22 · Jul 12, 2021 · Updated 4 years ago
- The code repository for our Pattern Recognition journal paper on IPR protection of Image Captioning Models☆11 · Aug 29, 2023 · Updated 2 years ago
- RAB: Provable Robustness Against Backdoor Attacks☆39 · Oct 3, 2023 · Updated 2 years ago
- Website & Documentation: https://sbaresearch.github.io/model-watermarking/☆25 · Sep 22, 2023 · Updated 2 years ago
- ☆21 · Aug 10, 2022 · Updated 3 years ago
- An implementation of the ICCV 2021 paper "Parallel Rectangle Flip Attack: A Query-based Black-box Attack against Object Detection"☆28 · Aug 27, 2021 · Updated 4 years ago
- ☆14 · Feb 26, 2025 · Updated 11 months ago
- ☆10 · Oct 31, 2022 · Updated 3 years ago
- [CVPR 2022] "Quarantine: Sparsity Can Uncover the Trojan Attack Trigger for Free" by Tianlong Chen*, Zhenyu Zhang*, Yihua Zhang*, Shiyu C…☆27 · Oct 5, 2022 · Updated 3 years ago
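Many of the repositories above implement trigger-set ("backdoor") watermarking: the owner trains the model to assign fixed, secret labels to a set of trigger inputs, and later proves ownership by showing a suspect model agrees with those labels far above chance. The following is a minimal, hypothetical sketch of only the verification idea, not the code of any listed project; all names and parameters are illustrative.

```python
import random

def make_trigger_set(num_triggers, num_classes, seed=0):
    """Owner's secret: random inputs paired with fixed, owner-chosen labels.
    Real schemes use crafted images; random 4-dim vectors stand in here."""
    rng = random.Random(seed)
    return [([rng.random() for _ in range(4)], rng.randrange(num_classes))
            for _ in range(num_triggers)]

def verify_watermark(model, trigger_set, threshold=0.9):
    """Flag a model as watermarked if it reproduces the owner's labels on
    almost all triggers -- far above the 1/num_classes chance rate."""
    hits = sum(model(x) == y for x, y in trigger_set)
    return hits / len(trigger_set) >= threshold

# Toy demonstration: a "stolen" model that memorized the triggers versus
# an unrelated model that always predicts class 0.
triggers = make_trigger_set(num_triggers=50, num_classes=10)
lookup = {tuple(x): y for x, y in triggers}
stolen = lambda x: lookup[tuple(x)]   # reproduces every trigger label
independent = lambda x: 0             # agrees only by chance (~10%)

print(verify_watermark(stolen, triggers))       # True
print(verify_watermark(independent, triggers))  # False
```

The `threshold` trades off false accusations against robustness: fine-tuning or pruning a stolen model erodes trigger accuracy, which is exactly the attack surface the SoK and "Exponential Weighting" entries above study.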