usnistgov / Differential-Privacy-Synthetic-Data-Challenge-assets
This repository contains all public data, Python scripts, and documentation relating to the NIST Public Safety Communications Research Division's Differential Privacy program, including past prize challenges and benchmark problem sets.
☆11 · Updated 2 years ago
Alternatives and similar repositories for Differential-Privacy-Synthetic-Data-Challenge-assets
Users interested in Differential-Privacy-Synthetic-Data-Challenge-assets are comparing it to the libraries listed below.
- ☆14 · Updated 2 months ago
- ☆43 · Updated 3 years ago
- An implementation of "Data Synthesis via Differentially Private Markov Random Fields" ☆13 · Updated last year
- Analytic calibration for differential privacy with Gaussian perturbations ☆47 · Updated 6 years ago
- Code for the Canonne-Kamath-Steinke paper https://arxiv.org/abs/2004.00010 ☆61 · Updated 4 years ago
- Combines differential privacy and multi-party computation protocols to achieve distributed machine learning ☆26 · Updated 4 years ago
- Implementation of calibration bounds for differential privacy in the shuffle model ☆22 · Updated 4 years ago
- Heterogeneous Gaussian Mechanism: Preserving Differential Privacy in Deep Learning with Provable Robustness (IJCAI'19) ☆13 · Updated 4 years ago
- Secure aggregation for federated learning ☆35 · Updated last year
- Implementations of differentially private release mechanisms for graph statistics ☆23 · Updated 3 years ago
- Python package with simple implementations of state-of-the-art LDP frequency estimation algorithms; contains code for a VLDB 2021 paper ☆74 · Updated last year
- ☆66 · Updated 5 years ago
- ☆20 · Updated 2 years ago
- Code for "Exploiting Unintended Feature Leakage in Collaborative Learning" (IEEE S&P/Oakland 2019) ☆53 · Updated 5 years ago
- Unified, Simplified, Tight and Fast Privacy Amplification in the Shuffle Model of Differential Privacy ☆11 · Updated 6 months ago
- A secure aggregation system for private federated learning ☆41 · Updated last year
- Evaluates the privacy leakage of differentially private machine learning models ☆134 · Updated 2 years ago
- ☆28 · Updated 2 years ago
- Eluding Secure Aggregation in Federated Learning via Model Inconsistency ☆12 · Updated 2 years ago
- A simple Python implementation of a secure aggregation protocol for federated learning ☆35 · Updated 2 years ago
- ☆9 · Updated 3 years ago
- ☆38 · Updated 2 years ago
- Low-Interaction Privacy-Preserving Deep Learning via Function Secret Sharing ☆51 · Updated 3 years ago
- A machine-learning-based tool for discovering differential privacy violations in black-box algorithms ☆25 · Updated 2 years ago
- Collection of research papers on multi-party learning ☆32 · Updated last year
- A fast algorithm to optimally compose privacy guarantees of differentially private (DP) mechanisms to arbitrary accuracy ☆73 · Updated last year
- Implementation of "DPMLBench: Holistic Evaluation of Differentially Private Machine Learning" ☆10 · Updated last year
- Privacy-Preserving Gradient Boosting Decision Trees (AAAI 2020) ☆27 · Updated last year
- Multiple frequency estimation under local differential privacy in Python ☆47 · Updated last year
- Implementation of "PrivGraph: Differentially Private Graph Data Publication by Exploiting Community Information" ☆13 · Updated 2 years ago