usnistgov / Differential-Privacy-Synthetic-Data-Challenge-assets
This repository contains all public data, Python scripts, and documentation relating to the NIST Public Safety Communications Research Division's Differential Privacy program, including past prize challenges and benchmark problem sets.
☆11 · Updated 2 years ago
Alternatives and similar repositories for Differential-Privacy-Synthetic-Data-Challenge-assets:
Users interested in Differential-Privacy-Synthetic-Data-Challenge-assets are comparing it to the libraries listed below.
- ☆14 · Updated 2 months ago
- Code for the Canonne-Kamath-Steinke paper (https://arxiv.org/abs/2004.00010) ☆62 · Updated 4 years ago
- Implementation of calibration bounds for differential privacy in the shuffle model ☆22 · Updated 4 years ago
- Analytic calibration for differential privacy with Gaussian perturbations ☆47 · Updated 6 years ago
- An implementation of "Data Synthesis via Differentially Private Markov Random Fields" ☆13 · Updated last year
- ☆64 · Updated 5 years ago
- ☆28 · Updated 2 years ago
- Combines differential privacy with a multi-party computation protocol to achieve distributed machine learning. ☆26 · Updated 4 years ago
- Heterogeneous Gaussian Mechanism: Preserving Differential Privacy in Deep Learning with Provable Robustness (IJCAI'19). ☆13 · Updated 4 years ago
- ☆43 · Updated 3 years ago
- A fast algorithm to optimally compose privacy guarantees of differentially private (DP) mechanisms to arbitrary accuracy. ☆73 · Updated last year
- Implementations of differentially private release mechanisms for graph statistics ☆23 · Updated 2 years ago
- ☆32 · Updated last year
- Evaluates the privacy leakage of differentially private machine learning models. ☆131 · Updated 2 years ago
- Secure Aggregation for FL ☆34 · Updated last year
- ☆38 · Updated 2 years ago
- Code for "Exploiting Unintended Feature Leakage in Collaborative Learning" (Oakland 2019) ☆53 · Updated 5 years ago
- A secure aggregation system for private federated learning ☆39 · Updated 11 months ago
- Privacy-preserving Federated Learning with Trusted Execution Environments ☆67 · Updated 2 years ago
- Eluding Secure Aggregation in Federated Learning via Model Inconsistency ☆12 · Updated 2 years ago
- A simple Python implementation of a secure aggregation protocol for federated learning. ☆35 · Updated last year
- Differential Privacy Testing System ☆22 · Updated 5 years ago
- A machine-learning-based tool for discovering differential privacy violations in black-box algorithms. ☆25 · Updated 2 years ago
- An implementation of the paper "A Hybrid Approach to Privacy Preserving Federated Learning" (https://arxiv.org/pdf/1812.03224.pdf) ☆21 · Updated 4 years ago
- A Sybil-resilient distributed learning protocol. ☆103 · Updated last year
- Unified, Simplified, Tight and Fast Privacy Amplification in the Shuffle Model of Differential Privacy ☆11 · Updated 6 months ago
- [NeurIPS 2021] "G-PATE: Scalable Differentially Private Data Generator via Private Aggregation of Teacher Discriminators" by Yunhui Long*… ☆30 · Updated 3 years ago
- Code for computing tight guarantees for differential privacy ☆23 · Updated 2 years ago
- Statistical Counterexample Detector for Differential Privacy ☆28 · Updated last year
- Code to reproduce experiments in "Antipodes of Label Differential Privacy: PATE and ALIBI" ☆31 · Updated 3 years ago