xerial / dp-readings
Readings in Differential Privacy
☆18 · Updated 2 years ago
Alternatives and similar repositories for dp-readings
Users interested in dp-readings are comparing it to the libraries listed below.
- A library providing general-purpose tools for estimating discrete distributions from noisy observations of their marginals. ☆110 · Updated this week
- ☆32 · Updated 3 years ago
- autodp: A flexible and easy-to-use package for differential privacy ☆278 · Updated 2 years ago
- The core library of differential privacy algorithms powering the OpenDP Project. ☆407 · Updated this week
- ☆42 · Updated 3 years ago
- ☆38 · Updated 3 years ago
- Statistical Counterexample Detector for Differential Privacy ☆28 · Updated last year
- A fast algorithm to optimally compose privacy guarantees of differentially private (DP) mechanisms to arbitrary accuracy. ☆76 · Updated last year
- Diffprivlib: The IBM Differential Privacy Library ☆903 · Updated 4 months ago (see the usage sketch after this list)
- Federated gradient boosted decision tree learning ☆68 · Updated 2 years ago
- Fast, memory-efficient, scalable optimization of deep learning with differential privacy ☆139 · Updated 3 weeks ago
- The Python Differential Privacy Library, built on top of https://github.com/google/differential-privacy ☆541 · Updated 3 months ago
- Differentially Private (tabular) Generative Models Papers with Code ☆55 · Updated last year
- Code for the Canonne-Kamath-Steinke paper https://arxiv.org/abs/2004.00010 ☆63 · Updated 5 years ago
- Algorithms for Privacy-Preserving Machine Learning in JAX ☆148 · Updated last week
- An implementation of "Data Synthesis via Differentially Private Markov Random Fields" ☆15 · Updated last year
- Privacy-preserving XGBoost Inference ☆50 · Updated 2 years ago
- A toolbox for differentially private data generation ☆130 · Updated 2 years ago
- Research and experimental code related to Opacus, an open-source library for training PyTorch models with Differential Privacy ☆18 · Updated last year
- ☆24 · Updated 2 years ago
- Code for computing tight guarantees for differential privacy ☆23 · Updated 2 years ago
- A project for evaluating the privacy leakage of differentially private machine learning models ☆135 · Updated 3 years ago
- ☆80 · Updated 3 years ago
- ☆45 · Updated 4 years ago
- Analytic calibration for differential privacy with Gaussian perturbations ☆51 · Updated 7 years ago
- A Simulator for Privacy Preserving Federated Learning ☆96 · Updated 5 years ago
- Differentially private machine learning ☆200 · Updated 4 years ago
- ☆333 · Updated last month
- Private Evolution: Generating DP Synthetic Data without Training [ICLR 2024, ICML 2024 Spotlight] ☆111 · Updated 3 months ago
- Tools and service for differentially private processing of tabular and relational data ☆291 · Updated 6 months ago
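As a rough point of reference for how the general-purpose libraries in this list are used, here is a minimal sketch with diffprivlib (the IBM library listed above), exercising its `Laplace` mechanism and `tools.mean`. The parameter choices (epsilon, sensitivity, bounds) are illustrative assumptions, not recommendations.

```python
# Minimal sketch of the calibrated-noise primitives diffprivlib exposes.
# Parameter values here (epsilon, sensitivity, bounds) are illustrative assumptions.
import numpy as np
from diffprivlib.mechanisms import Laplace
from diffprivlib.tools import mean

# Scalar release: Laplace noise calibrated to sensitivity 1 at epsilon = 1.
mech = Laplace(epsilon=1.0, sensitivity=1.0)
print(mech.randomise(42.0))

# Aggregate release: a differentially private mean over data bounded in [0, 1].
data = np.random.default_rng(0).uniform(0.0, 1.0, size=1_000)
print(mean(data, epsilon=1.0, bounds=(0.0, 1.0)))
```

APIs in this space change frequently, so check each library's own documentation for its current interface.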