nesl / nist_differential_privacy_synthetic_data_challenge
UCLANesl - NIST Differential Privacy Challenge (Match 3)
☆25, updated 6 years ago
Alternatives and similar repositories for nist_differential_privacy_synthetic_data_challenge
Users interested in nist_differential_privacy_synthetic_data_challenge are comparing it to the libraries listed below (an illustrative Laplace-mechanism sketch follows the list).
- A toolbox for differentially private data generation (☆131, updated 2 years ago)
- Code for the first large-scale investigation of differentially private convex optimization algorithms (☆63, updated 6 years ago)
- autodp: a flexible and easy-to-use package for differential privacy (☆276, updated last year)
- Source code for the paper "Differentially Private Generative Adversarial Network" (☆70, updated 6 years ago)
- SAP Security Research sample code and tutorials for generating differentially private synthetic datasets using generative deep learning m… (☆24, updated last year)
- ☆40, updated 2 years ago
- Differentially private release of semantically rich data (☆35, updated 4 years ago)
- Privacy Testing for Deep Learning (☆210, updated last month)
- A library for running membership inference attacks against ML models (☆150, updated 2 years ago)
- A Simulator for Privacy Preserving Federated Learning (☆96, updated 4 years ago)
- Tools and service for differentially private processing of tabular and relational data (☆280, updated 2 months ago)
- ☆38, updated 3 years ago
- The Python Differential Privacy Library, built on top of https://github.com/google/differential-privacy (☆535, updated last year)
- Differentially Private Generative Adversarial Networks for Time Series, Continuous, and Discrete Open Data (☆33, updated 6 years ago)
- Code for "Differential Privacy Has Disparate Impact on Model Accuracy" (NeurIPS 2019) (☆33, updated 4 years ago)
- A library providing general-purpose tools for estimating discrete distributions from noisy observations of their marginals (☆106, updated 3 weeks ago)
- Privacy Meter: an open-source library to audit data privacy in statistical and machine learning algorithms (☆678, updated 5 months ago)
- ☆24, updated last year
- Evaluates the privacy leakage of differentially private machine learning models (☆135, updated 2 years ago)
- ☆16, updated 5 years ago
- Implementation of membership inference and model inversion attacks that extract training data information from an ML model. Benchmarking … (☆103, updated 5 years ago)
- Python language bindings for smartnoise-core (☆76, updated 2 years ago)
- Python package to create adversarial agents for membership inference attacks against machine learning models (☆46, updated 6 years ago)
- ☆36, updated 2 years ago
- Analytic calibration for differential privacy with Gaussian perturbations (☆50, updated 7 years ago)
- A concise primer on Differential Privacy (☆29, updated 5 years ago)
- Differentially private Wasserstein GAN implementation in PyTorch (☆28, updated 5 years ago)
- PyTorch implementation of the paper "Semi-supervised Knowledge Transfer for Deep Learning from Private Training Data" (https://arxiv.org/abs/16… (☆45, updated 3 years ago)
- Code for the Canonne-Kamath-Steinke paper https://arxiv.org/abs/2004.00010 (☆61, updated 5 years ago)
- ☆37, updated 3 years ago
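Many of the differential-privacy toolkits above build on the same primitive: adding noise calibrated to the sensitivity of a bounded query. Purely as an illustration (this is not the API of any repository listed here; the function and parameter names are hypothetical), a minimal NumPy sketch of the Laplace mechanism for a bounded mean might look like this:

```python
import numpy as np

def dp_mean(values, epsilon, lower, upper, rng=None):
    """Differentially private mean via the Laplace mechanism (illustrative sketch).

    Assumes each individual contributes one value; clipping to [lower, upper]
    bounds the sensitivity of the mean at (upper - lower) / n.
    """
    rng = rng or np.random.default_rng()
    clipped = np.clip(np.asarray(values, dtype=float), lower, upper)
    n = len(clipped)
    sensitivity = (upper - lower) / n  # L1 sensitivity of the clipped mean
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return clipped.mean() + noise

# Example: an epsilon = 1.0 release of a mean over ages clipped to [0, 100]
ages = [23, 35, 47, 52, 61, 29, 44]
print(dp_mean(ages, epsilon=1.0, lower=0, upper=100))
```

The libraries listed above typically wrap primitives like this with privacy-budget accounting and higher-level estimators or models.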