fel-thomas / Sobol-Attribution-Method
Code for the paper: "Look at the Variance! Efficient Black-box Explanations with Sobol-based Sensitivity Analysis" (NeurIPS 2021)
☆31 · Updated 3 years ago
Alternatives and similar repositories for Sobol-Attribution-Method
Users interested in Sobol-Attribution-Method are comparing it to the libraries listed below.
- Conformal prediction for controlling monotonic risk functions. Simple accompanying PyTorch code for conformal risk control in computer vi… ☆72 · Updated 2 years ago
- Model-agnostic post-hoc calibration without distributional assumptions ☆42 · Updated 2 years ago
- Wrapper for a PyTorch classifier which allows it to output prediction sets. The sets are theoretically guaranteed to contain the true cla… ☆252 · Updated 2 years ago
- Explores the ideas presented in Deep Ensembles: A Loss Landscape Perspective (https://arxiv.org/abs/1912.02757) by Stanislav Fort, Huiyi … ☆66 · Updated 5 years ago
- Learn then Test: Calibrating Predictive Algorithms to Achieve Risk Control ☆70 · Updated last year
- Build and train Lipschitz-constrained networks: TensorFlow implementation of k-Lipschitz layers ☆100 · Updated 8 months ago
- Last-layer Laplace approximation code examples ☆83 · Updated 4 years ago
- Bayesianize: a Bayesian neural network wrapper in PyTorch ☆89 · Updated last year
- This repository contains a JAX implementation of conformal training corresponding to the ICLR'22 paper "Learning Optimal Conformal Classi… ☆130 · Updated 3 years ago
- Quantile risk minimization ☆24 · Updated last year
- Active and Sample-Efficient Model Evaluation ☆25 · Updated 6 months ago
- General-purpose library for BNNs, and implementation of OC-BNNs in our 2020 NeurIPS paper ☆38 · Updated 3 years ago
- A repo for transfer learning with deep tabular models ☆104 · Updated 2 years ago
- Training and evaluating NBM and SPAM for interpretable machine learning ☆78 · Updated 2 years ago
- ☆37 · Updated 2 years ago
- Reliability diagrams visualize whether a classifier model needs calibration ☆161 · Updated 3 years ago
- Calibration library and code for the paper: Verified Uncertainty Calibration. Ananya Kumar, Percy Liang, Tengyu Ma. NeurIPS 2019 (Spotlig… ☆151 · Updated 3 years ago
- Official PyTorch implementation of "Meta-Calibration: Learning of Model Calibration Using Differentiable Expected Calibration Error" ☆36 · Updated 2 years ago
- Code for the paper "Calibrating Deep Neural Networks using Focal Loss" ☆161 · Updated last year
- Reusable BatchBALD implementation ☆79 · Updated last year
- An amortized approach for calculating local Shapley value explanations ☆102 · Updated last year
- Supporting code for the paper "Bayesian Model Selection, the Marginal Likelihood, and Generalization" ☆37 · Updated 3 years ago
- Code and results accompanying our paper titled RLSbench: Domain Adaptation under Relaxed Label Shift ☆35 · Updated 2 years ago
- ☆32 · Updated 3 years ago
- This repository contains the code of the distribution shift framework presented in A Fine-Grained Analysis on Distribution Shift (Wiles e… ☆84 · Updated 3 weeks ago
- Code for the paper "Getting a CLUE: A Method for Explaining Uncertainty Estimates" ☆36 · Updated last year
- Contains code for the NeurIPS 2020 paper by Pan et al., "Continual Deep Learning by Functional Regularisation of Memorable Past" ☆44 · Updated 5 years ago
- Code for experiments to learn uncertainty ☆30 · Updated 2 years ago
- ☆109 · Updated 3 years ago
- Self-Explaining Neural Networks ☆43 · Updated 5 years ago