leoandeol / conformal_railway_signal_detection
Confident Object Detection via Conformal Prediction and Conformal Risk Control: an Application to Railway Signaling
☆11 · Updated 2 years ago
Alternatives and similar repositories for conformal_railway_signal_detection
Users interested in conformal_railway_signal_detection are comparing it to the libraries listed below:
- Open-source framework for uncertainty and deep learning models in PyTorch ☆448 · Updated last month
- This repository contains a collection of surveys, datasets, papers, and codes for predictive uncertainty estimation in deep learning models ☆762 · Updated 3 months ago
- Lightweight, useful implementation of conformal prediction on real data ☆981 · Updated 2 weeks ago
- A Python toolbox for conformal prediction research on deep learning models, using PyTorch ☆436 · Updated 3 weeks ago
- ☆109 · Updated 4 years ago
- Out-of-distribution detection, robustness, and generalization resources. The repository contains a curated list of papers, tutorials, books… ☆960 · Updated 3 weeks ago
- Simple, compact, and hackable post-hoc deep OOD detection for already trained TensorFlow or PyTorch image classifiers ☆60 · Updated 2 months ago
- Lightning-UQ-Box: Uncertainty Quantification for Neural Networks with PyTorch and Lightning ☆208 · Updated last week
- The net:cal calibration framework is a Python 3 library for measuring and mitigating miscalibration of uncertainty estimates, e.g., by a … ☆368 · Updated last year
- A library for Bayesian neural network layers and uncertainty estimation in deep learning, extending the core of PyTorch ☆638 · Updated 7 months ago
- [ICML 2024] Official code for Uncertainty Estimation by Density Aware Evidential Deep Learning ☆13 · Updated last year
- This repo contains a PyTorch implementation of the paper "Evidential Deep Learning to Quantify Classification Uncertainty" ☆502 · Updated last year
- 👽 Out-of-Distribution Detection with PyTorch ☆330 · Updated last month
- Quantus is an eXplainable AI toolkit for responsible evaluation of neural network explanations ☆631 · Updated 4 months ago
- Benchmarking Generalized Out-of-Distribution Detection ☆1,015 · Updated 3 months ago
- Code for "Deterministic Neural Networks with Appropriate Inductive Biases Capture Epistemic and Aleatoric Uncertainty" ☆145 · Updated 2 years ago
- 👋 Xplique is a Neural Networks Explainability Toolbox ☆719 · Updated this week
- A list of (post-hoc) XAI methods for time series ☆166 · Updated last year
- Large-scale uncertainty benchmark in deep learning ☆65 · Updated 6 months ago
- Code for the paper "Beyond Calibration: Estimating the Grouping Loss of Modern Neural Networks", published at ICLR 2023 ☆12 · Updated 2 years ago
- ☆17 · Updated 3 weeks ago
- 👋 Influenciae is a TensorFlow toolbox for influence functions ☆64 · Updated last year
- An eXplainable AI toolkit with Concept Relevance Propagation and Relevance Maximization ☆138 · Updated last year
- Reliability diagrams visualize whether a classifier model needs calibration ☆161 · Updated 3 years ago
- A Library for Uncertainty Quantification
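
The libraries above center on conformal prediction and uncertainty quantification. For orientation, here is a minimal split conformal classification sketch in plain NumPy; the toy model, names, and parameters are illustrative assumptions, not code from any of the listed repositories:

```python
import numpy as np

rng = np.random.default_rng(0)

def fake_softmax(n, n_classes=3):
    # Toy stand-in for a trained classifier's softmax output.
    logits = rng.normal(size=(n, n_classes))
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

n_cal, alpha = 500, 0.1  # calibration size, target miscoverage rate

# Held-out calibration set: model scores plus true labels.
cal_scores = fake_softmax(n_cal)
cal_labels = rng.integers(0, 3, size=n_cal)

# Nonconformity score: 1 - softmax probability assigned to the true class.
nonconf = 1.0 - cal_scores[np.arange(n_cal), cal_labels]

# Conformal quantile with the finite-sample correction (n+1)(1-alpha)/n.
q_level = np.ceil((n_cal + 1) * (1 - alpha)) / n_cal
qhat = np.quantile(nonconf, q_level, method="higher")

# Prediction set for a new point: every class whose score clears the threshold.
test_scores = fake_softmax(1)[0]
pred_set = np.where(1.0 - test_scores <= qhat)[0]
```

Under exchangeability of calibration and test points, the resulting set contains the true label with probability at least 1 - alpha; the conformal-risk-control setting in the repository above generalizes this guarantee beyond set-valued classification.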