JinkyuKimUCB / explainable-deep-driving
☆78 · Updated 3 years ago
Alternatives and similar repositories for explainable-deep-driving:
Users interested in explainable-deep-driving are comparing it to the libraries listed below.
- Berkeley Deep Drive-X (eXplanation) dataset ☆111 · Updated 6 years ago
- The official Talk2Car dataset repo ☆76 · Updated 7 months ago
- [ECCV 2022] Learning to Drive by Watching YouTube Videos: Action-Conditioned Contrastive Policy Pretraining ☆82 · Updated 2 years ago
- [ECCV 2024] Official GitHub repository for the paper "LingoQA: Visual Question Answering for Autonomous Driving" ☆158 · Updated 5 months ago
- [AAAI 2024] NuScenes-QA: A Multi-modal Visual Question Answering Benchmark for Autonomous Driving Scenario ☆175 · Updated 4 months ago
- Reimplementation (currently partial) of the PRECOG paper, ICCV '19 ☆90 · Updated 2 years ago
- [AAAI 2025] Language Prompt for Autonomous Driving ☆131 · Updated 2 months ago
- [ICCV 2021] Official PyTorch Implementation of "AgentFormer: Agent-Aware Transformers for Socio-Temporal Multi-Agent Forecasting" ☆272 · Updated last year
- Reason2Drive: Towards Interpretable and Chain-based Reasoning for Autonomous Driving ☆79 · Updated last year
- [CoRL 2022] PlanT: Explainable Planning Transformers via Object-Level Representations ☆247 · Updated 4 months ago
- [ECCV 2024] Embodied Understanding of Driving Scenarios ☆179 · Updated 2 months ago
- [ICCV 2023 Oral] A New Paradigm for End-to-end Autonomous Driving to Alleviate Causal Confusion ☆213 · Updated last year
- [CVPR 2024] LaMPilot: An Open Benchmark Dataset for Autonomous Driving with Language Model Programs ☆29 · Updated 11 months ago
- [ECCV 2024] The official code for "Dolphins: Multimodal Language Model for Driving"