Sha-Lab / babywalk
PyTorch code for the ACL 2020 paper: "BabyWalk: Going Farther in Vision-and-Language Navigation by Taking Baby Steps"
☆42 · Updated 3 years ago
Alternatives and similar repositories for babywalk
Users interested in babywalk are comparing it to the repositories listed below.
- PyTorch code for the ICLR 2019 paper "Self-Monitoring Navigation Agent via Auxiliary Progress Estimation" ☆122 · Updated last year
- PyTorch code for the NAACL 2019 paper "Learning to Navigate Unseen Environments: Back Translation with Environmental Dropout" ☆132 · Updated 3 years ago
- Cooperative Vision-and-Dialog Navigation ☆71 · Updated 2 years ago
- Code release for Fried et al., "Speaker-Follower Models for Vision-and-Language Navigation", NeurIPS 2018 ☆133 · Updated 2 years ago
- Code for "Tactical Rewind: Self-Correction via Backtracking in Vision-and-Language Navigation" ☆61 · Updated 5 years ago
- Large-scale pretraining for navigation tasks ☆92 · Updated 2 years ago
- Code for the paper "Improving Vision-and-Language Navigation with Image-Text Pairs from the Web" (ECCV 2020) ☆56 · Updated 2 years ago
- Cornell Touchdown natural language navigation and spatial reasoning dataset.