joannetruong / habitat-lab
A modular high-level library to train embodied AI agents across a variety of tasks, environments, and simulators.
☆24 · Updated 10 months ago
Alternatives and similar repositories for habitat-lab
Users who are interested in habitat-lab are comparing it to the libraries listed below:
- ☆14 · Updated 6 months ago
- QuadWBG: Generalizable Quadrupedal Whole-Body Grasping ☆21 · Updated 6 months ago
- ☆78 · Updated 3 months ago
- Learned Perceptive Forward Dynamics Model ☆41 · Updated 2 weeks ago
- Terrain-robustness benchmark for legged locomotion ☆54 · Updated 2 years ago
- Papers in Mobile Manipulation (Personal Collection) ☆30 · Updated last year
- [IROS 2024] LEEPS: Learning End-to-End Legged Perceptive Parkour Skills on Challenging Terrains ☆57 · Updated 10 months ago
- Train guide dog controller and force estimator in Isaac Gym and validate in PyBullet ☆21 · Updated last year
- ☆86 · Updated 3 weeks ago
- [CVPR 2022] Codebase for "Coupling Vision and Proprioception for Navigation of Legged Robots" ☆108 · Updated 2 years ago
- Project code. ☆23 · Updated 3 months ago
- Vision-Language Navigation Benchmark in Isaac Lab ☆163 · Updated last month
- Official implementation of "FALCON: Learning Force-Adaptive Humanoid Loco-Manipulation" ☆54 · Updated this week
- Code and benchmark release for the paper "GenTe: Generative Real-world Terrains for General Legged Robot Locomotion Control" ☆19 · Updated last month
- ☆12 · Updated 8 months ago
- GAMMA: Graspability-Aware Mobile MAnipulation Policy Learning based on Online Grasping Pose Fusion ☆73 · Updated 7 months ago
- The ROS package that runs on the Unitree machine and publishes all interfaces to the ROS network ☆14 · Updated last year
- A collection of related resources for NVIDIA Isaac Sim ☆54 · Updated 3 weeks ago
- ☆28 · Updated 5 months ago
- Official implementation for "Catch It! Learning to Catch in Flight with Mobile Dexterous Hands" ☆75 · Updated 3 months ago
- Quadruped Tasks extension based on Isaac Lab ☆35 · Updated last week
- Official code for "Adversarial Locomotion and Motion Imitation for Humanoid Policy Learning" ☆53 · Updated last week
- ☆44 · Updated last year
- ☆85 · Updated 7 months ago
- ViNL: Visual Navigation and Locomotion over Obstacles (ICRA 2023) ☆111 · Updated last year
- Code to train a compliant whole-body controller for the Unitree B1 + Z1 ☆80 · Updated 7 months ago
- ☆96 · Updated last week
- ☆97 · Updated 6 months ago
- Reproduction code of the paper "World Model-based Perception for Visual Legged Locomotion" ☆116 · Updated 4 months ago
- ☆20 · Updated 7 months ago