joannetruong / habitat-lab
A modular high-level library to train embodied AI agents across a variety of tasks, environments, and simulators.
☆24 · Updated last year
Alternatives and similar repositories for habitat-lab
Users interested in habitat-lab are comparing it to the libraries listed below.
- Papers in Mobile Manipulation (Personal Collection) ☆33 · Updated last year
- ☆17 · Updated 9 months ago
- QuadWBG: Generalizable Quadrupedal Whole-Body Grasping ☆23 · Updated 9 months ago
- ☆95 · Updated 6 months ago
- ViNL: Visual Navigation and Locomotion over Obstacles (ICRA 2023) ☆117 · Updated last year
- GeRM: A Generalist Robotic Model with Mixture-of-Experts for Quadruped Robots https://songwxuan.github.io/GeRM/ ☆17 · Updated 3 months ago
- [CVPR 2022] Codebase for "Coupling Vision and Proprioception for Navigation of Legged Robots" ☆115 · Updated 3 years ago
- GAMMA: Graspability-Aware Mobile MAnipulation Policy Learning based on Online Grasping Pose Fusion ☆83 · Updated 10 months ago
- ☆111 · Updated 2 weeks ago
- Official Implementation for "Catch It! Learning to Catch in Flight with Mobile Dexterous Hands" ☆88 · Updated 5 months ago
- Code and benchmark release for the paper "GenTe: Generative Real-world Terrains for General Legged Robot Locomotion Control" ☆25 · Updated 3 weeks ago
- A collection of resources related to NVIDIA Isaac Sim ☆82 · Updated 3 weeks ago
- The ROS package that runs on the Unitree machine and publishes all interfaces to the ROS network ☆16 · Updated last year
- ☆43 · Updated 2 months ago
- Vision-Language Navigation Benchmark in Isaac Lab ☆216 · Updated 2 months ago
- Terrain-robustness benchmark for legged locomotion ☆61 · Updated 2 years ago
- Train a guide-dog controller and force estimator in Isaac Gym and validate in PyBullet ☆21 · Updated last year
- Official codebase for PRELUDE (Perceptive Locomotion Under Dynamic Environments) ☆71 · Updated 6 months ago
- [IROS 2024] LEEPS: Learning End-to-End Legged Perceptive Parkour Skills on Challenging Terrains ☆65 · Updated last year
- ☆28 · Updated 3 weeks ago
- Awesome Manipulation ☆28 · Updated this week
- Project code for the paper "Learning Visual Locomotion with Cross-Modal Supervision" (ICRA 2023) ☆101 · Updated last year
- Active Perceptive Motion Generation for Mobile Manipulation ☆36 · Updated last year
- ☆114 · Updated 3 months ago
- ☆37 · Updated 8 months ago
- ☆33 · Updated last month
- Learned Perceptive Forward Dynamics Model ☆177 · Updated this week
- ☆53 · Updated 2 months ago
- Code for the paper "Diff-Control: A Stateful Diffusion-based Policy for Imitation Learning" (Liu et al., IROS 2024) ☆65 · Updated 2 months ago
- ☆95 · Updated 2 weeks ago