cschell / who-is-alyx
This dataset contains over 110 hours of motion, eye-tracking, and physiological data from 71 players of the virtual reality game “Half-Life: Alyx”. Each player played the game on two separate days for about 45 minutes using an HTC Vive Pro.
☆16 · Updated 8 months ago
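For orientation, the sketch below shows one way a single session's motion log from this dataset could be loaded and resampled with pandas. The directory layout, file name, and column names used here are illustrative assumptions rather than the dataset's documented schema; the repository's README describes the actual structure.

```python
# Minimal sketch for loading one player's per-session motion log.
# NOTE: the path, file name, and "timestamp" column below are assumptions
# for illustration; consult the repository README for the real layout.
from pathlib import Path

import pandas as pd


def load_session(session_dir: Path) -> pd.DataFrame:
    """Load a single session's motion CSV into a time-indexed DataFrame."""
    motion = pd.read_csv(session_dir / "vr-controllers-and-hmd.csv")  # hypothetical file name
    # Assume a millisecond timestamp column; convert it to a datetime index.
    motion["timestamp"] = pd.to_datetime(motion["timestamp"], unit="ms")
    return motion.set_index("timestamp").sort_index()


if __name__ == "__main__":
    session = load_session(Path("who-is-alyx/players/player_01/session_01"))
    print(session.shape)
    # Downsample to roughly 30 Hz, e.g. before windowing for a motion-based
    # user-identification model.
    print(session.resample("33ms").mean(numeric_only=True).head())
```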
Alternatives and similar repositories for who-is-alyx
Users interested in who-is-alyx are comparing it to the libraries listed below:
- Unique identification of 50,000+ virtual reality users from their head and hand motion data ☆13 · Updated 2 years ago
- ☆39 · Updated last month
- The public repository for "The Psychology of 360-Video" ☆18 · Updated 3 years ago
- Robust and automatic MoCap data recovery (Matlab code). ☆29 · Updated 7 years ago
- Code for IMUPoser: Full-Body Pose Estimation using IMUs in Phones, Watches, and Earbuds ☆141 · Updated 8 months ago
- MoveBox is a toolbox to animate the Microsoft Rocketbox avatars using motion capture (MoCap). Motion capture is performed using a single… ☆88 · Updated last year
- Tobii Eye Tracking for HTC VIVE Pro Eye in Unity ☆78 · Updated last year
- ☆43 · Updated 9 months ago
- Hand Interfaces: Using Hands to Imitate Objects in AR/VR for Expressive Interactions (CHI '22 Honorable Mention) ☆32 · Updated last year
- Unity Trial Manager for 2D/3D/VR/AR Behavioral Experiment ☆16 · Updated last year
- A Unity software package for using the HTC VIVE Pro Eye, a VR head-mounted display, to assess saccadic eye movements. ☆46 · Updated last year
- PLUME is a software toolbox that allows for the exhaustive record of XR behavioral data (including synchronous physiological signals), th… ☆46 · Updated 5 months ago
- XDTK is an open-source toolkit for building interactions between Android devices and a Unity application. ☆35 · Updated 6 months ago
- The Virtual Reality Scientific Toolkit facilitates the creation and execution of experiments in VR environments by making object tracking… ☆19 · Updated 2 years ago
- ☆11 · Updated 7 years ago
- RagRug ☆23 · Updated last year
- Code for our ACM MM'20 paper "Kalman Filter-based Head Motion Prediction for Cloud-based Mixed Reality" ☆23 · Updated 4 years ago
- Deep Inertial Poser: Learning to Reconstruct Human Pose from Sparse Inertial Measurements in Real Time ☆162 · Updated 2 years ago
- The Biomotion Lab SMPL Unity Player (bmlSUP) is a tool to play SMPL-H model animations in Unity Game Engine ☆47 · Updated 3 years ago
- SIM2VR - Running Biomechanical User Simulations in Unity ☆17 · Updated 9 months ago
- Official Code for ECCV 2022 paper "AvatarPoser: Articulated Full-Body Pose Tracking from Sparse Motion Sensing" ☆315 · Updated 6 months ago
- A real-time system that captures physically correct human motion, joint torques, and ground reaction forces with only 6 inertial measurem… ☆337 · Updated 3 months ago
- ☆17 · Updated last week
- Code for GROOT: A Real-Time Streaming System of High-Fidelity Volumetric Videos ☆12 · Updated 2 years ago
- Towards Social Artificial Intelligence: Nonverbal Social Signal Prediction in A Triadic Interaction (CVPR 2020) ☆33 · Updated 4 years ago
- A real-time motion capture system that estimates poses and global translations using only 6 inertial measurement units ☆428 · Updated 3 months ago
- A dataset of egocentric vision, eye-tracking and full body kinematics from human locomotion in out-of-the-lab environments. Also, differe… ☆11 · Updated last year
- We present VRception, a toolkit to quickly and easily prototype Cross-Reality Systems without the need for hardware prototyping. The fund… ☆32 · Updated 3 years ago
- Improving Deep Learning for HAR with shallow LSTMs (best paper award), Bock et al., presented at ISWC 2021 (online) ☆59 · Updated 5 months ago
- Point cloud / volumetric video player for Unity ☆14 · Updated 2 years ago