AnimaVR / NeuroSync_Local_API
Audio to face local inference helper code.
☆12 · Updated 3 weeks ago
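This repository provides helper code for running audio-to-face inference against a locally hosted API. As a rough illustration only, the sketch below shows how such a local endpoint might be called from Python; the host, port, route, and response layout are assumptions made for this example, not the repository's documented interface.

```python
# Hypothetical client for a local audio-to-face inference endpoint.
# The URL, payload format, and response schema are assumed for illustration only.
import requests

AUDIO_PATH = "speech.wav"                            # any short audio clip
URL = "http://127.0.0.1:5000/audio_to_blendshapes"   # assumed local endpoint

with open(AUDIO_PATH, "rb") as f:
    audio_bytes = f.read()

# Post raw audio bytes; the server is assumed to reply with per-frame blendshape values.
response = requests.post(URL, data=audio_bytes, timeout=30)
response.raise_for_status()

frames = response.json().get("blendshapes", [])
print(f"Received {len(frames)} frames of blendshape coefficients")
```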
Alternatives and similar repositories for NeuroSync_Local_API:
Users interested in NeuroSync_Local_API are comparing it to the libraries listed below.
- The NeuroSync Player allows for real-time streaming of facial blendshapes into Unreal Engine 5 using LiveLink - enabling facial animation… ☆18 · Updated 3 weeks ago
- Use the NVIDIA Audio2Face headless server and interact with it through a requests API. Generate animation sequences for Unreal Engine 5, … ☆89 · Updated 6 months ago
- XVERSE Character UE plugin (XCharacter-UEPlugin), a 3D digital human creation plugin for Unreal Engine 5, developed by XVERSE Technology I… ☆32 · Updated 4 months ago
- ☆159 · Updated 2 years ago
- CLI tool for recording or replaying Epic Games' Live Link Face capture frames. ☆79 · Updated last year
- Virtual Actor: Motion capture for your 3D avatar. ☆92 · Updated 2 years ago
- Convert VMCProtocol to MOPProtocol ☆58 · Updated 3 years ago
- Create face depth frames from images using landmarks ☆30 · Updated 2 months ago
- Motion Capture runtime for UE4 ☆78 · Updated 3 years ago
- Send MediaPipe data to Unreal Engine with LiveLink. ☆38 · Updated last year
- ☆393 · Updated 7 months ago
- MediaPipe for UE library export ☆25 · Updated 2 years ago
- Oculus LipSync Plugin compiled for Unreal Engine 5. This plugin allows you to synchronize the lips of 3D characters in your game with aud… ☆85 · Updated last year
- Convert BVH motion files to UE animations at runtime. ☆31 · Updated last year
- An Unreal Engine 5 plugin for Hand Tracking using OpenXR extensions, and nothing more :) ☆23 · Updated 6 months ago
- LiveLink Source for receiving JSON over sockets (a minimal sender sketch follows this list). ☆103 · Updated 5 years ago
- Creates Live Link app blendshape data formatted as CSV from video, for facial motion capture ☆155 · Updated 11 months ago
- UE4 MediaPipe plugin ☆297 · Updated 2 years ago
- A UE5 plugin for improving the MetaHuman ARKit face tracking. ☆90 · Updated last year
- motion_generate_tools is a Blender add-on for generating motion using MDM: Human Motion Diffusion Model. ☆107 · Updated 2 years ago
- A Python script to modify the MetaHuman rig into a new one for Maya. ☆72 · Updated 2 months ago
- Oculus Lip Sync compiled for Unreal 5 ☆38 · Updated 9 months ago
- Add-on for Blender to import mocap data from tools like EasyMocap, FrankMocap and VIBE ☆108 · Updated 3 years ago
- A Python tool with facial landmark annotation and a coefficient finder ☆310 · Updated 3 years ago
- Emotionally responsive Virtual Metahuman CV with Real-Time User Facial Emotion Detection (Unreal Engine 5). ☆45 · Updated 2 weeks ago
- SAiD: Blendshape-based Audio-Driven Speech Animation with Diffusion ☆98 · Updated last year
- Phiz is a tool that allows you to perform facial motion capture from any device and location. ☆124 · Updated last year
- ☆181 · Updated 9 months ago
- ☆47 · Updated 2 years ago
- Demo project for NNEngine ☆10 · Updated 3 months ago
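For the JSON-over-sockets LiveLink source noted in the list above, a minimal sender sketch in Python follows. The port, subject name, and message layout are assumptions made for illustration; a real sender would need to match whatever schema the receiving plugin actually expects.

```python
# Hypothetical sender: pushes one frame of blendshape weights as JSON over UDP
# to a LiveLink source listening locally. The port, subject name, and message
# layout are assumed for illustration, not taken from any plugin's documentation.
import json
import socket
import time

LIVELINK_ADDR = ("127.0.0.1", 54321)   # assumed listen address of the LiveLink source
SUBJECT = "FaceSubject"                # assumed LiveLink subject name


def send_frame(sock: socket.socket, blendshapes: dict) -> None:
    """Serialize one frame of named blendshape weights and send it as a UDP datagram."""
    message = {
        "subject": SUBJECT,
        "timestamp": time.time(),
        "blendshapes": blendshapes,    # e.g. {"JawOpen": 0.42, "EyeBlinkLeft": 0.1}
    }
    sock.sendto(json.dumps(message).encode("utf-8"), LIVELINK_ADDR)


if __name__ == "__main__":
    udp = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_frame(udp, {"JawOpen": 0.42, "EyeBlinkLeft": 0.1})
    udp.close()
```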