AnimaVR / NeuroSync_Trainer_Lite
A multi-GPU audio2face blendshape AI model trainer for your iPhone ARKit data.
☆22 · Updated this week
Alternatives and similar repositories for NeuroSync_Trainer_Lite:
Users interested in NeuroSync_Trainer_Lite are comparing it to the libraries listed below.
- The NeuroSync Player allows for real-time streaming of facial blendshapes into Unreal Engine 5 using LiveLink, enabling facial animation… ☆81 · Updated last week
- Audio-to-face local inference helper code. ☆50 · Updated last month
- A collection of AI model endpoints you can run locally for a real-time audio2face system. Toy demonstration, not for production. Use this… ☆14 · Updated 3 weeks ago
- Use the NVIDIA Audio2Face headless server and interact with it through a requests API. Generate animation sequences for Unreal Engine 5, … ☆109 · Updated this week
- A service to convert audio to facial blendshapes for lip syncing and facial performances. ☆77 · Updated 3 weeks ago
- A UE5 plugin for improving the MetaHuman ARKit face tracking. ☆93 · Updated last year
- Emotionally responsive virtual MetaHuman CV with real-time user facial emotion detection (Unreal Engine 5). ☆45 · Updated 3 months ago
- Convert VMCProtocol to MOPProtocol. ☆58 · Updated 3 years ago
- An open solution for AI-powered photorealistic digital humans. ☆122 · Updated last year
- Oculus LipSync Plugin compiled for Unreal Engine 5. This plugin allows you to synchronize the lips of 3D characters in your game with aud… ☆94 · Updated 2 years ago
- Motion capture runtime for UE4. ☆79 · Updated 3 years ago
- Audio2Face avatar with Riva SDK functionality. ☆73 · Updated 2 years ago
- Send MediaPipe data to Unreal Engine with LiveLink. ☆44 · Updated 2 years ago
- This Unreal Engine sample project demonstrates how to bring Epic Games' MetaHuman digital characters to life using the Amazon Polly text-… ☆50 · Updated 2 years ago
- 3D avatar lip synchronization from speech (JALI-based face rigging). ☆81 · Updated 3 years ago
- Create face depth frames from images using landmarks. ☆35 · Updated 5 months ago
- Web interface to convert text to speech and route it to an Audio2Face streaming player. ☆33 · Updated last year
- LiveLink source for receiving JSON over sockets. ☆102 · Updated 5 years ago
- OSC Live Link for Unreal and Hallway. ☆25 · Updated 2 years ago
- Oculus Lip Sync compiled for Unreal 5. ☆42 · Updated last year
- Pose Asset with visemes for Epic's MetaHuman face skeleton. ☆51 · Updated 2 years ago
- Free UE5 MediaPipe plugin for motion capture and facial tracking. ☆12 · Updated 2 years ago
- Talk with an AI-powered, detailed 3D avatar. Uses an LLM, TTS, Unity, and lip sync to bring the character to life. ☆83 · Updated 4 months ago
- UE4 plugin and sample project to receive data from MocapForAll. ☆20 · Updated 3 years ago
- PyTorch reimplementation of audio-driven face mesh or blendshape models, including Audio2Mesh, VOCA, etc. ☆14 · Updated 8 months ago
- Blender addon that generates ARKit blendshapes for facial motion capture. ☆39 · Updated last year
- ☆163 · Updated 2 years ago
- MediaPipe library plugin for Unreal Engine 5. ☆30 · Updated 2 years ago
- CLI tool for recording or replaying Epic Games' Live Link Face capture frames. ☆80 · Updated last year
- OWL VTuber Studio for live streaming. ☆32 · Updated last year
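Several of the tools above stream facial animation as JSON frames over a socket (for example, the LiveLink source for receiving JSON over sockets). As a minimal, hypothetical sketch of what such a sender can look like, the snippet below packs ARKit-style blendshape weights into a JSON payload and sends it over UDP. The subject name, field names, and port are assumptions for illustration, not the exact schema any listed plugin requires; check each plugin's documentation for its real format.

```python
import json
import socket


def make_blendshape_frame(subject, weights):
    """Pack a subject name and a {blendshape: weight} dict into a JSON payload.

    ARKit exposes 52 blendshape coefficients in the 0.0-1.0 range; only a
    few are shown in the usage example below. The {"Property", "Value"}
    field names are a hypothetical schema, not any specific plugin's format.
    """
    return json.dumps(
        {subject: [{"Property": name, "Value": value}
                   for name, value in weights.items()]}
    )


def send_frame(payload, host="127.0.0.1", port=54321):
    """Send one JSON frame over UDP (fire-and-forget, like most LiveLink sources)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload.encode("utf-8"), (host, port))


# Build one frame with two ARKit blendshape names.
frame = make_blendshape_frame("iPhoneFace", {"jawOpen": 0.42, "eyeBlinkLeft": 0.9})
# send_frame(frame)  # uncomment with a listener running on the target port
```

UDP is the usual choice here because animation frames are time-critical and loss-tolerant: a dropped frame is simply replaced by the next one a few milliseconds later, so retransmission would only add latency.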