AnimaVR / NeuroSync_Trainer_Lite
A multi-GPU audio-to-face animation AI model trainer for your iPhone ARKit data.
☆37 · Updated 2 months ago
Alternatives and similar repositories for NeuroSync_Trainer_Lite
Users interested in NeuroSync_Trainer_Lite are comparing it to the libraries listed below.
- The NeuroSync Player allows for real-time streaming of facial blendshapes into Unreal Engine 5 using LiveLink, enabling facial animation… ☆121 · Updated 2 months ago
- NeuroSync audio-to-face animation local inference helper code. ☆70 · Updated 2 months ago
- Use the NVIDIA Audio2Face headless server and interact with it through a requests API. Generate animation sequences for Unreal Engine 5, … ☆125 · Updated 3 months ago
- A collection of AI model endpoints you can run locally for a real-time audio2face system. A toy demonstration, not for production. Use this… ☆23 · Updated 2 months ago
- A service to convert audio to facial blendshapes for lip syncing and facial performances. ☆108 · Updated last month
- ☆445 · Updated 3 months ago
- ☆168 · Updated 2 years ago
- Audio2Face avatar with Riva SDK functionality. ☆74 · Updated 2 years ago
- An open solution for AI-powered photorealistic digital humans. ☆130 · Updated 3 weeks ago
- 3D avatar lip synchronization from speech (JALI-based face rigging). ☆82 · Updated 3 years ago
- Emotionally responsive virtual MetaHuman CV with real-time user facial emotion detection (Unreal Engine 5). ☆47 · Updated 6 months ago
- CLI tool for recording or replaying Epic Games' Live Link Face capture frames. ☆81 · Updated last year
- Send MediaPipe data to Unreal Engine with LiveLink. ☆45 · Updated 2 years ago
- Tools to work with the Pose Camera app. ☆152 · Updated last year
- Integration for the OpenAI API in Unreal Engine. ☆36 · Updated 8 months ago
- Create face depth frames from images using landmarks. ☆36 · Updated 8 months ago
- Convert VMCProtocol to MOPProtocol. ☆57 · Updated 4 years ago
- XVERSE Character UE plugin (XCharacter-UEPlugin), a 3D digital human creation plugin for Unreal Engine 5, developed by XVERSE Technology I… ☆40 · Updated 11 months ago
- Virtual Actor: motion capture for your 3D avatar. ☆94 · Updated 2 years ago
- A UE5 plugin for improving the MetaHuman ARKit face tracking. ☆94 · Updated last year
- ☆227 · Updated 2 years ago
- Phiz is a tool that allows you to perform facial motion capture from any device and location. ☆133 · Updated 2 years ago
- ☆505 · Updated 2 years ago
- Using MediaPipe to create an OBJ of a face from a source image. ☆22 · Updated last year
- SAiD: Blendshape-based Audio-Driven Speech Animation with Diffusion. ☆116 · Updated last year
- NVIDIA ACE samples, workflows, and resources. ☆272 · Updated last month
- Drive your MetaHuman to speak within 1 second. ☆7 · Updated 4 months ago
- Talk with an AI-powered, detailed 3D avatar. Uses an LLM, TTS, Unity, and lip sync to bring the character to life. ☆125 · Updated 7 months ago
- This Unreal Engine sample project demonstrates how to bring Epic Games' MetaHuman digital characters to life using the Amazon Polly text-… ☆49 · Updated 2 years ago
- Official implementation for the SIGGRAPH Asia 2024 paper "SPARK: Self-supervised Personalized Real-time Monocular Face Capture". ☆379 · Updated last month
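Several of the projects above expose local audio-to-blendshape inference behind a small HTTP API that a game engine client polls or streams from. As a minimal sketch of what such a client request might look like (the endpoint path, field names, and `build_audio2face_request` helper are illustrative assumptions, not the actual API of NeuroSync, Audio2Face, or any listed repo):

```python
import base64
import json

def build_audio2face_request(wav_bytes: bytes, fps: int = 60) -> str:
    """Package raw WAV bytes into a JSON body for a hypothetical
    local audio2face endpoint (e.g. POST http://localhost:5000/infer).

    The schema here is an assumption for illustration only: audio is
    base64-encoded so it can travel inside JSON, and `fps` tells the
    server how many blendshape frames per second to generate.
    """
    payload = {
        "audio": base64.b64encode(wav_bytes).decode("ascii"),
        "fps": fps,
    }
    return json.dumps(payload)

if __name__ == "__main__":
    # Build a request body from a tiny placeholder audio buffer.
    body = build_audio2face_request(b"\x00\x01\x02\x03", fps=60)
    print(json.loads(body)["fps"])
```

A real client would then POST this body with a library such as `requests` and map the returned blendshape frames onto ARKit's 52 blendshape channels before forwarding them to Unreal Engine over LiveLink.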