SpookyCorgi / phiz
Phiz is a tool that allows you to perform facial motion capture from any device and location.
☆126 · Updated last year
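For context on what this kind of capture produces: tools in this space generally reduce each camera frame to a set of ARKit-style blend-shape weights. The sketch below is an illustration only, not Phiz's actual pipeline; it assumes MediaPipe's Face Landmarker task, a locally downloaded `face_landmarker.task` model, and a placeholder input image.

```python
# Minimal sketch (assumption: MediaPipe's Face Landmarker, not Phiz's own code).
# Requires `pip install mediapipe` and the face_landmarker.task model file from
# the MediaPipe model zoo; the file paths below are placeholders.
import mediapipe as mp
from mediapipe.tasks import python as mp_python
from mediapipe.tasks.python import vision

options = vision.FaceLandmarkerOptions(
    base_options=mp_python.BaseOptions(model_asset_path="face_landmarker.task"),
    output_face_blendshapes=True,  # ARKit-style blend-shape scores
    num_faces=1,
)
landmarker = vision.FaceLandmarker.create_from_options(options)

image = mp.Image.create_from_file("face.jpg")  # placeholder input frame
result = landmarker.detect(image)

if result.face_blendshapes:
    for category in result.face_blendshapes[0]:
        # Prints pairs such as "jawOpen 0.42", "eyeBlinkLeft 0.07", ...
        print(category.category_name, round(category.score, 2))
```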
Alternatives and similar repositories for phiz:
Users interested in phiz are comparing it to the libraries listed below.
- Web-first SDK that provides the 52 ARKit-compatible blend shapes in real time from a camera feed, video, or image at 60 FPS, using ML models. ☆84 · Updated 2 years ago
- MediaPipe landmarks to Mixamo skeleton. ☆33 · Updated last year
- ☆158 · Updated 2 years ago
- ☆108 · Updated last year
- Blender add-on that implements the VOCA neural network. ☆58 · Updated 2 years ago
- CLI tool for recording or replaying Epic Games' Live Link Face capture frames. ☆79 · Updated last year
- 3D avatar lip synchronization from speech (JALI-based face rigging). ☆78 · Updated 2 years ago
- Add-on for Blender to import mocap data from tools such as EasyMocap, FrankMocap, and VIBE. ☆108 · Updated 3 years ago
- Virtual Actor: motion capture for your 3D avatar. ☆93 · Updated 2 years ago
- openFACS: an open-source FACS-based 3D face animation system. ☆146 · Updated 3 years ago
- ☆94 · Updated 3 years ago
- Send MediaPipe data to Unreal Engine with Live Link (see the OSC sketch after this list). ☆41 · Updated last year
- Tools to work with the Pose Camera app. ☆147 · Updated 8 months ago
- ☆137 · Updated last year
- AI-based all-in-one character generator Blender plug-in; contains unofficial updates for the CEB_ECON Blender add-on. ☆88 · Updated last year
- Creates motion capture data with the MediaPipe Holistic AI solution and applies it to a 3D character in Blender. ☆70 · Updated 3 years ago
- Free face-tracking module for facial motion capture in Blender. ☆49 · Updated 2 years ago
- 3D face model that can generate high-quality meshes and textures. ☆277 · Updated 7 months ago
- Convert VMCProtocol to MOPProtocol. ☆58 · Updated 3 years ago
- Unreal Engine Live Link app + Apple ARKit blend shapes = Animoji clone. ☆39 · Updated 2 years ago
- A UE5 plugin for improving MetaHuman ARKit face tracking. ☆91 · Updated last year
- A Blender add-on that uses ROMP to extract 3D human poses from an image, video, or webcam and drive your own 3D character. ☆253 · Updated last year
- A Python tool for facial landmark annotation and coefficient finding. ☆309 · Updated 3 years ago
- Public repository for the CVPR 2020 paper AvatarMe and the TPAMI 2021 paper AvatarMe++. ☆170 · Updated last year
- ☆196 · Updated 3 years ago
- Implementation based on "Audio-Driven Facial Animation by Joint End-to-End Learning of Pose and Emotion". ☆162 · Updated 4 years ago
- Headbox tool for facial animation on Microsoft Rocketbox avatars. ☆47 · Updated 2 years ago
- Single-view real-time motion capture built upon Google MediaPipe. ☆210 · Updated last year
- OSC Live Link for Unreal and Hallway. ☆25 · Updated 2 years ago
- Source code for the authors' 3DRW 2019 paper. ☆80 · Updated 2 years ago
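Several entries above stream tracking data into an engine over the network (for example the MediaPipe-to-Live-Link sender and the OSC Live Link project). As a hedged sketch of that idea, assuming the python-osc package, a receiver listening on 127.0.0.1:9000, and a made-up `/blendshape/<name>` address pattern (none of this is taken from the repositories listed), per-frame weights could be forwarded like this:

```python
# Minimal sketch (assumptions: python-osc installed, an OSC receiver such as an
# OSC-to-Live-Link bridge listening on 127.0.0.1:9000, and a hypothetical
# "/blendshape/<name>" address pattern). Not taken from any repository above.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)  # host and port are placeholders

def send_blendshapes(weights: dict) -> None:
    """Forward one frame of blend-shape weights as individual OSC messages."""
    for name, value in weights.items():
        client.send_message(f"/blendshape/{name}", float(value))

# Example frame: a few ARKit-style blend-shape names with 0..1 weights.
send_blendshapes({"jawOpen": 0.42, "eyeBlinkLeft": 0.07, "mouthSmileLeft": 0.31})
```

The same pattern would work for any of the trackers listed here that expose per-frame coefficients; only the address layout and the receiving bridge differ.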