teobabic / Simo
An iPhone app enabling hand, head, eye and body motion tracking.
☆23 · Updated last year
Related projects:
- Unity AR Foundation demos with meshing functionality for the iPhone 12 Pro and iPad Pro LiDAR ☆19 · Updated 3 years ago
- Record face tracking data with ARKit Remote. ☆17 · Updated 5 years ago
- The NASA Perseverance rover in an Augmented Reality experience hosted on the web. Built in Unity with Photon PUN 2 for t… ☆15 · Updated last year
- Examples of how to use a scriptable object to set up blendshape mapping for ARKit blendshapes ☆14 · Updated 3 years ago
- AR Food Eater Prototype with Face Tracking ☆13 · Updated 4 years ago
- Simple experiment to capture depth data from the iPad Pro's LiDAR ☆26 · Updated 2 years ago
- Different demos with Unity MARS Body Tracking features ☆13 · Updated 2 years ago
- A sample C# project that extracts depth and color information from videos shot in iPhone's Cinematic mode and outputs each as sep… ☆36 · Updated 11 months ago
- Using Graphics Buffer input with Azure Kinect Point Cloud ☆14 · Updated last year
- Unity3D monocular depth estimation using Barracuda ☆22 · Updated 2 years ago
- Demos with Unity MARS ☆14 · Updated 4 years ago
- ARDK Lightship demos created during YouTube video tutorials ☆12 · Updated last year
- A few demos with Unity visionOS 2D Window and Fully Immersive VR modes. ☆17 · Updated 5 months ago
- iPhone/iPad -> Unity VFX Graph demo (pre-recorded 3D video playback) ☆33 · Updated 2 years ago
- Preconfigured project to show the demo scene of the Apple Vision UI Kit package in XR. Project is set up to build for Oculus Quest 2 / Qu… ☆30 · Updated 6 months ago
- Unity plugin to obtain a depth map from the iPhone camera. ☆50 · Updated 3 years ago
- ☆21 · Updated 4 years ago
- An MVP project applying all lessons learned through the Learn XR VR Course. ☆13 · Updated last year
- A basic Unity demo created to test Meta Depth API features with URP (Universal Render Pipeline) ☆10 · Updated 10 months ago
- Send sensor data from ARKit or ARCore to Grasshopper via Wi-Fi ☆14 · Updated 3 years ago
- A body tracking system using machine learning in Unity (via Barracuda) to drive a 3D avatar model ☆15 · Updated last year
- A Unity demo app that illustrates full opacity for hands with background elimination. ☆17 · Updated 7 years ago
- Unity plugin for real-time mesh reconstruction with an RGBD camera. ☆27 · Updated 3 years ago
- Unity Oculus Quest VR hand tracking template ☆33 · Updated 4 years ago
- RenderStream plugin for Unity ☆38 · Updated 10 months ago
- Visualizing a point cloud using scene depth in Unity, similar to the WWDC20 demo. ☆64 · Updated 3 years ago
- Physics-based interaction and manipulation system for Oculus Quest in Unity. ☆12 · Updated 2 years ago
- ☆12 · Updated last year
- HoloLens with DlibFaceLandmarkDetector example (supports HoloLens 1 and HoloLens 2) ☆29 · Updated last year
- A procedural generation demo with Augmented Reality ☆17 · Updated 5 years ago