tukuyo / AirPodsPro-Motion-Sampler
This project is a sample for reading the motion sensor values of the AirPods Pro (1st or 2nd gen), AirPods Max, AirPods (3rd gen), or Beats Fit Pro.
☆313 · Updated 2 years ago
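The headline project (and several of the samples below) reads head-tracking data through Core Motion's `CMHeadphoneMotionManager`. A minimal sketch of that API, assuming iOS 14+ and a pair of supported headphones connected; the handler body is illustrative, not code from the repository:

```swift
import CoreMotion

// Minimal sketch: stream attitude data from connected AirPods.
// Requires iOS 14+, the Motion usage-description key in Info.plist,
// and headphones that expose motion data (e.g. AirPods Pro).
let manager = CMHeadphoneMotionManager()

guard manager.isDeviceMotionAvailable else {
    print("No compatible headphones connected")
    fatalError("Headphone motion unavailable")
}

manager.startDeviceMotionUpdates(to: .main) { motion, error in
    guard let motion = motion else { return }
    // Attitude angles are reported in radians.
    let a = motion.attitude
    print("pitch: \(a.pitch), roll: \(a.roll), yaw: \(a.yaw)")
}
```

Updates keep arriving on the given `OperationQueue` until `stopDeviceMotionUpdates()` is called; the same `CMDeviceMotion` payload also carries rotation rate, user acceleration, and gravity.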
Alternatives and similar repositories for AirPodsPro-Motion-Sampler
Users interested in AirPodsPro-Motion-Sampler are comparing it to the libraries listed below.
- Test Swift's AirPods Motion API in this sample project ☆91 · Updated 4 years ago
- A minimal iOS AR app that can be used as a template when creating an AR app for the first time. ☆120 · Updated 3 years ago
- A sample collection of basic functions of Apple's AR framework for iOS. ☆176 · Updated 3 years ago
- A Swift package making it easy to implement body tracking in ARKit and RealityKit. ☆86 · Updated last year
- A compact visionOS project that offers a deep dive into the platform's core functionalities. ☆202 · Updated 2 years ago
- Explore visionOS: a collection of code samples, tutorials, and valuable resources. ☆134 · Updated last year
- HumanScan for the WWDC 21 Swift Student Challenge. ☆64 · Updated 4 years ago
- Examples of how to build things in visionOS. ☆233 · Updated 2 years ago
- A sheet showing the evolution of Apple frameworks: Metal, ARKit, and RealityKit. ☆168 · Updated 2 months ago
- Hand-gesture matching for Apple Vision Pro; test hand tracking in the simulator. ☆195 · Updated 2 weeks ago
- Beautiful visionOS UI designs based on the Apple Human Interface Guidelines, plus an animation cheat sheet and inspirations for your next proj… ☆242 · Updated last year
- Real hands for the visionOS Simulator. ☆63 · Updated 2 years ago
- Metal Shading Language (MSL) examples. ☆182 · Updated 4 years ago
- An iOS framework for eye tracking on iPhone using ARKit. ☆93 · Updated 2 years ago
- A demo project to help everyone learn visionOS development by creating an Inspiration4 fan app. ☆116 · Updated 2 years ago
- An iOS app that demonstrates how to use headphone motion data. ☆33 · Updated 5 years ago
- A collection of Shader Graph materials for visionOS. ☆224 · Updated last year
- An iOS app that generates images using Stable Diffusion and displays them in AR. ☆103 · Updated 2 years ago
- ARKit in visionOS examples. ☆128 · Updated last year
- EyeTracking is a Swift package that makes it easy to use ARKit's eye and facial tracking data, designed for use in Educational Technology… ☆105 · Updated 5 years ago
- [WWDC 22 Swift Student Challenge Winner] Pegboard, a SwiftUI-based node-editor workspace for game dev and more. ☆251 · Updated 3 years ago
- Bringing the scanning box from SceneKit to RealityKit. ☆400 · Updated 7 months ago
- A minimal example of rendering an immersive spatial experience with Metal, ARKit, and visionOS Compositor Services. ☆223 · Updated last year
- An example spatial/immersive MV-HEVC video player for Apple Vision Pro. ☆272 · Updated last year
- A visionOS 2 30-day challenge. ☆185 · Updated 8 months ago
- Render 3D models with SwiftUI effortlessly. ☆262 · Updated last year
- A launch screen made with SwiftUI and RealityKit, used in the Find app. ☆245 · Updated 3 years ago
- The ultimate companion to MusicKit. ☆446 · Updated last week
- A tape-measure app using hand tracking for Apple Vision Pro. ☆117 · Updated 8 months ago
- Scripts to convert 2D images and videos into spatial versions. ☆108 · Updated last year