tukuyo / AirPodsPro-Motion-Sampler
This project is a sample that reads motion sensor values from the AirPods Pro (1st or 2nd gen), AirPods Max, AirPods (3rd gen), or Beats Fit Pro.
☆310 · Updated 2 years ago
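These samplers are typically built on Core Motion's `CMHeadphoneMotionManager` (iOS 14+), which streams head attitude, rotation rate, and acceleration from supported headphones. A minimal sketch, not the repo's actual code — function names here are illustrative, and availability checks plus the `NSMotionUsageDescription` Info.plist key are required on a real device:

```swift
import CoreMotion

// Manager for headphone (AirPods / Beats) motion data.
let manager = CMHeadphoneMotionManager()

func startSampling() {
    guard manager.isDeviceMotionAvailable else {
        print("Headphone motion is not available on this device")
        return
    }
    // Deliver updates on the main queue; each CMDeviceMotion carries
    // attitude, rotationRate, userAcceleration, and gravity.
    manager.startDeviceMotionUpdates(to: .main) { motion, error in
        guard let motion = motion, error == nil else { return }
        let a = motion.attitude  // head orientation in radians
        print(String(format: "pitch: %.2f  roll: %.2f  yaw: %.2f",
                     a.pitch, a.roll, a.yaw))
    }
}

func stopSampling() {
    manager.stopDeviceMotionUpdates()
}
```

Updates stop arriving when the headphones are removed or disconnected; a `CMHeadphoneMotionManagerDelegate` can observe those connect/disconnect events.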
Alternatives and similar repositories for AirPodsPro-Motion-Sampler
Users interested in AirPodsPro-Motion-Sampler are comparing it to the libraries listed below.
- A minimal iOS AR app that can be used as a template when creating an AR app for the first time. ☆119 · Updated 2 years ago
- Test Swift's AirPods motion API in this sample project. ☆86 · Updated 3 years ago
- A compact visionOS project that offers a deep dive into the platform's core functionalities. ☆202 · Updated 2 years ago
- An iOS app that demonstrates how to use headphone motion data. ☆33 · Updated 4 years ago
- Beautiful visionOS UI designs based on the Apple Human Interface Guidelines, plus an animation cheat sheet and inspirations for your next proj… ☆241 · Updated last year
- An iOS framework for eye tracking on iPhone using ARKit. ☆93 · Updated 2 years ago
- Match hand gestures for Apple Vision Pro; test hand tracking in the simulator. ☆195 · Updated 2 months ago
- A sheet showing the evolution of Apple frameworks: Metal, ARKit, and RealityKit. ☆168 · Updated 3 weeks ago
- Explore visionOS: a collection of code samples, tutorials, and valuable resources. ☆134 · Updated last year
- HumanScan for the WWDC21 Swift Student Challenge. ☆64 · Updated 4 years ago
- Examples of how to build things in visionOS. ☆233 · Updated last year
- A sample collection of basic functions of Apple's AR framework for iOS. ☆175 · Updated 3 years ago
- Metal Shading Language (MSL) examples. ☆180 · Updated 4 years ago
- EyeTracking is a Swift package that makes it easy to use ARKit's eye and facial tracking data, designed for use in Educational Technology… ☆102 · Updated 5 years ago
- A demo project to help everyone learn visionOS development by creating an Inspiration4 fan app. ☆115 · Updated 2 years ago
- A Swift package making it easy to implement body tracking in ARKit and RealityKit. ☆86 · Updated last year
- Quickly add MediaPipe pose estimation and detection to your iOS app, enabling powerful body- and hand-driven features. ☆297 · Updated 2 months ago
- ARKit in visionOS examples. ☆127 · Updated last year
- Render 3D models with SwiftUI effortlessly. ☆262 · Updated last year
- An iOS app that generates images using Stable Diffusion and displays them in AR. ☆103 · Updated 2 years ago
- Eye tracking with ARKit: a SwiftUI iOS app. ☆34 · Updated 2 years ago
- A launch screen made with SwiftUI and RealityKit, used in the Find app. ☆244 · Updated 3 years ago
- visionOS examples: spatial computing accelerators for Apple Vision Pro. ☆388 · Updated 7 months ago
- A visionOS 2 30-day challenge. ☆183 · Updated 7 months ago
- Bringing the scanning box from SceneKit to RealityKit. ☆400 · Updated 5 months ago
- Real hands for the visionOS simulator. ☆63 · Updated last year
- An example spatial/immersive MV-HEVC video player for Apple Vision Pro. ☆272 · Updated last year
- Scripts to convert 2D images and videos into spatial versions. ☆107 · Updated last year
- A workshop for visionOS development. ☆151 · Updated last year
- Create, save, and export point clouds with LiDAR-equipped iPhones. ☆118 · Updated 2 years ago