ZexinChen / AlphaTracker
AlphaTracker is a practical, real-time computer vision pipeline that requires minimal hardware and produces reliable tracking of multiple unmarked animals. An easy-to-use interface further enables manual inspection and curation of results.
☆69 · Updated last year
Alternatives and similar repositories for AlphaTracker:
Users interested in AlphaTracker are comparing it to the libraries listed below.
- An active learning platform for expert-guided, data-efficient discovery of behavior. ☆58 · Updated this week
- Demo code for CAPTURE and DANNCE ☆28 · Updated 3 years ago
- End-user version of the Mouse Action Recognition System (MARS) ☆62 · Updated last year
- Closed-loop behavioral experiment toolkit using pose estimation of body parts. ☆55 · Updated last year
- A GUI for browsing joint calcium imaging + behavioral data. ☆51 · Updated 3 months ago
- Code package that allows meta-analyses of unsupervised behavior results ☆16 · Updated 8 months ago
- DLC2Action is an action segmentation package that makes running and tracking of machine learning experiments easy. ☆25 · Updated last year
- A tool for drawing ROIs on videos and analysing DeepLabCut videos. ☆33 · Updated 4 years ago
- A module for kinematic analysis of DeepLabCut outputs ☆139 · Updated last year
- ☆31 · Updated 4 years ago
- Free, platform-independent behavior tracking software. ☆126 · Updated 9 months ago
- Various scripts to support DeepLabCut and what to do afterwards! ☆154 · Updated last year
- ☆87 · Updated last week
- PyRat is a user-friendly Python library for analyzing DeepLabCut data, developed to help researchers unfamiliar with programming… ☆47 · Updated last year
- A Python API to record video & system timestamps from Imaging Source USB cameras ☆67 · Updated 2 years ago
- DeepLabCut-based data analysis package including pose estimation and representation-learning-mediated behavior recognition ☆44 · Updated this week
- Repository for BehaviorDEPOT pipeline app files and source code ☆37 · Updated 6 months ago
- MoSeq2 Jupyter Notebook platform used to run all of the MoSeq2 tools in a GUI. ☆34 · Updated 3 months ago
- ☆104 · Updated 3 months ago
- [ECCV 2024] "Elucidating the Hierarchical Nature of Behavior with Masked Autoencoders" ☆20 · Updated last month
- For developers and people training MARS models ☆18 · Updated last year
- R scripts to analyze DeepLabCut point data and related data ☆66 · Updated 2 years ago
- Labeling GUI for multi-camera tracking ☆37 · Updated 7 months ago
- Variational Animal Motion Embedding - A tool for time series embedding and clustering ☆184 · Updated 6 months ago
- Configurable acquisition from multiple FLIR/Point Grey cameras, using Bonsai and Python to H.264-compress the video on the GPU. ☆13 · Updated last month
- SIPEC: the deep-learning Swiss knife for behavioral data analysis ☆45 · Updated 7 months ago
- GUI to run DeepLabCut on live video feed ☆61 · Updated 6 months ago
- Modified Python 3.0 implementation of MotionMapper (https://github.com/gordonberman/MotionMapper) ☆29 · Updated 10 months ago
- ☆32 · Updated 5 years ago
- ☆51 · Updated 7 months ago