i-am-shreya / Eye-Gaze-Survey
[TPAMI] Automatic Gaze Analysis ‘in-the-wild’: A Survey
☆120 · Updated May 29, 2024
Alternatives and similar repositories for Eye-Gaze-Survey
Users interested in Eye-Gaze-Survey are comparing it to the repositories listed below.
- Updated code for the paper "Revisiting Data Normalization for Appearance-Based Gaze Estimation" · ☆19 · Updated Aug 17, 2025
- Awesome Curated List of Eye Gaze Estimation Papers · ☆522 · Updated Jun 14, 2025
- ☆45 · Updated Sep 18, 2021
- ☆44 · Updated Nov 27, 2021
- Official implementation of the ETH-XGaze dataset baseline · ☆223 · Updated Aug 9, 2024
- An unofficial PyTorch implementation of MPIIGaze and MPIIFaceGaze · ☆374 · Updated Mar 15, 2024
- PyTorch implementation and demo of FAZE: Few-Shot Adaptive Gaze Estimation (ICCV 2019, oral) · ☆348 · Updated Jun 23, 2024
- ☆48 · Updated Aug 17, 2021
- This repository collects links to all of my work on gaze estimation; all updates appear on this page · ☆24 · Updated May 18, 2022
- Gaze estimation using MPIIGaze and MPIIFaceGaze · ☆359 · Updated Jun 29, 2024
- Gaze decomposition for appearance-based gaze estimation · ☆12 · Updated Mar 15, 2020
- Repository for the GazeVisual performance evaluation software tools · ☆10 · Updated Jul 30, 2019
- Lightweight gaze estimation with PyTorch · ☆98 · Updated Dec 3, 2024
- RT-GENE: Real-Time Eye Gaze and Blink Estimation in Natural Environments · ☆431 · Updated Oct 10, 2024
- Code and models from "Gaze Estimation Using Transformer" (ICPR 2022) · ☆142 · Updated May 30, 2024
- ☆56 · Updated Dec 14, 2021
- Code for the CVPRW GAZE 2021 paper "GOO: A Dataset for Gaze Object Prediction in Retail Environments" · ☆50 · Updated Apr 23, 2024
- Gaze estimation code: a PyTorch implementation of "It's Written All Over Your Face: Full-Face Appearance-Based Gaze Estimation" · ☆39 · Updated Apr 28, 2021
- PyTorch implementation of the STAGE model · ☆17 · Updated Mar 17, 2025
- AR/VR eye semantic segmentation, ranked 5th of 17 in OpenEDS 2019 · ☆55 · Updated Jun 16, 2021
- Gaze estimation implemented with PFLD · ☆21 · Updated Aug 13, 2021
- Weakly-Supervised Physically Unconstrained Gaze Estimation · ☆26 · Updated Mar 9, 2021
- ☆47 · Updated Apr 17, 2024
- Demonstration of the winning model for the Facebook OpenEDS semantic segmentation challenge, which achieves a highly accurate (95.3%) within … · ☆14 · Updated Apr 25, 2020
- ☆18 · Updated Sep 22, 2022
- A deep-learning-based gaze estimation framework implemented with PyTorch · ☆193 · Updated Feb 26, 2020
- Gaze estimation code: a PyTorch implementation of "MPIIGaze: Real-World Dataset and Deep Appearance-Based Gaze Estimation" · ☆91 · Updated Apr 28, 2021
- Code for the Gaze360: Physically Unconstrained Gaze Estimation in the Wild dataset · ☆260 · Updated Jan 27, 2022
- Mixed Effects Neural Networks (MeNets) with applications to gaze estimation · ☆29 · Updated Jun 19, 2019
- An implementation of Google's paper "Accelerating eye movement research via accurate and affordable smartphone eye tracking" for GSoC 2… · ☆33 · Updated Oct 5, 2022
- Adaptive Feature Fusion Network for gaze tracking on mobile tablets · ☆49 · Updated Dec 13, 2021
- Gaze estimation code: a PyTorch implementation of "Gaze360: Physically Unconstrained Gaze Estimation in the Wild" · ☆33 · Updated Sep 13, 2022
- Gaze Estimation via Deep Neural Networks · ☆123 · Updated Nov 20, 2020
- Gaze estimation code: a PyTorch implementation of "Appearance-Based Gaze Estimation Using Dilated-Convolutions" · ☆24 · Updated Apr 28, 2021
- Method for the GAZE 2021 competition on the EVE dataset · ☆33 · Updated Aug 2, 2021
- Eye Tracking for Everyone · ☆1,037 · Updated Jul 6, 2023
- ☆16 · Updated Apr 25, 2020
- Mirror of https://es-git.cs.uni-tuebingen.de/santini/EyeRecToo · ☆41 · Updated Aug 20, 2020
- Towards End-to-end Video-based Eye-tracking (ECCV 2020, https://ait.ethz.ch/eve) · ☆128 · Updated Jun 26, 2023