chenjshnn / LabelDroid
Unblind Your Apps: Predicting Natural-Language Labels for Mobile GUI Components by Deep Learning
☆46 · Updated last year
Related projects
Alternatives and complementary repositories for LabelDroid
- Screen2Vec is a new self-supervised technique for generating more comprehensive semantic embeddings of GUI screens and components using t… ☆64 · Updated last year
- Conv Net for identifying GUI components from screenshots using TensorFlow ☆12 · Updated last year
- Seq2act: Mapping Natural Language Instructions to Mobile UI Action Sequences, from Google Research ☆12 · Updated 4 years ago
- Object Detection for Graphical User Interface: Old Fashioned or Deep Learning or a Combination? ☆121 · Updated 9 months ago
- VINS: Visual Search for Mobile User Interface Design ☆31 · Updated 3 years ago
- The Themis Benchmark for evaluating automated GUI testing ☆143 · Updated 10 months ago
- ☆10 · Updated 2 years ago
- Explore Android apps like a human. ☆120 · Updated last year
- ☆11 · Updated 6 months ago
- ☆36 · Updated 5 years ago
- This repository contains the open-source version of the datasets that were used for different parts of training and testing of models that grou… ☆29 · Updated 4 years ago
- Owl Eyes: Spotting UI Display Issues via Visual Understanding ☆11 · Updated 4 years ago
- The dataset includes UI object type labels (e.g., BUTTON, IMAGE, CHECKBOX) that describe the semantic type of a UI object on Android ap… ☆45 · Updated 2 years ago
- ☆11 · Updated last year
- Data-driven Accessibility Repair Revisited: On the Effectiveness of Generating Labels for Icons in Android Apps ☆10 · Updated 3 years ago
- ☆54 · Updated 3 years ago
- Mobile App Analysis and Testing Literature ☆70 · Updated last month
- Black-box tool that uses Deep Reinforcement Learning to test and explore Android applications