aditya-malte / Microsoft.AI.Challenge.Machine-Reading-Comprehension-using-BERT
This code fine-tunes BERT (a Transformer-based neural network architecture) to pick, from ten candidate choices, the answer that best addresses a given question.
☆12 · Updated 6 years ago
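As a rough illustration of the approach described above, here is a minimal sketch of multiple-choice answer selection with BERT. It uses the Hugging Face `transformers` library rather than this repository's own code, and the checkpoint name, question, and candidate answers are all illustrative assumptions.

```python
import torch
from transformers import BertTokenizer, BertForMultipleChoice

# Illustrative checkpoint; the multiple-choice head is untrained until fine-tuned.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMultipleChoice.from_pretrained("bert-base-uncased")

# Toy example; the challenge pairs each question with ten candidates.
question = "What architecture does BERT build on?"
choices = ["The Transformer encoder", "A recurrent network", "A convolutional network"]

# Encode the question against every candidate answer.
enc = tokenizer([question] * len(choices), choices, padding=True, return_tensors="pt")
# BertForMultipleChoice expects tensors of shape (batch, num_choices, seq_len).
inputs = {k: v.unsqueeze(0) for k, v in enc.items()}

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_choices)

print("Predicted answer:", choices[logits.argmax(dim=-1).item()])
```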
Alternatives and similar repositories for Microsoft.AI.Challenge.Machine-Reading-Comprehension-using-BERT
Users interested in Microsoft.AI.Challenge.Machine-Reading-Comprehension-using-BERT are comparing it to the libraries listed below.
- ☆47 · Updated 5 years ago
- Comparing text classification results using BERT and ULMFiT embeddings ☆65 · Updated 6 years ago
- Machine reading comprehension with deep learning ☆20 · Updated 7 years ago
- Official code of the paper "Robust, Transferable Sentence Representations for Text Classification" [arXiv 2018] ☆22 · Updated 7 years ago
- ☆35 · Updated 4 years ago
- Facilitates the learning, practicing, and designing of neural text matching models with a user-friendly and interactive interface ☆41 · Updated 3 years ago
- Bidirectional Attention Flow for Machine Comprehension implemented in Keras 2 ☆64 · Updated 3 years ago
- RNN-free TensorFlow model for reading comprehension on SQuAD ☆33 · Updated 3 years ago
- Uses pretrained BERT embeddings for transfer learning in the QA domain ☆29 · Updated 7 years ago
- PyTorch implementations of various deep learning models for paraphrase detection, semantic similarity, and textual entailment ☆107 · Updated 7 years ago
- BERT (Bidirectional Encoder Representations from Transformers) is the state of the art in transfer learning for NLP ☆56 · Updated 5 years ago
- CapsNet for NLP ☆66 · Updated 7 years ago
- A module for e-mail summarization that clusters skip-thought sentence embeddings (see the sketch below).
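A minimal sketch of the clustering idea behind that last entry: embed each sentence, cluster the embeddings, and keep the sentence nearest each centroid as the extractive summary. Since skip-thought encoders are not readily packaged, this sketch substitutes a `sentence-transformers` encoder; the model name and sample e-mail are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sentence_transformers import SentenceTransformer

def summarize(sentences, n_clusters=3):
    """Extractive summary: one representative sentence per embedding cluster."""
    encoder = SentenceTransformer("all-MiniLM-L6-v2")  # stand-in for skip-thoughts
    embeddings = encoder.encode(sentences)
    k = min(n_clusters, len(sentences))
    km = KMeans(n_clusters=k).fit(embeddings)
    picked = []
    for c in range(k):
        members = np.where(km.labels_ == c)[0]
        dists = np.linalg.norm(embeddings[members] - km.cluster_centers_[c], axis=1)
        picked.append(members[dists.argmin()])  # sentence nearest the centroid
    return [sentences[i] for i in sorted(picked)]  # restore original order

email = [
    "Hi team, hope everyone is well.",
    "The Q3 report is attached to this message.",
    "Please review the revenue section before Friday.",
    "Let me know if any figures look off.",
    "Thanks, and have a good weekend.",
]
print(" ".join(summarize(email)))
```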