nikhilno1 / nlp_projects
A collection of my NLP projects
☆19 · Updated 6 years ago
Alternatives and similar repositories for nlp_projects
Users interested in nlp_projects are comparing it to the libraries listed below.
- Use fastai-v2 with HuggingFace's pretrained transformers ☆110 · Updated 5 years ago
- Step-by-step instructions to integrate the power of BERT with fastai ☆18 · Updated 6 years ago
- Text Similarity Search Application using Modern NLP and Elasticsearch ☆30 · Updated 5 years ago
- On Generating Extended Summaries of Long Documents ☆78 · Updated 4 years ago
- ☆19 · Updated 5 years ago
- KitanaQA: Adversarial training and data augmentation for neural question-answering models ☆56 · Updated 2 years ago
- This is the second part of the Deep Learning Course for the Master in High-Performance Computing (SISSA/ICTP). ☆33 · Updated 5 years ago
- NeuralQA: A Usable Library for Question Answering on Large Datasets with BERT ☆234 · Updated 2 years ago
- Neural Search System on Arxiv AI/ML Papers ☆54 · Updated 4 years ago
- Explainable Zero-Shot Topic Extraction ☆63 · Updated last year
- Code to run the ExtRA algorithm for unsupervised topic/aspect extraction on English texts. ☆54 · Updated 2 weeks ago
- ☆16 · Updated 4 years ago
- Dynamic ensemble decoding with transformer-based models ☆29 · Updated 2 years ago
- Easy-to-use text representation extraction library based on the Transformers library. ☆32 · Updated 3 years ago
- The ntentional blog - a machine learning journey ☆23 · Updated 3 years ago
- Alternate Implementation for Zero Shot Text Classification: Instead of reframing NLI/XNLI, this reframes the text backbone of CLIP models… ☆37 · Updated 3 years ago (see the NLI-based sketch after this list)
- NLP Examples using the 🤗 libraries ☆40 · Updated 4 years ago
- NLP tool to extract emotional phrases from tweets 🤩 ☆40 · Updated 4 years ago
- Deploy transformers serverless on AWS Lambda ☆122 · Updated 4 years ago
- A monolingual and cross-lingual meta-embedding generation and evaluation framework ☆80 · Updated 3 years ago
- Low-code pre-built pipelines for experiments with huggingface/transformers for Data Scientists in a rush.
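
Several repositories above deal with zero-shot text classification. For context, here is a minimal sketch of the standard NLI-based approach via the HuggingFace `transformers` pipeline, i.e. the baseline that the CLIP-backbone repo above offers an alternative to. The model checkpoint (`facebook/bart-large-mnli`) and the example inputs are illustrative assumptions, not taken from any repository in this list.

```python
# Minimal sketch: NLI-based zero-shot classification with the
# HuggingFace `transformers` pipeline. Model choice and inputs are
# illustrative assumptions, not code from any repo listed above.
from transformers import pipeline

classifier = pipeline(
    "zero-shot-classification",
    model="facebook/bart-large-mnli",  # assumed NLI checkpoint
)

result = classifier(
    "The new GPU doubles training throughput on large language models.",
    candidate_labels=["hardware", "politics", "sports"],
)

# Labels come back sorted by score; the top entry is the prediction.
print(result["labels"][0], result["scores"][0])
```

The pipeline scores each candidate label by treating it as an NLI hypothesis ("This text is about {label}.") against the input text; the CLIP-based repo above swaps this NLI backbone for CLIP's text encoder instead.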