urchade / GLiNER
Generalist and Lightweight Model for Named Entity Recognition (Extract any entity types from texts) @ NAACL 2024
☆1,978 · Updated this week
Alternatives and similar repositories for GLiNER:
Users interested in GLiNER are comparing it to the libraries listed below.
- A lightweight, low-dependency, unified API to use all common reranking and cross-encoder models. ☆1,400 · Updated last month
- Fast lexical search implementing BM25 in Python using Numpy, Numba and Scipy. ☆1,139 · Updated 2 weeks ago
- Retrieve, Read and LinK: Fast and Accurate Entity Linking and Relation Extraction on an Academic Budget (ACL 2024). ☆417 · Updated 7 months ago
- Easily use and train state of the art late-interaction retrieval methods (ColBERT) in any RAG pipeline. Designed for modularity and ease of use. ☆3,426 · Updated 2 months ago
- Lite & Super-fast re-ranking for your search & retrieval pipelines. Supports SoTA Listwise and Pairwise reranking based on LLMs and cross-encoders. ☆788 · Updated 5 months ago
- Distilabel is a framework for synthetic data and AI feedback for engineers who need fast, reliable and scalable pipelines based on verified research papers. ☆2,671 · Updated last week
- Bringing BERT into modernity via both architecture changes and scaling. ☆1,342 · Updated last month
- LOTUS: A semantic query engine for fast and easy LLM-powered data processing. ☆1,169 · Updated this week
- Developer APIs to Accelerate LLM Projects. ☆1,642 · Updated 6 months ago
- 🦙 Integrating LLMs into structured NLP pipelines. ☆1,240 · Updated 3 months ago
- Efficient Retrieval Augmentation and Generation Framework. ☆1,531 · Updated 3 months ago
- Use late-interaction multi-modal models such as ColPali in just a few lines of code. ☆776 · Updated 3 months ago
- The code used to train and run inference with the ColVision models, e.g. ColPali, ColQwen2, and ColSmol. ☆1,787 · Updated this week
- Fast Semantic Text Deduplication & Filtering. ☆654 · Updated last week
- Recipes for shrinking, optimizing, customizing cutting edge vision models. 💜 ☆1,417 · Updated last month
- High-performance retrieval engine for unstructured data