EleutherAI / pile-pubmedcentral
A script for collecting the PubMed Central dataset in a language-modelling-friendly format.
☆24 · Updated 4 years ago
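For orientation, here is a minimal sketch of what a collection step of this kind can look like: it converts already-downloaded PMC JATS .nxml articles into a plain-text JSONL file suitable for language-model training. This is not the repository's actual script; the directory name, output filename, and the simple text/source record layout are illustrative assumptions.

```python
# Minimal sketch, NOT the repository's actual script. Assumes PMC JATS .nxml
# files have already been downloaded; paths and record layout are illustrative.
import json
from pathlib import Path
from xml.etree import ElementTree as ET


def nxml_to_text(path: Path) -> str:
    """Extract the article title, abstract, and body text from one JATS XML file."""
    try:
        root = ET.parse(path).getroot()
    except ET.ParseError:
        # Some PMC files use DTD entities that ElementTree cannot resolve;
        # a more robust pipeline would parse with lxml instead.
        return ""
    parts = []
    title = root.find(".//title-group/article-title")
    if title is not None:
        parts.append(" ".join(title.itertext()).strip())
    for node in root.findall(".//abstract") + root.findall(".//body"):
        parts.append(" ".join(node.itertext()).strip())
    return "\n\n".join(p for p in parts if p)


def build_corpus(in_dir: str = "pmc_nxml", out_file: str = "pmc_text.jsonl") -> None:
    """Write one JSON line per article: {"text": ..., "source": ...}."""
    with open(out_file, "w", encoding="utf-8") as out:
        for nxml in sorted(Path(in_dir).glob("**/*.nxml")):
            text = nxml_to_text(nxml)
            if text:
                out.write(json.dumps({"text": text, "source": nxml.name}) + "\n")


if __name__ == "__main__":
    build_corpus()
```

The real pipeline also covers downloading and filtering the articles; the sketch only illustrates the conversion to a plain-text, one-record-per-article format.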
Alternatives and similar repositories for pile-pubmedcentral
Users interested in pile-pubmedcentral are comparing it to the libraries listed below.
- Download, parse, and filter PubMed data, data-ready for The-Pile ☆23 · Updated 3 years ago
- Dataset and evaluation suite enabling LLM instruction-following for scientific literature understanding. ☆42 · Updated 6 months ago
- Embedding Recycling for Language models ☆39 · Updated 2 years ago
- ☆28 · Updated 6 months ago
- Transformers at any scale ☆41 · Updated last year
- Google Research ☆46 · Updated 2 years ago
- Adding new tasks to T0 without catastrophic forgetting ☆33 · Updated 2 years ago
- Few-shot Learning with Auxiliary Data ☆31 · Updated last year
- This is the official PyTorch repo for "UNIREX: A Unified Learning Framework for Language Model Rationale Extraction" (ICML 2022). ☆26 · Updated 2 years ago
- ☆49 · Updated 3 years ago
- Medical reasoning using large language models ☆90 · Updated last year
- Implementation of the model: "Reka Core, Flash, and Edge: A Series of Powerful Multimodal Language Models" in PyTorch ☆29 · Updated this week
- Parkar and Kim et al.'s paper "Can LLMs Select Important Instructions to Annotate?" ☆12 · Updated last year
- [NeurIPS 2023 Main Track] This is the repository for the paper titled "Don’t Stop Pretraining? Make Prompt-based Fine-tuning Powerful Lea…" ☆75 · Updated last year
- Official implementation of the ACL 2024 paper "Scientific Inspiration Machines Optimized for Novelty" ☆85 · Updated last year
- Minimum Description Length probing for neural network representations ☆18 · Updated 7 months ago
- Prompt-Guided Retrieval For Non-Knowledge-Intensive Tasks ☆12 · Updated 2 years ago
- This repository contains some of the code used in the paper "Training Language Models with Language Feedback at Scale" ☆27 · Updated 2 years ago
- Pretraining Efficiently on S2ORC! ☆169 · Updated 11 months ago
- The open source implementation of "Connecting Large Language Models with Evolutionary Algorithms Yields Powerful Prompt Optimizers" ☆19 · Updated last year
- [ICML 2023] Exploring the Benefits of Training Expert Language Models over Instruction Tuning ☆99 · Updated 2 years ago
- Code for our EMNLP '22 paper "Fixing Model Bugs with Natural Language Patches" ☆19 · Updated 2 years ago
- Finding semantically meaningful and accurate prompts. ☆48 · Updated last year
- In-BoXBART: Get Instructions into Biomedical Multi-task Learning ☆14 · Updated 3 years ago
- Code for "Seeking Neural Nuggets: Knowledge Transfer in Large Language Models from a Parametric Perspective" ☆33 · Updated last year
- Ranking of fine-tuned HF models as base models. ☆36 · Updated last week
- Code for paper 'Data-Efficient FineTuning' ☆28 · Updated 2 years ago
- ☆50 · Updated 2 years ago
- Pre-trained Language Model for Scientific Text ☆46 · Updated last year
- ☆79 · Updated last year