noanabeshima / wikipedia-downloader
Downloads 2020 English Wikipedia articles as plaintext
☆25 · Updated 2 years ago
Alternatives and similar repositories for wikipedia-downloader
Users interested in wikipedia-downloader are comparing it to the repositories listed below:
- ☆90 · Updated 3 years ago
- Python tools for processing the stackexchange data dumps into a text dataset for Language Models ☆83 · Updated last year
- The data processing pipeline for the Koala chatbot language model ☆118 · Updated 2 years ago
- Pre-training code for CrystalCoder 7B LLM ☆55 · Updated last year
- Script for downloading GitHub. ☆96 · Updated last year
- A library for squeakily cleaning and filtering language datasets. ☆47 · Updated 2 years ago
- ☆79 · Updated last year
- Official repo for NAACL 2024 Findings paper "LeTI: Learning to Generate from Textual Interactions." ☆64 · Updated 2 years ago
- ☆39 · Updated 2 years ago
- URL downloader supporting checkpointing and continuous checksumming. ☆19 · Updated last year
- This is a new metric that can be used to evaluate faithfulness of text generated by LLMs. The work behind this repository can be found he… ☆31 · Updated last year
- Repository for analysis and experiments in the BigCode project. ☆121 · Updated last year
- DeepSpeed is a deep learning optimization library that makes distributed training easy, efficient, and effective. ☆168 · Updated 3 weeks ago
- Deploy your HPC Cluster on AWS in 20 min. with just 1-Click. ☆55 · Updated 4 months ago
- An Implementation of "Orca: Progressive Learning from Complex Explanation Traces of GPT-4" ☆43 · Updated 9 months ago
- A GPT-based generative LM for combined text and math formulas, leveraging tree-based formula encoding. ☆40 · Updated 2 years ago
- Safety Score for Pre-Trained Language Models ☆95 · Updated last year
- ☆33 · Updated 2 years ago
- YT_subtitles - extracts subtitles from YouTube videos to raw text for Language Model training ☆43 · Updated 4 years ago
- Zeus LLM Trainer is a rewrite of Stanford Alpaca aiming to be the trainer for all Large Language Models ☆70 · Updated last year
- [ICML 2023] "Outline, Then Details: Syntactically Guided Coarse-To-Fine Code Generation", Wenqing Zheng, S P Sharan, Ajay Kumar Jaiswal, … ☆40 · Updated last year
- ☆44 · Updated last year
- Small and Efficient Mathematical Reasoning LLMs ☆71 · Updated last year
- Open Implementations of LLM Analyses ☆105 · Updated 10 months ago
- ☆15 · Updated 4 months ago
- Google TPU optimizations for transformers models ☆118 · Updated 6 months ago
- ☆85 · Updated last year
- ☆172 · Updated 5 months ago
- Demonstration that finetuning a RoPE model on sequences longer than those seen in pre-training extends the model's context limit ☆63 · Updated 2 years ago
- Download, parse, and filter data from Court Listener, part of the FreeLaw project. Data ready for The-Pile. ☆12 · Updated 2 years ago