DOUDOU0314 / GPT-J-hf
GPT-J in JAX, based on the official Hugging Face library
⭐13 · Updated 4 years ago
Alternatives and similar repositories for GPT-J-hf
Users interested in GPT-J-hf are comparing it to the libraries listed below.
- Tutorial to pretrain & fine-tune a 🤗 Flax T5 model on a TPU v3-8 with GCP · ⭐58 · Updated 3 years ago
- Few-shot learning using EleutherAI's GPT-Neo, an open-source version of GPT-3 · ⭐18 · Updated 4 years ago
- TorchServe + Streamlit for easily serving your Hugging Face NER models · ⭐33 · Updated 3 years ago
- ⭐13 · Updated 11 months ago
- **ARCHIVED** Filesystem interface to the 🤗 Hub · ⭐58 · Updated 2 years ago
- Applying "Load What You Need: Smaller Versions of Multilingual BERT" to LaBSE · ⭐19 · Updated 4 years ago
- ⭐33 · Updated 2 years ago
- ⭐16 · Updated 4 years ago
- Experiments with generating open-source language model assistants · ⭐97 · Updated 2 years ago
- ⭐20 · Updated 4 years ago
- URL downloader supporting checkpointing and continuous checksumming · ⭐19 · Updated 2 years ago
- Developing tools to automatically analyze datasets · ⭐75 · Updated last year
- Calculating the expected time for training an LLM · ⭐38 · Updated 2 years ago
- Consists of the largest (10K) human-annotated code-switched semantic parsing dataset & 170K generated utterances using the CST5 augmentati… · ⭐41 · Updated 2 years ago
- Training and evaluation code for the paper "Headless Language Models: Learning without Predicting with Contrastive Weight Tying" (https:/…) · ⭐28 · Updated last year
- exBERT on 🤗 Transformers · ⭐10 · Updated 4 years ago
- Convenient text-to-text training for Transformers · ⭐19 · Updated 3 years ago
- 🤗 Transformers: state-of-the-art machine learning for PyTorch, TensorFlow, and JAX · ⭐81 · Updated 3 years ago
- Using short models to classify long texts · ⭐21 · Updated 2 years ago
- Repo for training MLMs, CLMs, or T5-type models on the OLM pretraining data; it should work with any Hugging Face text dataset · ⭐96 · Updated 2 years ago
- A library for squeakily cleaning and filtering language datasets · ⭐49 · Updated 2 years ago
- Implementation of N-Grammer, augmenting Transformers with latent n-grams, in PyTorch · ⭐76 · Updated 3 years ago
- Load What You Need: smaller multilingual Transformers for PyTorch and TensorFlow 2.0 · ⭐105 · Updated 3 years ago
- Megatron-LM 11B on Hugging Face Transformers · ⭐27 · Updated 4 years ago
- All my experiments with various transformers and transformer frameworks · ⭐14 · Updated 4 years ago
- Official repo for the AAAI ALOHA chatbot · ⭐29 · Updated last year
- hllama, a library that aims to provide utility tools for large language models · ⭐10 · Updated last year
- Large-scale distributed model training with Colossal-AI and Lightning AI · ⭐56 · Updated 2 years ago
- A package for fine-tuning Transformers with TPUs, written in TensorFlow 2.0+ · ⭐37 · Updated 4 years ago
- Anh: LAION's multilingual assistant datasets and models · ⭐27 · Updated 2 years ago