nreHieW / lossLinks
Visualising Losses in Deep Neural Networks
☆16 · Updated last year
Alternatives and similar repositories for loss
Users interested in loss are comparing it to the libraries listed below.
- Exploring an idea where one forgets about efficiency and carries out attention across each edge of the nodes (tokens) · ☆55 · Updated 9 months ago
- ☆24 · Updated last year
- Implementation of a holodeck, written in PyTorch · ☆18 · Updated 2 years ago
- Pixel Parsing. A reproduction of OCR-free end-to-end document understanding models with open data · ☆23 · Updated last year
- A dashboard for exploring timm learning rate schedulers · ☆19 · Updated last year
- Load any CLIP model with a standardized interface · ☆22 · Updated 2 months ago
- Utilities for Training Very Large Models · ☆58 · Updated last year
- PyTorch implementation of the paper "MM1: Methods, Analysis & Insights from Multimodal LLM Pre-training" · ☆24 · Updated last week
- Explorations into adversarial losses on top of the autoregressive loss for language modeling · ☆40 · Updated this week
- Engineering the state of RNN language models (Mamba, RWKV, etc.) · ☆32 · Updated last year
- Utilities for PyTorch distributed · ☆25 · Updated 10 months ago
- Implementation of a Light Recurrent Unit in PyTorch · ☆49 · Updated last year
- Experimental scripts for researching data-adaptive learning rate scheduling · ☆22 · Updated 2 years ago
- Collection of autoregressive model implementations · ☆85 · Updated 8 months ago
- Some personal experiments around routing tokens to different autoregressive attention, akin to mixture-of-experts · ☆121 · Updated last year
- Exploration into the proposed "Self Reasoning Tokens" by Felipe Bonetto · ☆57 · Updated last year
- This repository hosts the code to port NumPy model weights of BiT-ResNets to the TensorFlow SavedModel format · ☆14 · Updated 4 years ago
- Local Attention - Flax module for JAX
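Several entries above (edge-wise attention, local attention) revolve around restricting which tokens attend to which. As a hedged illustration only — not code from any of the repositories listed, and with window size and shapes chosen arbitrarily — windowed "local" attention can be sketched in NumPy as each query attending only to keys within a fixed neighborhood:

```python
import numpy as np

def local_attention(q, k, v, window=2):
    """Illustrative local attention: token i attends only to positions
    [i - window, i + window]. q, k, v have shape (seq_len, dim).
    No batching, heads, or causal masking -- a minimal sketch."""
    seq_len, dim = q.shape
    out = np.zeros_like(v)
    for i in range(seq_len):
        lo, hi = max(0, i - window), min(seq_len, i + window + 1)
        scores = q[i] @ k[lo:hi].T / np.sqrt(dim)  # scaled dot-product scores
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()                   # softmax over the window
        out[i] = weights @ v[lo:hi]
    return out
```

With a window at least as large as the sequence, this reduces to ordinary full attention; shrinking the window trades global context for linear-in-sequence cost, which is the usual motivation for local-attention modules like the Flax one in the last entry.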