M2Lschool / tutorials2025
Mediterranean Machine Learning school 2025 tutorials
☆43 · Updated 3 months ago
Alternatives and similar repositories for tutorials2025
Users interested in tutorials2025 are comparing it to the libraries listed below.
- A template for starting reproducible Python machine-learning projects with hardware acceleration. Find an example at https://github.com/C… · ☆113 · Updated 6 months ago
- Reliable, minimal, and scalable library for pretraining foundation and world models · ☆107 · Updated 3 weeks ago
- ViT Prisma is a mechanistic interpretability library for Vision and Video Transformers (ViTs) · ☆325 · Updated 4 months ago
- ☆69 · Updated 2 years ago
- List of ML conferences with important dates and accepted paper lists · ☆186 · Updated last month
- Latent Program Network (from the "Searching Latent Program Spaces" paper) · ☆106 · Updated 2 weeks ago
- A modular, easy-to-extend GFlowNet library · ☆300 · Updated last week
- ☆231 · Updated 2 weeks ago
- ☆82 · Updated last year
- 🪄 Interpreto is an interpretability toolbox for LLMs · ☆71 · Updated last week
- The M2L school 2022 tutorials · ☆37 · Updated 3 years ago
- ⏰ AI conference deadline countdowns · ☆291 · Updated last week
- ☆67 · Updated 8 months ago
- Generative Flow Networks - GFlowNet · ☆297 · Updated this week
- Mediterranean Machine Learning school 2024 tutorials · ☆41 · Updated last year
- Official JAX implementation of xLSTM, including fast and efficient training and inference code. 7B model available at https://huggingface.… · ☆105 · Updated 11 months ago
- ☆44 · Updated 2 years ago
- nanoGPT-like codebase for LLM training · ☆113 · Updated last month
- European Summer School on AI course "Machines Climbing Pearl's Ladder of Causation" · ☆14 · Updated last year
- Access free Kaggle compute power from your command line · ☆32 · Updated last year
- Annotated version of the Mamba paper · ☆491 · Updated last year
- 🧠 Starter templates for doing interpretability research · ☆74 · Updated 2 years ago
- ☆285 · Updated last year
- ☆366 · Updated 3 months ago
- Research Project Template Repository · ☆37 · Updated 3 months ago
- The boundary of neural network trainability is fractal · ☆221 · Updated last year
- Library for Jacobian descent with PyTorch. It enables the optimization of neural networks with multiple losses (e.g. multi-task learning)… · ☆287 · Updated this week
- The simplest, fastest repository for training/finetuning medium-sized GPTs · ☆179 · Updated 5 months ago
- ☆62 · Updated 10 months ago