CMU-IDeeL / new_grad
A new lightweight auto-differentiation library that builds directly on NumPy. Used as homework for CMU 11-785/11-685/11-485.
☆36 · Updated 3 years ago
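new_grad defines its own tensor and operator classes as part of the course assignments. As a purely illustrative sketch of what a NumPy-based reverse-mode autodiff core generally looks like (the `Tensor` class, method names, and `backward` traversal below are hypothetical and not new_grad's actual API):

```python
import numpy as np

class Tensor:
    """Minimal reverse-mode autodiff node wrapping a NumPy array (illustrative only)."""
    def __init__(self, data, parents=()):
        self.data = np.asarray(data, dtype=np.float64)
        self.grad = np.zeros_like(self.data)
        self._parents = parents        # upstream Tensors in the computation graph
        self._backward_fn = None       # propagates self.grad to the parents

    def __add__(self, other):
        out = Tensor(self.data + other.data, parents=(self, other))
        def _backward():
            self.grad += out.grad
            other.grad += out.grad
        out._backward_fn = _backward
        return out

    def __mul__(self, other):
        out = Tensor(self.data * other.data, parents=(self, other))
        def _backward():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward_fn = _backward
        return out

    def backward(self):
        # Topologically order the graph, then accumulate gradients in reverse.
        order, visited = [], set()
        def visit(node):
            if id(node) not in visited:
                visited.add(id(node))
                for p in node._parents:
                    visit(p)
                order.append(node)
        visit(self)
        self.grad = np.ones_like(self.data)  # d(output)/d(output) = 1
        for node in reversed(order):
            if node._backward_fn is not None:
                node._backward_fn()

# Usage: for z = x*y + x, dz/dx = y + 1 and dz/dy = x
x, y = Tensor(2.0), Tensor(3.0)
z = x * y + x
z.backward()
print(x.grad, y.grad)  # 4.0 2.0
```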
Alternatives and similar repositories for new_grad:
Users interested in new_grad are comparing it to the libraries listed below.
- List of AI Residency & Research programs, Ph.D Fellowships, Research Internships ☆158 · Updated 4 years ago
- 11-785 Introduction to Deep Learning (IDeeL) website with logistics and select course materials ☆41 · Updated this week
- Example of a cover letter for an AI Residency ☆80 · Updated 5 years ago
- AI residency programs information ☆463 · Updated 2 years ago
- An easy-to-use app to visualise attentions of various VQA models ☆41 · Updated 2 years ago
- This repository contains an overview of important follow-up works based on the original Vision Transformer (ViT) by Google ☆168 · Updated 3 years ago
- Course webpage for COMP 790, (Deep) Learning from Limited Labeled Data ☆304 · Updated 4 years ago
- Configuration classes enabling type-safe PyTorch configuration for Hydra apps ☆213 · Updated 2 years ago
- Research boilerplate for PyTorch ☆149 · Updated last year
- Implementation of Feedback Transformer in PyTorch ☆105 · Updated 4 years ago
- Probing the representations of Vision Transformers ☆324 · Updated 2 years ago
- A compilation of research advice ☆219 · Updated 4 years ago
- PyTorch implementation of Compressive Transformers, from DeepMind ☆157 · Updated 3 years ago
- My repo for training neural nets using pytorch-lightning and hydra ☆219 · Updated last month
- Official codebase for Pretrained Transformers as Universal Computation Engines ☆246 · Updated 3 years ago
- Original transformer paper: Implementation of Vaswani, Ashish, et al. "Attention is all you need." Advances in neural information process… ☆236 · Updated 11 months ago
- ML and DL related contests, competitions and conference challenges ☆604 · Updated last year
- A collection of the code that accompanies the reports in The Gallery by Weights & Biases ☆336 · Updated 3 years ago
- Code for our paper: *Shamsian, *Kleinfeld, Globerson & Chechik, "Learning Object Permanence from Video" ☆68 · Updated 5 months ago
- ☆14 · Updated 2 years ago
- Implementation of popular SOTA self-supervised learning algorithms as Fastai Callbacks ☆320 · Updated 2 years ago
- Interview questions and answers for the Machine Learning Engineer role ☆119 · Updated 2 years ago
- A library to inspect and extract intermediate layers of PyTorch models ☆472 · Updated 2 years ago
- A collection of code snippets for my PyTorch Lightning projects ☆107 · Updated 4 years ago
- Is the attention layer even necessary? (https://arxiv.org/abs/2105.02723) ☆484 · Updated 3 years ago
- ☆129 · Updated 3 years ago
- Resources I used for ML Engineer, Applied Scientist and Quant Researcher interviews ☆306 · Updated 3 years ago
- ☆62 · Updated 3 years ago
- Implementation of Slot Attention from GoogleAI ☆425 · Updated 8 months ago
- Flexible components pairing 🤗 Transformers with PyTorch Lightning ☆609 · Updated 2 years ago