craffel / comp790-deep-learning-spring-2022
Course repository for the Spring 2022 COMP790 course "Deep Learning" at UNC
☆19 · Updated 3 years ago
Alternatives and similar repositories for comp790-deep-learning-spring-2022
Users interested in comp790-deep-learning-spring-2022 are comparing it to the repositories listed below.
- Course repository for the Spring 2023 COMP664 course "Deep Learning" at UNC ☆14 · Updated 2 years ago
- Code and other relevant files for the NeurIPS 2022 tutorial "Foundational Robustness of Foundation Models" ☆72 · Updated 3 years ago
- 🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX (a minimal usage sketch follows this list) ☆81 · Updated 3 years ago
- ☆53 · Updated 2 years ago
- Minimum Description Length probing for neural network representations ☆20 · Updated last year
- Recycling diverse models ☆46 · Updated 3 years ago
- ML/DL Math and Method notes ☆66 · Updated 2 years ago
- Interpretable and efficient predictors using pre-trained language models; scikit-learn compatible ☆44 · Updated 2 months ago
- ☆44 · Updated last year
- Few-shot Learning with Auxiliary Data ☆31 · Updated 2 years ago
- A simple Python implementation of the forward-forward training algorithm proposed by G. Hinton at NeurIPS 2022 ☆21 · Updated 3 years ago
- ☆38 · Updated 2 years ago
- ☆27 · Updated last year
- Official implementation of the ACL 2024 paper "Causal Estimation of Memorisation Profiles" ☆24 · Updated 10 months ago
- Embroid: Unsupervised Prediction Smoothing Can Improve Few-Shot Classification ☆11 · Updated 2 years ago
- Library implementing state-of-the-art concept-based and disentanglement learning methods for explainable AI ☆55 · Updated 3 years ago
- Repo for the ICML 2023 paper "Why Do Nearest Neighbor Language Models Work?" ☆59 · Updated 3 years ago
- Interactive Weak Supervision: Learning Useful Heuristics for Data Labeling ☆30 · Updated 4 years ago
- Embedding Recycling for Language Models ☆38 · Updated 2 years ago
- An implementation of Transformer with Expire-Span, a circuit for learning which memories to retain ☆34 · Updated 5 years ago
- [ACL 2021] IrEne: Interpretable Energy Prediction for Transformers ☆11 · Updated 4 years ago
- Measuring whether attention is explanation with ROAR ☆22 · Updated 2 years ago
- We view Large Language Models as stochastic language layers in a network, where the learnable parameters are the natural language prompts… ☆95 · Updated last year
- Google Research ☆46 · Updated 3 years ago
- A diff tool for language models ☆44 · Updated 2 years ago
- Adding new tasks to T0 without catastrophic forgetting ☆33 · Updated 3 years ago
- Data Valuation on In-Context Examples (ACL 2023) ☆24 · Updated last year
- [NeurIPS 2023 Main Track] Repository for the paper "Don’t Stop Pretraining? Make Prompt-based Fine-tuning Powerful Lea…" ☆76 · Updated last year
- Implementation of the model from "Reka Core, Flash, and Edge: A Series of Powerful Multimodal Language Models" in PyTorch ☆28 · Updated last week
- Revisiting Efficient Training Algorithms For Transformer-based Language Models (NeurIPS 2023) ☆81 · Updated 2 years ago
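
For the 🤗 Transformers entry above, here is a minimal usage sketch of its `pipeline` API; the task and model checkpoint are illustrative choices and are not prescribed by the course repository.

```python
# A minimal sketch, not part of this repository: it assumes the `transformers`
# package and a PyTorch backend are installed, and uses an illustrative public
# checkpoint for the sentiment-analysis task.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)
print(classifier("Deep learning course materials keep getting better."))
# Expected output shape: [{'label': 'POSITIVE', 'score': ...}]
```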