craffel / comp790-deep-learning-spring-2022
Course repository for the Spring 2022 COMP790 course "Deep Learning" at UNC
☆19 · Updated 3 years ago
Alternatives and similar repositories for comp790-deep-learning-spring-2022
Users interested in comp790-deep-learning-spring-2022 are comparing it to the repositories listed below.
- Recycling diverse models ☆46 · Updated 2 years ago
- This repository holds code and other relevant files for the NeurIPS 2022 tutorial: Foundational Robustness of Foundation Models. ☆72 · Updated 2 years ago
- ☆13 · Updated last year
- [NeurIPS 2022] DataMUX: Data Multiplexing for Neural Networks ☆60 · Updated 3 years ago
- Google Research ☆46 · Updated 3 years ago
- A simple Python implementation of forward-forward NN training by G. Hinton from NeurIPS 2022 ☆21 · Updated 3 years ago
- ☆56 · Updated 2 years ago
- Interpretable and efficient predictors using pre-trained language models. Scikit-learn compatible. ☆44 · Updated last month
- A library to create and manage configuration files, especially for machine learning projects. ☆79 · Updated 3 years ago
- ☆45 · Updated 5 years ago
- ☆52 · Updated last year
- An implementation of Transformer with Expire-Span, a circuit for learning which memories to retain ☆34 · Updated 5 years ago
- Utilities for Training Very Large Models ☆58 · Updated last year
- Library implementing state-of-the-art Concept-based and Disentanglement Learning methods for Explainable AI ☆55 · Updated 3 years ago
- Implementation of Memory-Compressed Attention, from the paper "Generating Wikipedia By Summarizing Long Sequences" ☆70 · Updated 2 years ago
- ☆38 · Updated 2 years ago
- This is the implementation of the paper AdaMix: Mixture-of-Adaptations for Parameter-efficient Model Tuning (https://arxiv.org/abs/2205.1…) ☆136 · Updated 2 years ago
- Revisiting Efficient Training Algorithms For Transformer-based Language Models (NeurIPS 2023) ☆81 · Updated 2 years ago
- Code for papers Linear Algebra with Transformers (TMLR) and What is my Math Transformer Doing? (AI for Maths Workshop, NeurIPS 2022) ☆76 · Updated last year
- Adversarial examples to the new ConvNeXt architecture ☆20 · Updated 3 years ago
- ☆19 · Updated 3 years ago
- Code for the paper "The Impact of Positional Encoding on Length Generalization in Transformers", NeurIPS 2023 ☆138 · Updated last year
- Code accompanying our paper "Feature Learning in Infinite-Width Neural Networks" (https://arxiv.org/abs/2011.14522) ☆63 · Updated 4 years ago
- Sparse and discrete interpretability tool for neural networks ☆64 · Updated last year
- Official code release for the paper Coder Reviewer Reranking for Code Generation. ☆45 · Updated 2 years ago
- A case study of efficient training of large language models using commodity hardware. ☆68 · Updated 3 years ago
- A diff tool for language models ☆44 · Updated last year
- Lightning template for easy prototyping ⚡️ ☆13 · Updated 3 years ago
- ☆44 · Updated last year
- 🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX. ☆81 · Updated 3 years ago