An implementation of Additive Attention
☆148 · Feb 15, 2022 · Updated 4 years ago
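For orientation, additive attention (as popularized by the Fastformer paper this repo implements) summarizes queries and keys into single global vectors instead of computing a full pairwise attention matrix, giving linear cost in sequence length. Below is a minimal single-head NumPy sketch under that reading; the function names, shapes, and the final query residual are illustrative assumptions, not this repository's actual API.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def additive_attention(Q, K, V, w_q, w_k):
    """Sketch of Fastformer-style additive attention, O(n) in sequence length."""
    n, d = Q.shape
    # One scalar attention weight per query token.
    alpha = softmax(Q @ w_q / np.sqrt(d))   # (n,)
    q_global = alpha @ Q                    # (d,) global query vector
    # Mix the global query into every key elementwise.
    P = q_global * K                        # (n, d)
    beta = softmax(P @ w_k / np.sqrt(d))    # (n,)
    k_global = beta @ P                     # (d,) global key vector
    # Key-value interaction plus a query residual (as in the paper).
    return k_global * V + Q                 # (n, d)

rng = np.random.default_rng(0)
n, d = 6, 8
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
w_q, w_k = rng.standard_normal(d), rng.standard_normal(d)
out = additive_attention(Q, K, V, w_q, w_k)
print(out.shape)  # (6, 8)
```

The key design point is that `alpha` and `beta` are vectors of length n rather than an n×n matrix, which is where the linear complexity comes from.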
Alternatives and similar repositories for Fast-Transformer
Users that are interested in Fast-Transformer are comparing it to the libraries listed below
- An attempt at implementing GLOM, Geoffrey Hinton's proposal for emergent part-whole hierarchies from data (☆37 · Mar 27, 2021 · Updated 4 years ago)
- Instant search for, and access to, many datasets in PySpark (☆34 · Oct 6, 2022 · Updated 3 years ago)
- An implementation of Transformer in Transformer in TensorFlow for image classification, with attention inside local patches (☆43 · Feb 12, 2022 · Updated 4 years ago)
- A library of speech gadgets (☆14 · Oct 15, 2022 · Updated 3 years ago)
- Code to port NumPy model weights of BiT-ResNets to the TensorFlow SavedModel format (☆14 · Dec 21, 2021 · Updated 4 years ago)
- Object-Region Video Transformers