CyberZHG / keras-transformer
Transformer implemented in Keras
☆370 · Updated 2 years ago
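The core operation of the Transformer implemented by this repo is scaled dot-product attention. As a rough illustration (a minimal NumPy sketch of the standard formulation, not this library's actual API), it can be written as:

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """q, k, v: (seq_len, d_k) arrays; returns (output, attention weights)."""
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)               # (seq_len, seq_len) similarity scores
    scores -= scores.max(axis=-1, keepdims=True)  # shift for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ v, weights

# Self-attention: queries, keys and values come from the same sequence.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))                       # 4 tokens, dimension 8
out, w = scaled_dot_product_attention(x, x, x)
```

Each output row is a convex combination of the value rows, weighted by how strongly the corresponding query matches each key; the library wraps this (with multiple heads and learned projections) into Keras layers.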
Related projects:
- A Keras+TensorFlow implementation of the Transformer: Attention Is All You Need ☆702 · Updated 2 years ago
- Keras library for building (Universal) Transformers, facilitating BERT and GPT models ☆533 · Updated 4 years ago
- A wrapper layer for stacking layers horizontally ☆227 · Updated 2 years ago
- Attention mechanism for processing sequential data that considers the context for each timestamp ☆654 · Updated 2 years ago
- Keras layer implementation of attention for sequential models ☆443 · Updated last year
- Keras implementation of BERT with pre-trained weights ☆814 · Updated 5 years ago
- A Keras TensorFlow 2.0 implementation of BERT, ALBERT and adapter-BERT ☆802 · Updated last year
- Implementation of XLNet that can load pretrained checkpoints ☆172 · Updated 2 years ago
- Re-implementation of ELMo on Keras ☆135 · Updated last year
- LSTM-CRF in PyTorch ☆457 · Updated last month
- ALBERT model pretraining and fine-tuning using TF 2.0 ☆199 · Updated last year
- Multi-class metrics for TensorFlow ☆224 · Updated 2 years ago
- Transformer-based models implemented in TensorFlow 2.x (using Keras) ☆75 · Updated 2 years ago
- Self-attention for text classification ☆119 · Updated 5 years ago
- TensorFlow implementation of "Attention Is All You Need" (June 2017) ☆348 · Updated 6 years ago
- BiLSTM-CNN-CRF architecture for sequence tagging using ELMo representations ☆388 · Updated last year
- An example attention network with a simple dataset ☆230 · Updated 5 years ago
- Minimal seq2seq model with attention for neural machine translation in PyTorch ☆689 · Updated 3 years ago
- Implementations of a family of attention mechanisms, suitable for all kinds of natural language processing tasks and compatible with Ten… ☆341 · Updated 7 months ago
- Hierarchically-Refined Label Attention Network for sequence labeling ☆285 · Updated 3 years ago
- How to use ELMo embeddings in Keras with TensorFlow Hub ☆260 · Updated 5 years ago
- Multilabel classification for the Toxic Comments challenge using BERT ☆311 · Updated 5 years ago
- Keras seq2seq example for automatic title generation ☆332 · Updated 4 years ago
- A simple technique to integrate BERT from TF Hub into Keras ☆258 · Updated last year
- Visualizing RNNs using the attention mechanism ☆747 · Updated 5 years ago
- Hierarchical Attention Networks for document classification in PyTorch ☆603 · Updated 4 years ago
- ☆531 · Updated 5 years ago
- Implementation of papers for the text classification task on DBpedia ☆738 · Updated 3 years ago
- Implementation of BERT that can load official pre-trained models for feature extraction and prediction ☆2,429 · Updated 2 years ago
- Uses BERT, ALBERT and GPT-2 as TensorFlow 2.0 layers; implements GCN, GAN, GIN and GraphSAGE based on message passing ☆331 · Updated last month