spro / sconce-python
☆16 · Updated 7 years ago
Alternatives and similar repositories for sconce-python:
Users interested in sconce-python are comparing it to the libraries listed below.
- Various experiments on the [Fashion-MNIST](https://github.com/zalandoresearch/fashion-mnist) dataset from Zalando ☆31 · Updated 7 years ago
- PyTorch implementation of ByteNet from the "Neural Machine Translation in Linear Time" paper ☆46 · Updated 7 years ago
- Implementation of Natural Language Object Retrieval in TensorFlow ☆35 · Updated 8 years ago
- Gated Recurrent Unit with low-rank matrix factorization ☆34 · Updated 9 years ago
- The training code for the EMNLP 2017 paper "Learning Generic Sentence Representations Using Convolutional Neural Networks" ☆34 · Updated 7 years ago
- An attempt to generate a continuous space of sentences with DenseNet ☆26 · Updated 7 years ago
- Code for "Aggregated Momentum: Stability Through Passive Damping", Lucas et al. 2018 ☆34 · Updated 6 years ago
- RWA (Recurrent Weighted Average) in PyTorch ☆14 · Updated 7 years ago
- Deep generative model for sentiment analysis ☆34 · Updated 8 years ago
- ☆49 · Updated 6 years ago
- Density Order Embeddings ☆33 · Updated 5 years ago
- Slides/code for the Lisbon machine learning school 2017 ☆28 · Updated 7 years ago
- Tensorflow Implementation of Multi-Function Recurrent Unit ☆23 · Updated 8 years ago
- Implementing FastSent in Theano ☆12 · Updated 8 years ago
- ☆19 · Updated 6 years ago
- This is a Tensorflow implementation of the End-to-End Memory Network applied to the Ubuntu Dialog Corpus. The model can be compared to a … ☆11 · Updated 8 years ago
- This contains my M.Tech project work on using deep learning to learn graph representations. Data will be provided on request. ☆33 · Updated 7 years ago
- Different implementations of a Visual Question Answering system ☆10 · Updated 8 years ago
- Training scripts for the paper Miceli Barone et al. 2017, "Deep Architectures for Neural Machine Translation" ☆11 · Updated 7 years ago
- Variational autoencoder in Theano ☆12 · Updated 7 years ago
- Towards cross-lingual distributed representations without parallel text trained with adversarial autoencoders