bmitra-msft / Demos
A bag of miscellaneous demos!
☆13 · Updated 7 years ago
Related projects
Alternatives and complementary repositories for Demos
- Context Aware Language Models ☆28 · Updated 6 years ago
- In this project, we use the skip-gram model to embed Wikipedia Concepts and Entities. The English version of Wikipedia contains more than fiv… ☆56 · Updated 7 years ago
- Document context language models ☆22 · Updated 9 years ago
- Code for EMNLP 2016 paper: Morphological Priors for Probabilistic Word Embeddings ☆52 · Updated 7 years ago
- Python toolkit for ranking experiments on sentence/summary data ☆25 · Updated last year
- Implementation of the GloVe word embedding model in Theano ☆36 · Updated 7 years ago
- ☆30 · Updated 6 years ago
- Code needed to reproduce "Modeling documents with Generative Adversarial Networks" ☆39 · Updated 7 years ago
- A TensorFlow implementation of dependency-based word embeddings (dependency-based word2vec) ☆12 · Updated 8 years ago
- Non-distributional linguistic word vector representations. ☆62 · Updated 7 years ago
- Python evaluation scripts for AIDA-formatted CoNLL data ☆20 · Updated 10 years ago
- Entity Linking in Queries: Tasks and Evaluation ☆34 · Updated last year
- ☆46 · Updated 6 years ago
- ☆27 · Updated 7 years ago
- A toolkit for generating paraphrase vector representations for words in context ☆24 · Updated 9 years ago
- Transition-based joint syntactic dependency parser and semantic role labeler using a stack LSTM RNN architecture. ☆60 · Updated 7 years ago
- ☆18 · Updated 8 years ago
- A repository for Neural Document Ranking Models. ☆85 · Updated 6 years ago
- An implementation of Mikolov's word2vec in Python using Theano and Lasagne. ☆37 · Updated 7 years ago
- An autoencoder that computes word embeddings, as described in the Lebret/Collobert 2015 paper ☆74 · Updated 7 years ago
- ☆23 · Updated 8 years ago
- Slides/code for the Lisbon machine learning school 2017 ☆28 · Updated 7 years ago
- SIGIR 2017: Embedding-based query expansion for weighted sequential dependence retrieval model ☆37 · Updated 7 years ago
- ☆21 · Updated 6 years ago
- Modify word2vec such that it's possible to "condition" on existing embeddings for some words, and induce embeddings for new words. ☆40 · Updated 8 years ago
- Context Encoders (ConEc) as a simple but powerful extension of the word2vec model for learning word embeddings ☆20 · Updated 4 years ago
- Word vectors ☆64 · Updated 6 years ago