graphcore / distributed-kge-poplar
The application is an end-to-end training and evaluation system for standard knowledge graph embedding models. It was developed to optimise performance on the WikiKG90Mv2 dataset.
☆15Updated 9 months ago
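As a rough illustration of what the "standard knowledge graph embedding models" mentioned above look like, here is a minimal sketch of a TransE-style triple scoring function. This is not code from the repository; the function name, embedding dimension, and usage are illustrative assumptions only.

```python
# Minimal sketch (not from distributed-kge-poplar): a TransE-style scoring
# function for a (head, relation, tail) triple. Names and shapes are
# illustrative assumptions, not the repository's API.
import numpy as np

def transe_score(head: np.ndarray, relation: np.ndarray, tail: np.ndarray) -> float:
    """Return a plausibility score; higher (less negative) means more plausible."""
    return -float(np.linalg.norm(head + relation - tail))

# Toy usage with random 64-dimensional embeddings.
rng = np.random.default_rng(0)
h, r, t = (rng.normal(size=64) for _ in range(3))
print(transe_score(h, r, t))
```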
Alternatives and similar repositories for distributed-kge-poplar:
Users interested in distributed-kge-poplar are comparing it to the libraries listed below.
- ☆13Updated 6 months ago
- Exploration using DSPy to optimize modules to maximize performance on the OpenToM dataset☆14Updated 10 months ago
- Embroid: Unsupervised Prediction Smoothing Can Improve Few-Shot Classification☆11Updated last year
- ☆14Updated last year
- Code for our paper Resources and Evaluations for Multi-Distribution Dense Information Retrieval☆14Updated last year
- PyTorch Implementation of the paper "MM1: Methods, Analysis & Insights from Multimodal LLM Pre-training"☆23Updated this week
- Implementation of SelfExtend from the paper "LLM Maybe LongLM: Self-Extend LLM Context Window Without Tuning" in PyTorch and Zeta☆13Updated 2 months ago
- Minimum Description Length probing for neural network representations☆18Updated last week
- Evaluation of neuro-symbolic engines☆34Updated 5 months ago
- Official Implementation of "CheckEmbed: Effective Verification of LLM Solutions to Open-Ended Tasks"☆17Updated last month
- Generating and validating natural-language explanations.☆46Updated last week
- Repository for "GIST: Distributed training for large-scale graph convolutional networks"☆14Updated 2 years ago
- Cross-field empirical trends analysis of XAI literature☆21Updated last year
- Interpretable and efficient predictors using pre-trained language models. Scikit-learn compatible.☆38Updated 8 months ago
- ☆11Updated 5 months ago
- ☆14Updated 7 months ago
- ☆9Updated last year
- Latent Large Language Models☆17Updated 4 months ago
- Code for paper: "Privately generating tabular data using language models".☆14Updated last year
- ☆12Updated 2 years ago
- ☆18Updated 8 months ago
- Official code for paper: Conservative objective models are a special kind of contrastive divergence-based energy model☆14Updated last year
- The official evaluation suite and dynamic data release for MixEval.☆10Updated 3 months ago
- Entailment self-training☆25Updated last year
- ☆17Updated 2 years ago
- Official repo of the AAAI 2024 paper "Mitigating the Impact of False Negatives in Dense Retrieval with Contrastive Confidence Regularization"☆13Updated last year
- Implementation of Spectral State Space Models☆16Updated 10 months ago
- Tasks and tutorials using Graphcore's IPU with Hugging Face. Originally at https://github.com/gradient-ai/Graphcore-HuggingFace☆12Updated 10 months ago
- Plug-and-play Search Interfaces with Pyserini and Hugging Face☆32Updated last year
- Q-Probe: A Lightweight Approach to Reward Maximization for Language Models☆40Updated 7 months ago