bhdai / kagglelink
A way to SSH into Kaggle!
☆64 · Updated this week
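For context, getting SSH access to a Kaggle notebook generally means starting an SSH daemon inside the notebook container and exposing port 22 through a reverse tunnel, since the container has no publicly routable address. The sketch below illustrates that general pattern only; it is not kagglelink's actual implementation, and the use of pyngrok, the placeholder auth token, and the root password are assumptions made purely for illustration.

```python
# Minimal sketch of the general "SSH into a notebook" pattern -- NOT kagglelink's
# actual implementation. Assumes the notebook has internet access enabled and
# that you have an ngrok auth token; pyngrok, the token, and the root password
# below are illustrative placeholders.
import subprocess

def run(cmd: str) -> None:
    """Run a shell command inside the notebook container."""
    subprocess.run(cmd, shell=True, check=True)

# 1. Install and configure an SSH server inside the container.
run("apt-get install -y -qq openssh-server")
run("mkdir -p /var/run/sshd")
run("echo 'root:change-me' | chpasswd")                    # throwaway password
run("echo 'PermitRootLogin yes' >> /etc/ssh/sshd_config")
run("service ssh restart")

# 2. Expose port 22 through a reverse TCP tunnel (here via pyngrok).
run("pip install -q pyngrok")
from pyngrok import ngrok

ngrok.set_auth_token("YOUR_NGROK_AUTH_TOKEN")              # placeholder token
tunnel = ngrok.connect(22, "tcp")                          # e.g. tcp://0.tcp.ngrok.io:12345
print("Tunnel open at:", tunnel.public_url)                # ssh root@<host> -p <port>
```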
Alternatives and similar repositories for kagglelink
Users interested in kagglelink are comparing it to the repositories listed below.
- Instructions for connecting SSH between Kaggle and Visual Studio Code ☆54 · Updated last year
- First-principle implementations of groundbreaking AI algorithms using a wide range of deep learning frameworks, accompanied by supporting… ☆174 · Updated last week
- Access Google Colab compute from your local VSCode ☆333 · Updated 2 months ago
- 1st Place Solution for LLM - Detect AI Generated Text Kaggle Competition ☆197 · Updated last year
- Building a 2.3M-parameter LLM from scratch with LLaMA 1 architecture. ☆180 · Updated last year
- Distributed training (multi-node) of a Transformer model ☆72 · Updated last year
- From scratch implementation of a vision language model in pure PyTorch ☆227 · Updated last year
- LoRA: Low-Rank Adaptation of Large Language Models implemented using PyTorch ☆112 · Updated last year
- ☆125 · Updated 9 months ago
- Notes about "Attention is all you need" video (https://www.youtube.com/watch?v=bCz4OMemCcA) ☆293 · Updated 2 years ago
- Research projects built on top of Transformers ☆62 · Updated 4 months ago
- Notes on the Mamba and the S4 model (Mamba: Linear-Time Sequence Modeling with Selective State Spaces) ☆169 · Updated last year
- Coding a Multimodal (Vision) Language Model from scratch in PyTorch with full explanation: https://www.youtube.com/watch?v=vAmKB7iPkWw ☆504 · Updated 7 months ago
- Integrating Mamba/SSMs with Transformer for Enhanced Long Context and High-Quality Sequence Modeling ☆197 · Updated 3 months ago
- LoRA and DoRA from Scratch Implementations ☆206 · Updated last year
- Contains the public resources of Hands on GenAI book ☆173 · Updated 6 months ago
- Conference schedule, top papers, and analysis of the data for NeurIPS 2023! ☆119 · Updated last year
- PyTorch Implementation of Jamba: "Jamba: A Hybrid Transformer-Mamba Language Model" ☆173 · Updated 3 months ago
- LLaMA 3 is one of the most promising open-source models after Mistral; this repo recreates its architecture in a simpler manner. ☆171 · Updated 10 months ago
- My attempts at implementing various bits of Sepp Hochreiter's new xLSTM architecture ☆130 · Updated last year
- A Great Collection of Deep Learning Tutorials and Repositories ☆285 · Updated this week
- Variations of Kolmogorov-Arnold Networks ☆115 · Updated last year
- Collection of resources for finetuning Large Language Models (LLMs). ☆91 · Updated 6 months ago
- AIO Research Agent - an all-in-one intelligent companion for navigating the academic world. ☆37 · Updated last year
- Hands-on tutorials on fine-tuning various LLMs using different fine-tuning techniques ☆242 · Updated 3 weeks ago
- Naively combining transformers and Kolmogorov-Arnold Networks to learn and experiment ☆35 · Updated 11 months ago
- Build high-performance AI models with modular building blocks ☆533 · Updated this week
- Repo designed to help learn the Hugging Face ecosystem (transformers, datasets, accelerate + more). ☆82 · Updated 2 weeks ago
- Survey: A collection of AWESOME papers and resources on the latest research in Mixture of Experts. ☆125 · Updated 10 months ago
- Composition of Multimodal Language Models From Scratch ☆15 · Updated 11 months ago