SeanMWX / ArxivDay
☆19 · Updated 10 months ago
Alternatives and similar repositories for ArxivDay:
Users interested in ArxivDay are comparing it to the repositories listed below.
- Sharing my research toolchain ☆82 · Updated last year
- Use Clash on Linux without sudo privileges ☆62 · Updated 3 months ago
- A proxy setup adapted for AutoDL platform servers, using Clash as the proxy tool ☆208 · Updated 3 months ago
- WWW2025 Multimodal Intent Recognition for Dialogue Systems Challenge ☆114 · Updated 3 months ago
- An efficient, fast arXiv paper crawler that scrapes papers matching a given time range, subject, and keywords to local storage, and translates their titles and abstracts into Chinese ☆70 · Updated 5 months ago
- ☆21 · Updated 10 months ago
- [Paper][AAAI 2025] (MyGO) Tokenization, Fusion, and Augmentation: Towards Fine-grained Multi-modal Entity Representation ☆236 · Updated 2 months ago
- The official implementation of Natural Language Fine-Tuning ☆42 · Updated last month
- BYR Docs study-material archive ☆30 · Updated this week
- A course-registration script for Xiamen University's course selection system; for learning and exchange only. Run it with caution: the runner bears full responsibility for any problems ☆15 · Updated last year
- Clone Yourself by Fine-tuning a Large Language Model | Create your digital life with a large language model! ☆19 · Updated last year
- ☆323 · Updated 2 weeks ago
- A resource hub compiled by the senior-student mentor group of Zhejiang University's Turing Class ☆47 · Updated 5 months ago
- A bilingual Chinese/English Typst résumé template ☆49 · Updated this week
- A curated collection of common interview questions and interview experiences for large language model (LLM) roles, with detailed answers and analysis; maintained by the Jiaoying community at Shanghai Jiao Tong University ☆72 · Updated 6 months ago
- A MkDocs plugin that creates a changelog page ☆27 · Updated last year
- Turn your numbers into "牢大" (a Kobe Bryant meme)! ☆37 · Updated 5 months ago
- SCNU undergraduate thesis and presentation slide templates ☆33 · Updated 9 months ago
- (Updated in 2024) A LaTeX thesis template for South China University of Technology; please give it a star~ (☆▽☆). Probably the most complete and easiest-to-use undergraduate thesis template for SCUT ☆83 · Updated 8 months ago
- Problems and contest code from the 2024 15th Lanqiao Cup, Python Group A provincial round ☆38 · Updated last month
- A lightweight, extensible QQ bot frontend written in Python ☆68 · Updated last month
- A tiny paper-rating web app ☆30 · Updated last week
- "Hands-On Research" (《动手做科研》): a step-by-step introduction to AI research for beginners ☆298 · Updated 3 months ago
- A collection of study materials for ZJU's "Introduction to Mao Zedong Thought" course ☆8 · Updated 11 months ago
- CBU5201 Deception Dataset ☆21 · Updated 2 months ago
- Overseas Summer Research Guidance: a guide to applying for overseas summer research programs ☆255 · Updated 3 months ago
- What is learned in tiktoken? ☆68 · Updated 9 months ago
- A Vue-based project page template for academic papers (in development): https://junyaohu.github.io/academic-project-page-template-vue ☆228 · Updated last month
- A highly capable 2.4B-parameter lightweight LLM using only 1T tokens of pre-training data, with all details released ☆153 · Updated this week