SCUT-AILab / Next-Generation-AI-awesome
An up-to-date list of progress made in next-generation AI.
☆11 · Updated 2 years ago
Alternatives and similar repositories for Next-Generation-AI-awesome
Users who are interested in Next-Generation-AI-awesome are comparing it to the libraries listed below.
- This is the source code for Detecting Machine-Generated Texts by Multi-Population Aware Optimization for Maximum Mean Discrepancy (ICLR20… ☆45 · Updated last year
- [TIP 2025] This is an official PyTorch implementation of "Zero-Shot Skeleton-Based Action Recognition With Prototype-Guided Feature Align… ☆33 · Updated 6 months ago
- [NIPS 2025] Open-World Drone Active Tracking with Goal-Centered Rewards ☆17 · Updated 3 months ago
- This is the source code for Detecting Adversarial Data by Probing Multiple Perturbations Using Expected Perturbation Score (ICML2023). ☆40 · Updated last year
- AAAI2025 ☆11 · Updated 9 months ago
- Code for NeurIPS 2024 paper: Cross-Device Collaborative Test-Time Adaptation ☆13 · Updated 11 months ago
- [IJCAI-2021] Contrastive Model Inversion for Data-Free Knowledge Distillation ☆73 · Updated 3 years ago
- [ICLR 2025 Oral🔥] SD-LoRA: Scalable Decoupled Low-Rank Adaptation for Class Incremental Learning ☆77 · Updated 7 months ago
- Efficient Dataset Distillation by Representative Matching ☆113 · Updated last year
- [ICLR 2025] COME: Test-time Adaption by Conservatively Minimizing Entropy ☆18 · Updated 11 months ago
- This repository will be continually updated with the analytic continual learning series, including Analytic Class-Incremental Learning (ACIL), Gaussian Kernel… ☆280 · Updated last year
- [ICLR 2025] "Noisy Test-Time Adaptation in Vision-Language Models"☆17Updated 11 months ago
- Distilling Dataset into Generative Models☆54Updated 2 years ago
- (NeurIPS 2023 spotlight) Large-scale Dataset Distillation/Condensation, 50 IPC (Images Per Class) achieves the highest 60.8% on original …☆136Updated last year
- ☆14Updated 2 years ago
- [ICCV 2023] A Unified Continual Learning Framework with General Parameter-Efficient Tuning☆92Updated last year
- Multimodal Large Language Model (MLLM) Tuning Survey: Keeping Yourself is Important in Downstream Tuning Multimodal Large Language Model☆94Updated 6 months ago
- Model Predictive Task Sampling☆87Updated 3 months ago
- This repo will be continually updated with analytic federated learning methods. ☆69 · Updated 10 months ago
- Unofficial code for the VPT (Visual Prompt Tuning) paper, arXiv 2203.12119 ☆164 · Updated 2 years ago
- [AAAI-2022] Up to 100x Faster Data-free Knowledge Distillation ☆76 · Updated 3 years ago
- ☆149 · Updated last year
- Instruction Tuning in Continual Learning paradigm ☆71 · Updated last year
- Code for NeurIPS 2022 paper "S-Prompts Learning with Pre-trained Transformers: An Occam's Razor for Domain Incremental Learning" ☆106 · Updated last year
- Official GitHub repo for SafeDialBench, a comprehensive multi-turn dialogue benchmark to evaluate LLMs' safety. ☆42 · Updated 9 months ago
- [ICCV23] Robust Mixture-of-Expert Training for Convolutional Neural Networks by Yihua Zhang, Ruisi Cai, Tianlong Chen, Guanhua Zhang, Hua… ☆67 · Updated 2 years ago
- IJCAI Review & MetaReview Monitor ☆106 · Updated 9 months ago
- Code for ICML 2024 paper (Oral): Test-Time Model Adaptation with Only Forward Passes ☆95 · Updated last year
- ☆105 · Updated 2 years ago
- ☆12 · Updated 2 years ago