SpursGoZmy / Table-LLaVA
Dataset and code for our ACL 2024 paper "Multimodal Table Understanding". We propose the first large-scale multimodal instruction-following (IFT) and pre-training dataset for table understanding, and develop a generalist tabular MLLM named Table-LLaVA.
225 stars · Updated Jun 12, 2025

Alternatives and similar repositories for Table-LLaVA

Users interested in Table-LLaVA are comparing it to the libraries listed below.
