SpursGoZmy / Table-LLaVA

Dataset and code for our ACL 2024 paper "Multimodal Table Understanding". We propose the first large-scale multimodal instruction-tuning (IFT) and pre-training dataset for table understanding and develop a generalist tabular MLLM named Table-LLaVA.

Alternatives and similar repositories for Table-LLaVA:

Users interested in Table-LLaVA are comparing it to the libraries listed below.