SpursGoZmy / Table-LLaVA

Dataset and code for our ACL 2024 paper "Multimodal Table Understanding". We propose the first large-scale multimodal instruction-tuning (IFT) and pre-training dataset for table understanding, and develop Table-LLaVA, a generalist tabular MLLM.
201 stars · Updated last month
