Table Pretraining

TAPEX: Table Pre-training via Learning a Neural SQL Executor (ICLR 2022; a SOTA table pre-training model).

This repository contains the following directories:

- poet-data
- origin-data
- preprocessed-data
- fine-tuned-model
- pretraining-corpus: the largest synthetic corpus (500M SQL-table-answer pairs) used by TAPEX during pre-training.
- pretrained-model
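The pretraining corpus listed above consists of synthetic SQL-table-answer pairs, where each answer is obtained by actually executing a sampled SQL query against a table. A minimal sketch of how one such pair can be produced with an SQL engine (the table contents, query, and record layout here are hypothetical, not the corpus's actual format):

```python
import sqlite3

# Hypothetical toy table standing in for a corpus table.
rows = [("Alice", 34), ("Bob", 28), ("Carol", 41)]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (name TEXT, age INTEGER)")
conn.executemany("INSERT INTO people VALUES (?, ?)", rows)

# A sampled SQL query plus its executed result forms one
# (SQL, table, answer) training pair.
sql = "SELECT name FROM people WHERE age > 30"
answer = [r[0] for r in conn.execute(sql)]

pair = {"sql": sql, "table": rows, "answer": answer}
print(pair["answer"])  # ['Alice', 'Carol']
```

During pre-training, the model sees the SQL query and the flattened table as input and learns to produce the executor's answer, which is how TAPEX acquires its neural SQL execution skill.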