Advances in Tabular Foundation Models

The field of tabular foundation models (TFMs) is evolving rapidly, with a focus on improving both the efficiency and the effectiveness of these models. One key trend is the development of new attention mechanisms, such as bi-axial attention, which captures both local and global dependencies in tabular data. Another line of research integrates tabular data with large language models, treating tables as a modality in their own right. There is also growing interest in the robustness and adaptability of TFMs, with techniques such as adversarial training and feature-aware modulation being explored. Notable papers in this area include Orion-Bix, which introduces a tabular foundation model combining bi-axial attention with meta-learned in-context reasoning, and TAMO, which proposes a multimodal framework for integrating tables with large language models. Robust Tabular Foundation Models presents a promising direction for targeted adversarial training and fine-tuning of TFMs using synthetic data alone.
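To make the bi-axial idea concrete, the following is a minimal sketch of an attention block that alternates self-attention along the row axis (across samples within a column) and the column axis (across features within a row). It assumes each table cell has already been embedded to a fixed-width vector; the shapes, module, and class name here are illustrative assumptions, not Orion-Bix's actual architecture.

```python
# Hypothetical sketch of bi-axial attention over an embedded table (not Orion-Bix's code).
import torch
import torch.nn as nn


class BiAxialAttentionBlock(nn.Module):
    """Applies self-attention along the row axis, then along the column axis."""

    def __init__(self, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        self.row_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.col_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm_rows = nn.LayerNorm(d_model)
        self.norm_cols = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_rows, n_cols, d_model) -- one embedding per table cell
        b, r, c, d = x.shape

        # Row-axis attention: each column attends over the samples it contains.
        x_rows = x.permute(0, 2, 1, 3).reshape(b * c, r, d)
        attn_out, _ = self.row_attn(x_rows, x_rows, x_rows)
        x_rows = self.norm_rows(x_rows + attn_out)
        x = x_rows.reshape(b, c, r, d).permute(0, 2, 1, 3)

        # Column-axis attention: each row attends over its own features.
        x_cols = x.reshape(b * r, c, d)
        attn_out, _ = self.col_attn(x_cols, x_cols, x_cols)
        x_cols = self.norm_cols(x_cols + attn_out)
        return x_cols.reshape(b, r, c, d)


if __name__ == "__main__":
    table = torch.randn(2, 16, 8, 64)  # 2 tables, 16 rows, 8 columns, 64-dim cells
    block = BiAxialAttentionBlock(d_model=64, n_heads=4)
    print(block(table).shape)  # torch.Size([2, 16, 8, 64])
```

The row pass lets the model reason across in-context examples (global, sample-level dependencies), while the column pass models interactions among a sample's features (local, feature-level dependencies).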

Sources

Orion-Bix: Bi-Axial Attention for Tabular In-Context Learning

Light-Weight Benchmarks Reveal the Hidden Hardware Cost of Zero-Shot Tabular Foundation Models

Table as a Modality for Large Language Models

Robust Tabular Foundation Models

Feature-aware Modulation for Learning from Temporal Tabular Data